Sample records for system historical database

  1. A User's Applications of Imaging Techniques: The University of Maryland Historic Textile Database.

    ERIC Educational Resources Information Center

    Anderson, Clarita S.

    1991-01-01

    Describes the incorporation of textile images into the University of Maryland Historic Textile Database by a computer user rather than a computer expert. Selection of a database management system is discussed, and PICTUREPOWER, a system that integrates photographic quality images with text and numeric information in databases, is described. (three…

  2. Spatial cyberinfrastructures, ontologies, and the humanities.

    PubMed

    Sieber, Renee E; Wellen, Christopher C; Jin, Yuan

    2011-04-05

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success.

  3. An expert system shell for inferring vegetation characteristics: Changes to the historical cover type database (Task F)

    NASA Technical Reports Server (NTRS)

    1993-01-01

    All the options in the NASA VEGetation Workbench (VEG) make use of a database of historical cover types. This database contains results from experiments by scientists on a wide variety of different cover types. The learning system uses the database to provide positive and negative training examples of classes that enable it to learn distinguishing features between classes of vegetation. All the other VEG options use the database to estimate the error bounds involved in the results obtained when various analysis techniques are applied to the sample of cover type data that is being studied. In the previous version of VEG, the historical cover type database was stored as part of the VEG knowledge base. This database was removed from the knowledge base. It is now stored as a series of flat files that are external to VEG. An interface between VEG and these files was provided. The interface allows the user to select which files of historical data to use. The files are then read, and the data are stored in Knowledge Engineering Environment (KEE) units using the same organization of units as in the previous version of VEG. The interface also allows the user to delete some or all of the historical database units from VEG and load new historical data from a file. This report summarizes the use of the historical cover type database in VEG. It then describes the new interface to the files containing the historical data, and the minor changes that were made to VEG to enable the externally stored database to be used. Test runs exercising the new interface, and the operation of VEG using historical data loaded from external files, are described. Task F was completed. A Sun cartridge tape containing the KEE and Common Lisp code for the new interface and the modified version of the VEG knowledge base was delivered to the NASA GSFC technical representative.

  4. Spatial cyberinfrastructures, ontologies, and the humanities

    PubMed Central

    Sieber, Renee E.; Wellen, Christopher C.; Jin, Yuan

    2011-01-01

    We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success. PMID:21444819

  5. Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.

    ERIC Educational Resources Information Center

    Gutmann, Myron P.; And Others

    1989-01-01

    Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having data organized are worthwhile. (GG)

  6. Unified Database Development Program. Final Report.

    ERIC Educational Resources Information Center

    Thomas, Everett L., Jr.; Deem, Robert N.

    The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…

  7. Integration of Landscape Ecosystem Classification and Historic Land Records in the Francis Marion National Forest

    Treesearch

    Peter U. Kennedy; Victor B. Shelburne

    2002-01-01

    Geographic Information Systems (GIS) data and historical plats ranging from 1716 to 1894 in the Coastal Flatwoods Region of South Carolina were used to quantify changes on a temporal scale. Combining the historic plats and associated witness trees (trees marking the boundaries of historic plats) with an existing database of the soils and other attributes was the basis...

  8. Statewide Inventories of Heritage Resources: Macris and the Experience in Massachusetts

    NASA Astrophysics Data System (ADS)

    Stott, P. H.

    2017-08-01

    The Massachusetts Historical Commission (MHC) is the State Historic Preservation Office for Massachusetts. Established in 1963, MHC has been inventorying historic properties for over half a century. Since 1987, it has maintained a heritage database, the Massachusetts Cultural Resource Information System, or MACRIS. Today MACRIS holds over 206,000 records from the 351 towns and cities across the Commonwealth. Since 2004, a selection of the more than 150 MACRIS fields has been available online at mhcmacris.net. MACRIS is widely used by independent consultants preparing project review files, by MHC staff in its regulatory responsibilities, by local historical commissions monitoring threats to their communities, and by scholars, historical organizations, genealogists, property owners, reporters, and the general public interested in the history of the built environment. In 2016 MACRIS began migrating from its three-decade-old Pick multivalue database to SQL Server, and in 2017 the first redesign of its thirteen-year-old web interface should start to improve usability. Longer-term improvements aim to standardize terminology and ultimately bring interoperability with other heritage databases closer to reality.

  9. The HISTMAG database: combining historical, archaeomagnetic and volcanic data

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Leonhardt, Roman; Schnepp, Elisabeth; Heilig, Balázs; Mayrhofer, Franziska; Kovacs, Peter; Hejda, Pavel; Valach, Fridrich; Vadasz, Gergely; Hammerl, Christa; Egli, Ramon; Fabian, Karl; Kompein, Niko

    2017-09-01

    Records of the past geomagnetic field can be divided into two main categories: instrumental historical observations on the one hand, and field estimates based on the magnetization acquired by rocks, sediments and archaeological artefacts on the other. In this paper, a new database combining historical, archaeomagnetic and volcanic records is presented. HISTMAG is a relational database, implemented in MySQL, and can be accessed via a web-based interface (http://www.conrad-observatory.at/zamg/index.php/data-en/histmag-database). It combines available global historical data compilations covering the last ∼500 yr as well as archaeomagnetic and volcanic data collections from the last 50,000 yr. Furthermore, new historical and archaeomagnetic records, mainly from central Europe, have been acquired. In total, 190,427 records are currently available in the HISTMAG database, most of them historical declination measurements (155,525). The original database structure was complemented by new fields that allow a detailed description of the different data types, and a user-comment function provides the possibility of scientific discussion of individual records. The HISTMAG database therefore supports thorough reliability and uncertainty assessments of the widely differing data sets, which are an essential basis for geomagnetic field reconstructions. A database analysis, through comparison with other historical records, revealed a systematic offset in declination records derived from compass roses on historical geographical maps, while maps created for mining activities proved a reliable source.
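
    The record counts above suggest a simple relational layout. Below is a minimal sketch in Python of one way such holdings could be queried; the single-table schema, column names and sample row are assumptions for illustration, not the actual HISTMAG schema (HISTMAG itself runs on MySQL, while SQLite is used here only to keep the sketch self-contained).

        import sqlite3

        # Hypothetical, simplified stand-in for a HISTMAG-like schema: one table
        # of field records, with a type column separating historical,
        # archaeomagnetic and volcanic data.
        con = sqlite3.connect(":memory:")
        con.execute("""
            CREATE TABLE records (
                id          INTEGER PRIMARY KEY,
                record_type TEXT,     -- 'historical' | 'archaeomagnetic' | 'volcanic'
                element     TEXT,     -- e.g. 'declination', 'inclination', 'intensity'
                year        INTEGER,  -- negative for BCE
                lat         REAL,
                lon         REAL,
                value       REAL
            )""")
        con.execute("INSERT INTO records VALUES (1, 'historical', 'declination', 1850, 48.2, 16.4, -8.5)")

        # The kind of breakdown behind the counts quoted above: records per
        # type and element.
        for row in con.execute("""
                SELECT record_type, element, COUNT(*) AS n
                FROM records
                GROUP BY record_type, element
                ORDER BY n DESC"""):
            print(row)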

  10. Gregoriano cadastre (1818-35) from old maps to a GIS of historical landscape data

    NASA Astrophysics Data System (ADS)

    Frazzica, V.; Galletti, F.; Orciani, M.; Colosi, L.; Cartaro, A.

    2009-04-01

    Our analysis covered an area located along the "internal Marche ridge" of the Apennines, in the province of Ancona (Marche Region, Italy). The cartographic basis for our historical analysis was drawn from maps of the nineteenth-century Gregoriano Cadastre (Catasto Gregoriano) preserved in the State Archive of Rome, which were reproduced in digital format, georeferenced and vectorized. With the creation of a database, it was possible to attach to the maps the information gathered from the property registers concerning crop production and socioeconomic variables, in order to set up a Geographical Information System (GIS). Combining the database with the digitized maps created a one-to-one relation between each parcel and the related historical data, yielding an information system that fully preserves the original cadastre data. It was also possible to create a three-dimensional model of the historical landscape, visualizing the cultural diversification of that period. Integrating into a Territorial Information System (SIT) the historical information from the Gregoriano Cadastre, socio-economic analyses of business changes and, in parallel, the study of the transformations of the territorial framework proved to be a very important instrument for area planning, making it possible to identify specific planning approaches not only for urban settlement but also for restoring the variety and complexity of the agricultural landscape. The work opens further research in various directions, identifying pilot areas in which to test new management models and to simulate the impacts of management choices on both business profitability and landscape configuration. Future development of the project includes upgrading and evolving the database, acquiring data for subsequent historical periods, and improving the three-dimensional rendering of the landscape described in the Gregoriano Cadastre.
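
    As a sketch of the one-to-one parcel-to-register relation described above, the following Python fragment shows one way the link could be represented; the parcel identifier and attribute names are invented for illustration.

        # Each vectorized parcel carries exactly one historical register record.
        registers = {
            "map042-parcel117": {"crop": "vineyard", "owner_class": "smallholder", "rent_scudi": 3.2},
        }

        def parcel_info(parcel_id: str) -> dict:
            """Return the register data attached to a digitized cadastre parcel."""
            return registers[parcel_id]

        print(parcel_info("map042-parcel117"))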

  11. Overview of Historical Earthquake Document Database in Japan and Future Development

    NASA Astrophysics Data System (ADS)

    Nishiyama, A.; Satake, K.

    2014-12-01

    In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not used effectively by researchers, owing to the contamination of low-reliability historical records and the difficulty of searching by keyword and date. To overcome these problems and to promote historical earthquake studies in Japan, construction of a text database started in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its compilers investigated the source books or original texts of the historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another database compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast of Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century), yielding a text database and a seismic intensity database. These are now public on the web (in Japanese only). However, only about 9% of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents under a research program that started in 2014. The specification of the database will be similar to the previous ones. We also plan to link this database with a liquefaction-traces database, to be constructed by another research program, by adding the location information described in the historical documents. The resulting database will be used to estimate the distributions of seismic intensities and tsunami heights.

  12. A Database of Historical Information on Landslides and Floods in Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, F.; Tonelli, G.

    2003-04-01

    For the past 12 years we have maintained and updated a database of historical information on landslides and floods in Italy, known as the National Research Council's AVI (Damaged Urban Areas) Project archive. The database was originally designed to respond to a specific request of the Minister of Civil Protection, and was aimed at helping the regional assessment of landslide and flood risk in Italy. The database was first constructed in 1991-92 to cover the period 1917 to 1990. Information on damaging landslide and flood events was collected by searching archives, by screening thousands of newspaper issues, by reviewing the existing technical and scientific literature on landslides and floods in Italy, and by interviewing landslide and flood experts. The database was then updated chiefly through the analysis of hundreds of newspaper articles, and it now covers systematically the period 1900 to 1998, and non-systematically the periods 1900 to 1916 and 1999 to 2002. Non-systematic information on landslide and flood events predating the 20th century is also present in the database. The database currently contains information on more than 32,000 landslide events that occurred at more than 25,700 sites, and on more than 28,800 flood events that occurred at more than 15,600 sites. After a brief outline of the history and evolution of the AVI Project archive, we present and discuss: (a) the present structure of the database, including the hardware and software solutions adopted to maintain, manage, use and disseminate the information stored in the database, (b) the type and amount of information stored in the database, including an estimate of its completeness, and (c) examples of recent applications of the database, including a web-based GIS system to show the location of sites historically affected by landslides and floods, and an estimate of geo-hydrological (i.e., landslide and flood) risk in Italy based on the available historical information.
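
    A minimal sketch of the site-level aggregation such an archive supports, in Python; the event tuples are invented and stand in for the far richer AVI records.

        from collections import Counter

        # (event_type, site_id, year) stand-ins for archive rows.
        events = [
            ("landslide", "site-0001", 1951),
            ("flood",     "site-0002", 1966),
            ("landslide", "site-0001", 1998),
        ]

        # Sites affected more than once are candidate recurrent-risk locations,
        # the kind of aggregation a web GIS of historically affected sites shows.
        hits = Counter(site for _, site, _ in events)
        print([site for site, n in hits.items() if n > 1])  # ['site-0001']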

  13. DEFENSE MEDICAL SURVEILLANCE SYSTEM (DMSS)

    EPA Science Inventory

    AMSA operates the Defense Medical Surveillance System (DMSS), an executive information system whose database contains up-to-date and historical data on diseases and medical events (e.g., hospitalizations, ambulatory visits, reportable diseases, HIV tests, acute respiratory diseas...

  14. AR Based App for Tourist Attraction in ESKİ ÇARŞI (Safranbolu)

    NASA Astrophysics Data System (ADS)

    Polat, Merve; Rakıp Karaş, İsmail; Kahraman, İdris; Alizadehashrafi, Behnam

    2016-10-01

    This research deals with 3D modeling of historical and heritage landmarks of Safranbolu that are registered by UNESCO. It is an Augmented Reality (AR) based project intended to trigger virtual three-dimensional (3D) models, cultural music, historical photos, artistic features and animated text information. The aim is to propose a GIS-based approach with these features, adding them to the system as attribute data in a relational database. The database will be available in an AR-based application to provide information for tourists.

  15. Evaluation of Marine Corps Manpower Computer Simulation Model

    DTIC Science & Technology

    2016-12-01

    merit-based promotion selection that is in conjunction with the “up or out” manpower system. To ensure mission accomplishment within M&RA, it is...historical data the MSM pulls from an online Oracle database. Two types of database pulls occur here: acquiring historical data of manpower pyramid...is based on the assumption that the historical manpower progression is constant, and therefore is controllable. This unfortunately does not marry

  16. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  17. A Support Database System for Integrated System Health Management (ISHM)

    NASA Technical Reports Server (NTRS)

    Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John

    2007-01-01

    The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
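
    A minimal sketch of the three database components named above (system hierarchy model, historical data archive, firmware codebase), in Python; the class and field names are illustrative assumptions, not the actual HADS schema.

        from dataclasses import dataclass, field

        @dataclass
        class Element:                      # node in the system hierarchy model
            name: str
            children: list = field(default_factory=list)

        @dataclass
        class SensorReading:                # row in the historical data archive
            element: str
            t_seconds: float
            value: float

        @dataclass
        class FirmwareUnit:                 # atomic unit in the firmware codebase
            name: str
            source: str

        # The hierarchy model supplies the logical context in which sensor data
        # and firmware units are organized.
        engine = Element("engine", [Element("turbopump"), Element("chamber")])
        archive = [SensorReading("turbopump", 0.0, 812.4)]
        firmware = [FirmwareUnit("adc_driver", "...")]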

  18. NASA Astrophysics Data System's New Data

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Accomazzi, A.; Demleitner, M.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    2000-05-01

    The NASA Astrophysics Data System has greatly increased its data holdings. The Physics database now contains almost 900,000 references and the Astronomy database almost 550,000 references. The Instrumentation database has almost 600,000 references. The scanned articles in the ADS Article Service are increasing in number continuously; almost 1 million pages have been scanned so far. Recently the abstract books from the Lunar and Planetary Science Conference have been scanned and put on-line. The Monthly Notices of the Royal Astronomical Society are currently being scanned back to Volume 1; this is the last major journal to be completely scanned and on-line. In cooperation with a conservation project of the Harvard libraries, microfilms of historical observatory literature are currently being scanned, which will provide access to an important part of the historical literature. The ADS can be accessed at http://adswww.harvard.edu. This project is funded by NASA under grant NCC5-189.

  19. GEOGRAPHIC NAMES INFORMATION SYSTEM (GNIS) ...

    EPA Pesticide Factsheets

    The Geographic Names Information System (GNIS), developed by the U.S. Geological Survey in cooperation with the U.S. Board on Geographic Names (BGN), contains information about physical and cultural geographic features in the United States and associated areas, both current and historical, but not including roads and highways. The database also contains geographic names in Antarctica. The database holds the Federally recognized name of each feature and defines the location of the feature by state, county, USGS topographic map, and geographic coordinates. Other feature attributes include names or spellings other than the official name, feature designations, feature class, historical and descriptive information, and, for some categories of features, the geometric boundaries. The database assigns each feature a unique identifier, a random number, which serves as a key for accessing, integrating, or reconciling GNIS data with other data sets. The GNIS is our Nation's official repository of domestic geographic feature names information.
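
    A minimal sketch of the feature identifier acting as a join key, in Python; the IDs and attribute values are invented for illustration.

        # GNIS-style records keyed by feature identifier.
        gnis = {1234567: {"name": "Example Creek", "state": "MD", "class": "Stream"}}
        # An external dataset keyed by the same identifier.
        external = {1234567: {"gauge": "01234567"}}

        # The unique feature ID is the key for integrating or reconciling
        # GNIS data with other data sets.
        merged = {fid: {**rec, **external.get(fid, {})} for fid, rec in gnis.items()}
        print(merged[1234567])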

  20. Using Geocoded Databases in Teaching Urban Historical Geography.

    ERIC Educational Resources Information Center

    Miller, Roger P.

    1986-01-01

    Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)

  1. Design, Development and Utilization Perspectives on Database Management Systems

    ERIC Educational Resources Information Center

    Shneiderman, Ben

    1977-01-01

    This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)

  2. Applications of Historical Analyses in Combat Modelling

    DTIC Science & Technology

    2011-12-01

    effectiveness, P Dexter, J Battlefield Technology 6, 33-39, (2003). 37. Long term behaviour of solutions of the Lotka-Volterra system under small...2643 database of historical battles. This includes an examination of the inclusion of a fractal model of spatial dispersion on casualty values [6] and...system is viewed as no more than “the sum of its parts” in which all phenomena can be explained in terms of other, more fundamental, phenomena

  3. The U.S. Geological Survey’s nonindigenous aquatic species database: over thirty years of tracking introduced aquatic species in the United States (and counting)

    USGS Publications Warehouse

    Fuller, Pamela L.; Neilson, Matthew E.

    2015-01-01

    The U.S. Geological Survey’s Nonindigenous Aquatic Species (NAS) Database has tracked introductions of freshwater aquatic organisms in the United States for the past four decades. A website provides access to occurrence reports, distribution maps, and fact sheets for more than 1,000 species. The site also includes an on-line reporting system and an alert system for new occurrences. We provide an historical overview of the database, a description of its current capabilities and functionality, and a basic characterization of the data contained within the database.

  4. 10 CFR 766.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Enrichment Services System, which is the database that tracks uranium enrichment services transactions of the... invoicing and historical tracking of SWU deliveries. Use and burnup charges mean lease charges for the...

  5. Using CLIPS in a distributed system: The Network Control Center (NCC) expert system

    NASA Technical Reports Server (NTRS)

    Wannemacher, Tom

    1990-01-01

    This paper describes an intelligent troubleshooting system for the Help Desk domain. It was developed on an IBM-compatible 80286 PC using Microsoft C and CLIPS, and on an AT&T 3B2 minicomputer using the UNIFY database and a combination of shell scripts, C programs and SQL queries. The two computers are linked by a LAN. The functions of this system are to help non-technical NCC personnel handle trouble calls, to keep a log of problem calls with complete, concise information, and to keep a historical database of problems. The database helps identify hardware and software problem areas and provides a source of new rules for the troubleshooting knowledge base.

  6. The New Zealand Tsunami Database: historical and modern records

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.

    2016-12-01

    A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.

  7. Documentation of Cultural Heritages Using a GIS Based Information and Management System; Case Study of Safranbolu

    NASA Astrophysics Data System (ADS)

    Seker, D. Z.; Alkan, M.; Kutoglu, S. S.; Akcin, H.

    2010-12-01

    Documentation of cultural heritage sites is extremely important for monitoring them and preserving them from natural disasters and human activities. Owing to its very rich history, from the first human settlements at Catalhoyuk and Alacahoyuk through the Byzantine, Seljuk and Ottoman civilizations, Turkey has many cultural heritage sites. 3D modeling and recording of historical buildings using modern tools and techniques has been conducted at several locations in Turkey and is still continuing. Nine cultural sites in Turkey are included in UNESCO's cultural heritage protection list; one of them is the township of Safranbolu, one of the most outstanding examples of traditional Turkish architecture, and unique in having conserved its human settlement in its authentic environmental setting to the present day. This study presents in detail the outcomes and further work of a research project on this area, supported by the Turkish National Research Center (TUBITAK) under project number 106Y157. The basic aim of the study is the development of a GIS-based information and management system for the city of Safranbolu. All registered historical buildings are linked to the database. 3D models of selected registered historical monuments were built from data of different sources, faithful to their original construction, and will be distributed via the Internet through a web-based information system designed during the project. Some of the buildings were also evaluated with close-range photogrammetric techniques to obtain facade reliefs, which were likewise linked to the database. The designed database consists of 3D models, locations, historical information, and cadastral and land-register data of the selected buildings, together with the other building-related data collected during the project. Using this system, all kinds of spatial and non-spatial analyses were performed, and different thematic maps of the historical city were produced. When the project is finalized, all the historical buildings of Safranbolu, comprising houses, mosques, fountains and a caravansary, will be recorded permanently and their architectural features integrated into the designed spatial information system. In addition, via the Internet, many people will be able to reach the data easily, which should help increase the number of visitors to the town. The project will also serve as guidance for future related studies.

  8. Integration of Jeddah Historical BIM and 3D GIS for Documentation and Restoration of Historical Monument

    NASA Astrophysics Data System (ADS)

    Baik, A.; Yaagoubi, R.; Boehm, J.

    2015-08-01

    This work outlines a new approach for the integration of 3D Building Information Modelling and the 3D Geographic Information System (GIS) to provide semantically rich models and to draw on the benefits of both systems in documenting and analysing cultural heritage sites. Our proposed framework is based on the Jeddah Historical Building Information Modelling process (JHBIM). This JHBIM consists of a Hijazi Architectural Objects Library (HAOL) that supports a higher level of detail (LoD) while decreasing modelling time. The Hijazi Architectural Objects Library has been modelled on the basis of Islamic historical manuscripts and Hijazi architectural pattern books. Moreover, the HAOL is implemented using BIM software called Autodesk Revit. However, this BIM environment still has some limitations with non-standard architectural objects. Hence, we propose to integrate the developed 3D JHBIM with 3D GIS for more advanced analysis. To do so, the JHBIM database is exported and semantically enriched with non-architectural information that is necessary for restoration and preservation of historical monuments. After that, this database is integrated with the 3D model in the 3D GIS solution. At the end of this paper, we illustrate our proposed framework by applying it to a historical building in Old Jeddah, Nasif Historical House, chosen as a test case for the project. First, this building is scanned using a Terrestrial Laser Scanner (TLS) and close-range photogrammetry. Then, the 3D JHBIM based on the HAOL is designed on the Revit platform. Finally, this model is integrated into a 3D GIS solution through Autodesk InfraWorks. The analysis presented in this research highlights the importance of such integration, especially for operational decisions and for sharing historical knowledge about Jeddah Historical City.

  9. Historical hydrology and database on flood events (Apulia, southern Italy)

    NASA Astrophysics Data System (ADS)

    Lonigro, Teresa; Basso, Alessia; Gentile, Francesco; Polemio, Maurizio

    2014-05-01

    Historical data about floods are an important tool for understanding hydrological processes, for estimating hazard scenarios as a basis for civil protection purposes, and for rational land-use management, especially in karstic areas where time series of river flow are not available and river drainage is rare. The research shows the importance of improving an existing flood database with a historical approach, aimed at collecting past flood events in order to better assess the occurrence trend of floods, in this case for the Apulia region (southern Italy). The main source of records of flood events for Apulia was the AVI database (the acronym stands for Italian damaged areas), an existing Italian database that collects data concerning damaging floods from 1918 to 1996. The database was expanded by consulting newspapers, publications, and technical reports from 1996 to 2006. To expand the temporal range further, data were collected by searching the archives of regional libraries. About 700 useful news items from 17 different local newspapers were found from 1876 to 1951. A critical analysis of the 700 news items collected from 1876 to 1952 showed that only 437 were useful for the implementation of the Apulia database. Screening these items revealed about 122 flood events in the entire region. The district of Bari, the regional main town, is the area in which the greatest number of events occurred; the historical analysis confirms this area as flood-prone. There is an overlapping period (1918 to 1952) between the old AVI database and the new historical dataset obtained from newspapers. For this period, the historical research has highlighted new flood events not reported in the existing AVI database and has also added more detail to the events already recorded. This study shows that the database is a dynamic instrument that allows continuous addition of data, even in real time. More details on previous results of this research were recently published (Polemio, 2010; Basso et al., 2012; Lonigro et al., 2013). References: Basso A., Lonigro T. and Polemio M. (2012) "The improvement of historical database on damaging hydrogeological events in the case of Apulia (Southern Italy)". Rendiconti Online della Società Geologica Italiana, 21: 379-380; Lonigro T., Basso A. and Polemio M. (2013) "Historical database on damaging hydrogeological events in Apulia region (Southern Italy)". Rendiconti Online della Società Geologica Italiana, 24: 196-198; Polemio M. (2010) "Historical floods and a recent extreme rainfall event in the Murgia karstic environment (Southern Italy)". Zeitschrift für Geomorphologie, 54(2): 195-219.

  10. Landsat-4 and Landsat-5 thematic mapper band 6 historical performance and calibration

    USGS Publications Warehouse

    Barsi, J.A.; Chander, G.; Markham, B.L.; Higgs, N.; ,

    2005-01-01

    Launched in 1982 and 1984, respectively, the Landsat-4 and -5 Thematic Mappers (TM) are the backbone of an extensive archive of moderate resolution Earth imagery. However, these sensors and their data products were not subjected to the type of intensive monitoring that has been part of the Landsat-7 system since its launch in 1999. With Landsat-4's 11-year and Landsat-5's 20+ year data records, there is a need to understand the historical behavior of the instruments in order to verify the scientific integrity of the archive and processed products. Performance indicators of the Landsat-4 and -5 thermal bands have recently been extracted from a processing system database, allowing for a more complete study of thermal band characteristics and calibration than was previously possible. The database records responses to the internal calibration system, instrument temperatures and applied gains and offsets for each band for every scene processed through the National Landsat Archive Production System (NLAPS). Analysis of this database has allowed for greater understanding of the calibration and improvement in the processing system. This paper will cover the trends in the Landsat-4 and -5 thermal bands, the effect of the changes seen in the trends, and how these trends affect the use of the thermal data.

  11. Glufosinate-ammonium

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Glufosinate-ammoni

  12. Fomesafen

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Fomesafen; CASRN 72

  13. Pirimiphos-methyl

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pirimiphos-methyl

  14. Bromoxynil

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bromoxynil; CASRN 1

  15. Prometryn

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Prometryn; CASRN 72

  16. Linuron

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Linuron; CASRN 330

  17. Dimethoate

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dimethoate; CASRN 6

  18. Methidathion

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Methidathion; CASRN

  19. Phenmedipham

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Phenmedipham; CASRN

  20. Pendimethalin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pendimethalin; CASR

  1. Chlorsulfuron

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorsulfuron; CASR

  2. Thiophanate-methyl

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Thiophanate-methyl

  3. Thiram

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Thiram; CASRN 137-

  4. Benefin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Benefin; CASRN 1861

  5. Cyhalothrin/Karate

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Cyhalothrin/Karate

  6. Triallate

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Triallate; CASRN 23

  7. Propiconazole

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Propiconazole; CASR

  8. Chlorimuron-ethyl

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorimuron-ethyl

  9. Cypermethrin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Cypermethrin; CASRN

  10. Napropamide

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Napropamide; CASRN

  11. Difenzoquat

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Difenzoquat; CASRN

  12. Diphenylamine

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Diphenylamine; CASR

  13. Harmony

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Harmony; CASRN 7927

  14. Bidrin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bidrin; CASRN 141-

  15. Pursuit

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pursuit; CASRN 8133

  16. Imazalil

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Imazalil; CASRN 355

  17. Vinclozolin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Vinclozolin; CASRN

  18. Chlorpropham

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorpropham; CASRN

  19. Bayleton

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bayleton; CASRN 431

  20. Propargite

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Propargite; CASRN 2

  1. Oxyfluorfen

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Oxyfluorfen; CASRN

  2. Amdro

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Amdro; CASRN 67485

  3. Sethoxydim

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Sethoxydim; CASRN 7

  4. Asulam

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Asulam; CASRN 3337

  5. Norflurazon

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Norflurazon; CASRN

  6. Dodine

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dodine; CASRN 2439

  7. Dimethipin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dimethipin; CASRN 5

  8. Bromoxynil octanoate

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bromoxynil octanoate

  9. Folpet

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Folpet; CASRN 133-

  10. Flurprimidol

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Flurprimidol; CASRN

  11. Oryzalin

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Oryzalin; CASRN 190

  12. Lactofen

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Lactofen; CASRN 775

  13. Flutolanil

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Flutolanil; CASRN 6

  14. Acephate

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Acephate; CASRN 305

  15. RESIS-II: An Updated Version of the Original Reservoir Sedimentation Survey Information System (RESIS) Database

    USGS Publications Warehouse

    Ackerman, Katherine V.; Mixon, David M.; Sundquist, Eric T.; Stallard, Robert F.; Schwarz, Gregory E.; Stewart, David W.

    2009-01-01

    The Reservoir Sedimentation Survey Information System (RESIS) database, originally compiled by the Soil Conservation Service (now the Natural Resources Conservation Service) in collaboration with the Texas Agricultural Experiment Station, is the most comprehensive compilation of data from reservoir sedimentation surveys throughout the conterminous United States (U.S.). The database is a cumulative historical archive that includes data from as early as 1755 and as late as 1993. The 1,823 reservoirs included in the database range in size from farm ponds to the largest U.S. reservoirs (such as Lake Mead). Results from 6,617 bathymetric surveys are available in the database. This Data Series provides an improved version of the original RESIS database, termed RESIS-II, and a report describing RESIS-II. The RESIS-II relational database is stored in Microsoft Access and includes more precise location coordinates for most of the reservoirs than the original database but excludes information on reservoir ownership. RESIS-II is anticipated to be a template for further improvements in the database.

  16. Fosetyl-al

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Fosetyl-al; CASRN

  17. NuStar

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) NuStar; CASRN 85509-

  18. Merphos oxide

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Merphos oxide; CASR

  19. S-Ethyl dipropylthiocarbamate (EPTC)

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) S-Ethyl dipropylth

  20. Remote sensing and geographic database management systems applications for the protection and conservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    Palumbo, Gaetano; Powlesland, Dominic

    1996-12-01

    The Getty Conservation Institute is exploring the feasibility of using remote sensing with a geographic database management system (GDBMS) in order to provide archaeological and historic site managers with sound evaluations of the tools available for site and information management. The World Heritage Site of Chaco Canyon, New Mexico, a complex of archaeological sites dating to the 10th to 13th centuries AD, was selected as a test site. Information from excavations conducted there since the 1930s, and a range of documentation generated by the National Park Service, was gathered. NASA's John C. Stennis Space Center contributed multispectral data of the area, and the Jet Propulsion Laboratory contributed data from the ATLAS (airborne terrestrial applications sensor) and CAMS (calibrated airborne multispectral scanner) scanners. Initial findings show that while 'automatic monitoring systems' will probably never be a reality, excellent results are possible through careful comparison of historic and modern photographs and digital analysis of remotely sensed data.

  1. NOAA Data Rescue of Key Solar Databases and Digitization of Historical Solar Images

    NASA Astrophysics Data System (ADS)

    Coffey, H. E.

    2006-08-01

    Over a number of years, the staff at the NOAA National Geophysical Data Center (NGDC) has worked to rescue key solar databases by converting them to digital format and making them available via the World Wide Web. NOAA has had several data rescue programs in which staff compete for funds to rescue important historical data that are languishing in archives and at risk of being lost due to deteriorating condition, loss of the metadata or descriptive text that describe the databases, lack of interest or funding for maintaining the databases, etc. The Solar-Terrestrial Physics Division at NGDC obtained funds to key in some critical historical tabular databases. Recently the NOAA Climate Database Modernization Program (CDMP) funded a project to digitize historical solar images, producing a large online database of historical daily full-disk solar images. The images include the Calcium K, Hydrogen Alpha, and white-light wavelengths, as well as sunspot drawings and comprehensive drawings of a multitude of solar phenomena on one daily map (Fraunhofer maps and Wendelstein drawings). Included in the digitization are high-resolution solar H-alpha images taken at the Boulder Solar Observatory from 1967 to 1984. The scanned daily images document many phases of solar activity, from decadal variation to rotational variation to daily changes. Smaller versions are available online; larger versions are available by request. See http://www.ngdc.noaa.gov/stp/SOLAR/ftpsolarimages.html. The tabular listings and solar imagery will be discussed.

  2. VEMAP phase 2 bioclimatic database. I. Gridded historical (20th century) climate for modeling ecosystem dynamics across the conterminous USA

    Treesearch

    Timothy G.F. Kittel; Nan. A. Rosenbloom; J.A. Royle; C. Daly; W.P. Gibson; H.H. Fisher; P. Thornton; D.N. Yates; S. Aulenbach; C. Kaufman; R. McKeown; Dominque Bachelet; David S. Schimel

    2004-01-01

    Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the...

  3. Shared Web Information Systems for Heritage in Scotland and Wales - Flexibility in Partnership

    NASA Astrophysics Data System (ADS)

    Thomas, D.; McKeague, P.

    2013-07-01

    The Royal Commissions on the Ancient and Historical Monuments of Scotland and Wales were established in 1908 to investigate and record the archaeological and built heritage of their respective countries. The organisations have grown organically over the succeeding century, steadily developing their inventories and collections as card and paper indexes. Computerisation followed in the late 1980s and early 1990s, with RCAHMS releasing Canmore, an online searchable database, in 1998. Following a review of service provision in Wales, RCAHMW entered into partnership with RCAHMS in 2003 to deliver a database for their national inventories and collections. The resultant partnership enables both organisations to develop at their own pace whilst delivering efficiencies through a common experience and a shared IT infrastructure. Through innovative solutions the partnership has also delivered benefits to the wider historic environment community, providing online portals to a range of datasets, ultimately raising public awareness and appreciation of the heritage around them. Now celebrating its 10th year, Shared Web Information Systems for Heritage, or more simply SWISH, continues to underpin the work of both organisations in presenting information about the historic environment to the public.

  4. 4-(2-Methyl-4-chlorophenoxy) butyric acid (MCPB)

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System ( IRIS ) Chemical Assessment Summary U.S . Environmental Protection Agency National Center for Environmental Assessment This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes . ( July 2016 ) 4 - ( 2 - Methyl - 4

  5. Digitized Database of Old Seismograms Recorded in Romania

    NASA Astrophysics Data System (ADS)

    Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea

    2016-08-01

    The aim of this paper is to describe a management system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedures. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to make the historical instrumental data usable again, the analog seismograms are converted to digital images and then to digital waveforms (digitization/vectorization). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.), using a Colortrac Smartlf Cx40 scanner, which provides images in TIFF or JPG format. In the next step, the high-resolution scanned seismograms will be processed to obtain the digital waveforms, using the Teseo2 digitization algorithm developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS Project.

  6. Integrating GIS, Archeology, and the Internet.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sera White; Brenda Ringe Pace; Randy Lee

    2004-08-01

    At the Idaho National Engineering and Environmental Laboratory's (INEEL) Cultural Resource Management Office, a newly developed Data Management Tool (DMT) is improving management and long-term stewardship of cultural resources. The fully integrated system links an archaeological database, a historical database, and a research database to spatial data through a customized user interface using ArcIMS and Active Server Pages. Components of the new DMT are tailored specifically to the INEEL and include automated data entry forms for historic and prehistoric archaeological sites, specialized queries and reports that address both yearly and project-specific documentation requirements, and unique field recording forms. The predictive modeling component increases the DMT's value for land use planning and long-term stewardship. The DMT enhances the efficiency of archive searches, improving customer service, oversight, and management of the large INEEL cultural resource inventory. In the future, the DMT will facilitate data sharing with regulatory agencies, tribal organizations, and the general public.

  7. 4-(2,4-Dichlorophenoxy)butyric acid (2,4-DB)

    Integrated Risk Information System (IRIS)

    Integrated Risk Information System ( IRIS ) Chemical Assessment Summary U.S . Environmental Protection Agency National Center for Environmental Assessment This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes . ( July 2016 ) 4 - ( 2,4 - Dichloro

  8. Copyright, Licensing Agreements and Gateways.

    ERIC Educational Resources Information Center

    Elias, Arthur W.

    1990-01-01

    Discusses technological developments in information distribution and management in relation to concepts of ownership. A historical overview of the concept of copyright is presented; licensing elements for databases are examined; and implications for gateway systems are explored, including ownership, identification of users, and allowable uses of…

  9. Data dictionary and discussion for the midnite mine GIS database. Report of investigations/1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, D.C.; Smith, M.A.; Ferderer, D.A.

    1996-01-18

    A geographic information system (GIS) database has been developed by the U.S. Bureau of Mines (USBM) for the Midnite Mine and surroundings in northeastern Washington State (Stevens County) on the Spokane Indian Reservation. The GIS database was compiled to serve as a repository and source of historical and research information on the mine site. The database also will be used by the Bureau of Land Management and the Bureau of Indian Affairs (as well as others) for environmental assessment and reclamation planning for future remediation and reclamation of the site. This report describes the data in the GIS database and their characteristics. The report also discusses known backgrounds on the data sets and any special considerations encountered by the USBM in developing the database.

  10. Assessing historical fish community composition using surveys, historical collection data, and species distribution models.

    PubMed

    Labay, Ben; Cohen, Adam E; Sissel, Blake; Hendrickson, Dean A; Martin, F Douglas; Sarkar, Sahotra

    2011-01-01

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey; seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities.
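
    By way of illustration, a minimal presence/background species distribution model can be sketched with logistic regression, as below. The summary does not specify the authors' SDM toolchain, so the covariates, values, and modelling choice here are hypothetical stand-ins, not their method.

      # A toy presence/background SDM: label known occurrence records 1 and
      # random background points 0, then fit a classifier on environmental
      # covariates (two invented ones here, e.g. elevation and stream order).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      presence = rng.normal(loc=[300.0, 3.0], scale=[50.0, 1.0], size=(120, 2))
      background = rng.normal(loc=[450.0, 2.0], scale=[120.0, 1.5], size=(500, 2))
      X = np.vstack([presence, background])
      y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])
      model = LogisticRegression(max_iter=1000).fit(X, y)

      # Modeled probability of occurrence at candidate sites in the watershed;
      # species with high probability but no survey detection flag possible losses.
      candidates = rng.normal(loc=[350.0, 2.5], scale=[100.0, 1.2], size=(5, 2))
      print(model.predict_proba(candidates)[:, 1])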

  11. New Technology, New Questions: Using an Internet Database in Chemistry.

    ERIC Educational Resources Information Center

    Hayward, Roger

    1996-01-01

    Describes chemistry software that is part of a balanced educational program. Provides several applications including graphs of various relationships among the elements. Includes a brief historical treatment of the periodic table and compares the traditional historical approach with perspectives gained by manipulating an electronic database. (DDR)

  12. Summary of recovered historical ground-water-level data for Michigan, 1934-2005

    USGS Publications Warehouse

    Cornett, Cassaundra L.; Crowley, Suzanne L.; McGowan, Rose M.; Blumer, Stephen P.; Reeves, Howard W.

    2006-01-01

    This report documents ground-water-level data-recovery efforts performed by the USGS Michigan Water Science Center and provides nearly 300 hydrographs generated from these recovered data. Data recovery is the process of verifying and transcribing data from paper files into the USGS National Water Information System (NWIS) electronic databases appropriate for ground-water-level data. Entering these data into the NWIS databases makes them more useful for USGS analysis and also makes them available to the public through the internet.

  13. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-07-01

    "Prediflood" is a database of historical floods occurred in Catalonia (NE Iberian Peninsula), between 10th Century and 21th Century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, apt to be used in multidisciplinary flood analysis techniques, as meteorological or hydraullic reconstructions.

  14. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-12-01

    "Prediflood" is a database of historical floods that occurred in Catalonia (NE Iberian Peninsula), between the 11th century and the 21st century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, suitable for use in multidisciplinary flood analysis techniques, such as meteorological or hydraulic reconstructions.

  15. Reconstructions of Fire Activity in North America and Europe over the Past 250 Years: A comparison of the Global Charcoal Database with Historical Records

    NASA Astrophysics Data System (ADS)

    Magi, B. I.; Marlon, J. R.; Mouillot, F.; Daniau, A. L.; Bartlein, P. J.; Schaefer, A.

    2017-12-01

    Fire is intertwined with climate variability and human activities in terms of both its causes and consequences, and the most complete understanding will require a multidisciplinary approach. The focus of this study is to compare data-based records of variability in climate and human activities with fire and land cover change records over the past 250 years in North America and Europe. The past 250 years is a critical period for contextualizing the present-day impact of human activities on climate. Data are from the Global Charcoal Database (GCD) and from historical reconstructions of past burning. The GCD comprises sediment records of charcoal accumulation rates collected around the world by dozens of researchers, facilitated by the PAGES Global Paleofire Working Group. The historical reconstruction extends back to 1750 CE and is based on literature and government records where available, completed with non-charcoal proxies, including tree-ring fire scars or storylines, where data are missing. The key data sets are independent records, and the methods and results are independent of any climate or fire-model simulations. Results are presented for Europe and for subsets of North America. Analysis of fire trends from the GCD and the historical reconstruction shows broad agreement, with some regional variations as expected. The western USA, and North America in general, show the best agreement, with present-day departures between the GCD and historical-reconstruction fire trends that may reflect limits in the data themselves. Eastern North America shows agreement, with an increase in fire from 1750 to 1900 and a strong decreasing trend thereafter. We present ideas for why the trends agree and disagree relative to historical events and to the sequence of land-cover change in the regions of interest. Together with careful consideration of uncertainties in the data, these results can be used to constrain Earth System Model simulations that explicitly incorporate historical fire emissions, as well as the pathways of future fire on a warmer planet.

  16. Chirila: Contemporary and Historical Resources for the Indigenous Languages of Australia

    ERIC Educational Resources Information Center

    Bowern, Claire

    2016-01-01

    Here I present the background to, and a description of, a newly developed database of historical and contemporary lexical data for Australian languages (Chirila), concentrating on the Pama-Nyungan family (the largest family in the country). While the database was initially developed in order to facilitate research on cognate words and…

  17. Service Management Database for DSN Equipment

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    This data- and event-driven persistent storage system leverages commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third-normal-form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event-logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible to monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
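
    The event-logging pattern described here can be sketched compactly. The sketch below substitutes SQLite and Python for the actual Oracle implementation, and the table, column, and event names are hypothetical; the SMDB's publication of events to JMS is only noted in a comment.

      # Minimal sketch of a normalized event log: an event-class lookup table
      # plus a log table, written through a single helper (a stand-in for the
      # SMDB's Oracle triggers and stored procedures; names are hypothetical).
      import json
      import sqlite3
      import time

      con = sqlite3.connect(":memory:")
      con.executescript("""
          CREATE TABLE event_class (class_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
          CREATE TABLE event_log (
              event_id  INTEGER PRIMARY KEY,
              class_id  INTEGER REFERENCES event_class(class_id),
              logged_at REAL,
              payload   TEXT);
      """)
      con.execute("INSERT INTO event_class(name) VALUES ('EQUIPMENT_CONFIGURED')")

      def log_event(class_name, payload):
          # A production system would also publish the event to a message bus
          # (the SMDB uses JMS) for delivery to subscribed monitoring clients.
          (class_id,) = con.execute(
              "SELECT class_id FROM event_class WHERE name = ?", (class_name,)).fetchone()
          con.execute(
              "INSERT INTO event_log(class_id, logged_at, payload) VALUES (?, ?, ?)",
              (class_id, time.time(), json.dumps(payload)))

      log_event("EQUIPMENT_CONFIGURED", {"antenna": "DSS-14", "service": "telemetry"})
      print(con.execute("SELECT * FROM event_log").fetchall())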

  18. Database assessment of CMIP5 and hydrological models to determine flood risk areas

    NASA Astrophysics Data System (ADS)

    Limlahapun, Ponthip; Fukui, Hiromichi

    2016-11-01

    Water-related disasters may not be solved with a single scientific method. Based on this premise, we combined logical concepts, sequential results passed between models, and database applications in an attempt to analyse historical and future flooding scenarios. The three main models used in this study are (1) the fifth phase of the Coupled Model Intercomparison Project (CMIP5), to derive precipitation; (2) the Integrated Flood Analysis System (IFAS), to extract the amount of discharge; and (3) the Hydrologic Engineering Center (HEC) model, to generate inundated areas. This research notably focused on integrating data regardless of system-design complexity; database approaches are significantly flexible, manageable, and well supported for system data transfer, which makes them suitable for monitoring a flood. The resulting flood maps, together with real-time stream data, can help local communities identify areas at risk of flooding in advance.

  19. Historical seismometry database project: A comprehensive relational database for historical seismic records

    NASA Astrophysics Data System (ADS)

    Bono, Andrea

    2007-01-01

    The recovery and preservation of the heritage of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest is not purely historical but also scientific: the availability of a large amount of parametric information on the seismic activity of a given area is an undoubted help to the seismological researcher. This article presents the new database project of the Sismos group of the National Institute of Geophysics and Volcanology in Rome. The structure of the new scheme distils the experience gained over five years of activity, and we consider it useful for those approaching computer-based recovery and reprocessing facilities. Several attempts on Italian seismicity have followed one another in past years, but they were almost never true databases: some succeeded because they were well designed and organized, while others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the information managed. For example, it will be possible to view the hypocentral information for a given historical earthquake, to retrieve seismograms in raster, digital, or digitized format, and to access the arrival times of phases at the various stations, the instrument characteristics, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of its organizational principles resemble those behind the databases used for real-time monitoring of seismicity in the principal international research centres: a modern design logic applied to a distinctly historical context. The various design phases are described, from the conceptual level to the physical implementation of the scheme, highlighting the guiding principles, rules, and technical-scientific considerations that lead to the final result: a state-of-the-art relational scheme for historical data.

  20. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago of seven islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large numbers of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies have mainly focused on developing hazard maps for Lanzarote and Tenerife, especially for land-use planning. The main handicap for such studies in the Canary Islands is the scarcity of well-reported historical eruptions, together with the lack of geochronological, geochemical, and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with a large amount of data that, as in the case of the Canary Islands, are sometimes unavailable. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000-scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronological data, (4) geochemical data, (5) structural information, and (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With HADA, the IGME has started a systematic organization of the existing data. In the near future the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, and geochronological data, to support long-term volcanic hazard analysis. HADA will provide information of sufficient quality to map volcanic hazards and to run more reliable hazard models; in addition, it aims to become a sharing system, improving communication between researchers, reducing redundant work, and serving as the reference for geological research in the Canary Islands.

  1. HANZE: a pan-European database of exposure to natural hazards and damaging historical floods since 1870

    NASA Astrophysics Data System (ADS)

    Paprotny, Dominik; Morales-Nápoles, Oswaldo; Jonkman, Sebastiaan N.

    2018-03-01

    The influence of social and economic change on the consequences of natural hazards has been a matter of much interest recently. However, there is a lack of comprehensive, high-resolution data on historical changes in land use, population, or assets available to study this topic. Here, we present the Historical Analysis of Natural Hazards in Europe (HANZE) database, which contains two parts: (1) HANZE-Exposure with maps for 37 countries and territories from 1870 to 2020 in 100 m resolution and (2) HANZE-Events, a compilation of past disasters with information on dates, locations, and losses, currently limited to floods only. The database was constructed using high-resolution maps of present land use and population, a large compilation of historical statistics, and relatively simple disaggregation techniques and rule-based land use reallocation schemes. Data encompassed in HANZE allow one to "normalize" information on losses due to natural hazards by taking into account inflation as well as changes in population, production, and wealth. This database of past events currently contains 1564 records (1870-2016) of flash, river, coastal, and compound floods. The HANZE database is freely available at https://data.4tu.nl/repository/collection:HANZE.
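
    The loss "normalization" mentioned above reduces to simple scaling arithmetic. A minimal sketch follows; the index values are invented for illustration, and HANZE's exact normalization scheme is the one defined by its authors, not this function.

      def normalize_loss(nominal_loss, year, ref_year, cpi, population, wealth):
          # Scale a reported loss to reference-year conditions by correcting
          # for inflation, population change, and wealth change (illustrative
          # only; not HANZE's exact formula).
          return (nominal_loss
                  * cpi[ref_year] / cpi[year]
                  * population[ref_year] / population[year]
                  * wealth[ref_year] / wealth[year])

      # Invented index values for a hypothetical region.
      cpi = {1910: 1.0, 2020: 28.5}
      population = {1910: 5.2e6, 2020: 8.9e6}
      wealth = {1910: 1.0, 2020: 6.3}  # wealth per capita, indexed
      print(normalize_loss(2.0e6, 1910, 2020, cpi, population, wealth))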

  2. The Prediflood database. A new tool for an integrated approach to historical floods in Catalonia (NE Iberian Peninsula), AD 1033-2013

    NASA Astrophysics Data System (ADS)

    Barriendos, Mariano; Carles Balasch Solanes, Josep; Tuset, Jordi; Lluís Ruiz-Bellet, Josep

    2014-05-01

    Available information on historical floods can improve the management of hydroclimatic hazards. This approach is useful in ungauged basins or where instrumental data series are short. Moreover, flood risk is increasing due to both the expansion of human land occupation and the modification of rainfall patterns under the present global climate change scenario. Within the Prediflood Project, we have designed an integrated database of historical floods in Catalonia with the aim of feeding data to: (1) meteorological reconstruction and modelling; (2) hydrological and hydraulic reconstruction; and (3) evaluation of the human impacts of these floods. The first steps of the database design focus on spatial location and on the quality of the data sources, at three levels: (1) historical documentary sources and newspapers contemporary with the floods; (2) local historiography; (3) technical reports. After applying historiographical methodologies, more than 2300 flood records have been added to the database so far. Although the database is still a work in progress, the first analyses are already underway and focus on the largest floods, those with catastrophic effects on more than 15 catchments simultaneously: November 1617, October 1787, September 1842, May 1853, September 1874, January 1898, October 1907, October 1940, September 1962, November 1982, October 1994, and others.

  3. An on-line expert system for diagnosing environmentally induced spacecraft anomalies using CLIPS

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Rolincik, Mark; Koons, Harry C.; Gorney, David

    1993-01-01

    A new rule-based expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two hundred rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates the forward-chaining rules into a backward-chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information, and the user may respond to any question with varying degrees of confidence in an answer or with 'unknown'. The expert system not only provides scientists with needed risk analysis and confidence estimates not available in standard numerical models or databases, but is also an effective learning tool. In addition, the architecture of the expert system allows easy additions to the knowledge base and the database; for example, new frames concerning orbital debris and ionospheric scintillation are being considered. The system currently runs on a MicroVAX and uses the C Language Integrated Production System (CLIPS).
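
    The driver's translation of rules into a backward-chaining dialogue can be illustrated with a minimal sketch. The real system is written in CLIPS with over two hundred rules; the Python below is a stand-in, and the rule and fact names are invented.

      # Hypothetical mini rule base: each diagnosis maps to the findings that
      # must all hold for it to be concluded.
      RULES = {
          "surface_charging": ["anomaly_near_local_midnight", "elevated_kp_index"],
          "single_event_upset": ["anomaly_in_saa", "high_energy_proton_event"],
      }

      def backward_chain(goal, ask, facts=None):
          # Prove `goal` by recursively proving its premises; leaf facts are
          # obtained by prompting the user. (The CLIPS system also accepts
          # graded confidence or "unknown", which this sketch omits.)
          facts = {} if facts is None else facts
          if goal not in facts:
              if goal in RULES:
                  facts[goal] = all(backward_chain(p, ask, facts) for p in RULES[goal])
              else:
                  facts[goal] = ask(goal)
          return facts[goal]

      # Example session: only questions relevant to the goal are ever asked.
      diagnosed = backward_chain("surface_charging",
                                 lambda q: input(q + "? [y/n] ") == "y")
      print("surface charging diagnosed:", diagnosed)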

  4. Traditional Medicine Collection Tracking System (TM-CTS): a database for ethnobotanically driven drug-discovery programs.

    PubMed

    Harris, Eric S J; Erickson, Sean D; Tolopko, Andrew N; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E; Eisenberg, David M

    2011-05-17

    Ethnobotanically driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically driven natural product collection and drug-discovery programs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  5. Traditional Medicine Collection Tracking System (TM-CTS): A Database for Ethnobotanically-Driven Drug-Discovery Programs

    PubMed Central

    Harris, Eric S. J.; Erickson, Sean D.; Tolopko, Andrew N.; Cao, Shugeng; Craycroft, Jane A.; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E.; Eisenberg, David M.

    2011-01-01

    Aim of the study. Ethnobotanically-driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine-Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. Materials and Methods. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. Results. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. Conclusions. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically-driven natural product collection and drug-discovery programs. PMID:21420479

  6. Documentation for Preservation: Methodology and a GIS Database of Three World Heritage Cities in Uzbekistan

    NASA Astrophysics Data System (ADS)

    Vileikis, O.; Escalante Carrillo, E.; Allayarov, S.; Feyzulayev, A.

    2017-08-01

    The historic cities of Uzbekistan are an irreplaceable legacy of the Silk Roads. Uzbekistan currently has four UNESCO World Heritage properties, with hundreds of historic monuments and traditional historic houses. However, the lack of documentation, systematic monitoring, and a digital database of the historic buildings and dwellings within the historic centers is threatening the World Heritage properties and delaying the development of a proper management mechanism for the preservation of this heritage and an interwoven urban development. Unlike the monuments, the traditional historic houses are being demolished without any enforced legal protection, leaving no documentation from which to understand the city's history and urban fabric, or the ways of life, traditions, and customs of past centuries. To fill this gap, from 2008 to 2015 the Principal Department for Preservation and Utilization of Cultural Objects of the Ministry of Culture and Sports of Uzbekistan, with support from the UNESCO Office in Tashkent and in collaboration with several international and local universities and institutions, carried out a survey of the Historic Centre of Bukhara, Itchan Kala, and Samarkand Crossroad of Cultures. The collaborative work over these years has helped to consolidate a methodology and to integrate a GIS database that is currently contributing to the understanding of the outstanding heritage values of these cities, as well as to the development of preservation and management strategies on a solid base of heritage documentation.

  7. A Database Design for the Brazilian Air Force Military Personnel Control System.

    DTIC Science & Technology

    1987-06-01

    ... GIVEN A RECNUM GET MOVING HISTORICAL".
    77 SEL4 PIC X(70) VALUE "4. GIVEN A RECNUM GET NOMINATION HISTORICAL".
    77 SEL5 PIC X(70) VALUE "5. GIVEN A ...
    ... WHERE RECNUM = :RECNUM".
    77 SQL-SEL3-LENGTH PIC S9999 VALUE 150 COMP.
    77 SQL-SEL4 PIC X(150) VALUE "SELECT ABBREV,DTNOM,DTEXO,SITN FROM NOMINATION WHERE RECNUM ...
    77 SQL-SEL4-LENGTH PIC S9999 VALUE 150 COMP.
    77 SQL-SEL5 PIC X(150) VALUE "SELECT ABBREV,DTDES,DTWAIVER,SITD FROM DESIG WHERE RECNUM ...

  8. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data on over 12,200 historical damaging earthquakes, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage in 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historical damaging-earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  9. Assessing Historical Fish Community Composition Using Surveys, Historical Collection Data, and Species Distribution Models

    PubMed Central

    Labay, Ben; Cohen, Adam E.; Sissel, Blake; Hendrickson, Dean A.; Martin, F. Douglas; Sarkar, Sahotra

    2011-01-01

    Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey; seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities. PMID:21966438

  10. Gender Implications in Curriculum and Entrance Exam Grouping: Institutional Factors and Their Effects

    ERIC Educational Resources Information Center

    Hsaieh, Hsiao-Chin; Yang, Chia-Ling

    2014-01-01

    While access to higher education has reached gender parity in Taiwan, the phenomena of gender segregation and stratification by field of study and by division of labor persist. In this article, we trace the historical evolution of Taiwan's education system and use data from large-scale educational databases to analyze the association of…

  11. XML Storage for Magnetotelluric Transfer Functions: Towards a Comprehensive Online Reference Database

    NASA Astrophysics Data System (ADS)

    Kelbert, A.; Blum, C.

    2015-12-01

    Magnetotelluric Transfer Functions (MT TFs) represent most of the information about Earth electrical conductivity found in the raw electromagnetic data, providing inputs for further inversion and interpretation. To be useful for scientific interpretation, they must also contain carefully recorded metadata. Making these data available in a discoverable and citable fashion would provide the most benefit to the scientific community, but such a development requires that the metadata is not only present in the file but is also searchable. The most commonly used MT TF format to date, the historic Society of Exploration Geophysicists Electromagnetic Data Interchange Standard 1987 (EDI), no longer supports some of the needs of modern magnetotellurics, most notably accurate error bar recording. Moreover, the inherent heterogeneity of EDIs and other historic MT TF formats has mostly kept the community away from healthy data-sharing practices. Recently, the MT team at Oregon State University, in collaboration with the IRIS Data Management Center, developed a new, XML-based format for MT transfer functions, and an online system for long-term storage, discovery, and sharing of MT TF data worldwide (IRIS SPUD; www.iris.edu/spud/emtf). The system provides a query page where all of the MT transfer functions collected within the USArray MT experiment and other field campaigns can be searched for and downloaded; an automatic on-the-fly conversion to the historic EDI format is also included. To facilitate conversion to the new, more comprehensive and sustainable XML format for MT TFs, and to streamline inclusion of historic data into the online database, we developed a set of open source format conversion tools, which can be used for rotation of MT TFs as well as a general XML <-> EDI converter (https://seiscode.iris.washington.edu/projects/emtf-fcu). Here, we report on the newly established collaboration between the USGS Geomagnetism Program and Oregon State University to gather and convert both historic and modern-day MT or related transfer functions into the searchable database at the IRIS DMC. The more complete and free access to these previously collected MT TFs will be of great value to MT scientists, both in planning future surveys and in leveraging the value of the new data at the inversion and interpretation stages.
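
    The advantage of a structured format (searchable metadata and error bars that travel with each estimate) can be shown with a toy example. The element layout below is a deliberately simplified invention, not the actual EMTF XML schema, which is considerably richer.

      # Parse transfer-function estimates from a toy XML layout (hypothetical;
      # the real EMTF XML schema carries far more metadata per element).
      import xml.etree.ElementTree as ET

      doc = """
      <TransferFunction site="TEST01">
        <Period value="10.0"><Z re="0.12" im="-0.03" var="1e-4"/></Period>
        <Period value="100.0"><Z re="0.05" im="-0.01" var="4e-5"/></Period>
      </TransferFunction>
      """

      root = ET.fromstring(doc)
      for period in root.iter("Period"):
          z = period.find("Z")
          estimate = complex(float(z.get("re")), float(z.get("im")))
          # The variance attribute is the point: error bars are first-class
          # data here, which the 1987 EDI standard handled poorly.
          print(float(period.get("value")), estimate, float(z.get("var")) ** 0.5)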

  12. Developing a Global Database of Historic Flood Events to Support Machine Learning Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.

    2016-12-01

    There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated based on flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, both traditional methods of flood prediction from physically based models and data-driven techniques such as machine learning require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has only mapped 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global, and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
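
    A minimal sketch of the kind of water-detection step described above, written against the Earth Engine Python API: a cloud-reduced composite is thresholded on a water index. The collection ID, band names, threshold, and region are assumptions for illustration, not the project's actual algorithm.

      # Threshold an NDWI composite to produce a provisional water mask
      # (illustrative parameters; requires an authenticated Earth Engine account).
      import ee

      ee.Initialize()

      region = ee.Geometry.Rectangle([-16.6, 13.9, -15.6, 14.6])  # hypothetical AOI
      composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")   # assumed collection
                   .filterBounds(region)
                   .filterDate("2015-08-01", "2015-09-15")
                   .median())

      # NDWI = (green - NIR) / (green + NIR); band names assume Landsat 8 C02 L2.
      ndwi = composite.normalizedDifference(["SR_B3", "SR_B5"])
      water = ndwi.gt(0.1).selfMask().rename("water")  # threshold is illustrative
      print(water.bandNames().getInfo())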

  13. World Ocean Database and the Global Temperature and Salinity Profile Program Database: Synthesis of historical and near real-time ocean profile data

    NASA Astrophysics Data System (ADS)

    Boyer, T.; Sun, L.; Locarnini, R. A.; Mishonov, A. V.; Hall, N.; Ouellet, M.

    2016-02-01

    The World Ocean Database (WOD) contains systematically quality-controlled historical and recent ocean profile data (temperature, salinity, oxygen, nutrients, carbon cycle variables, biological variables) ranging from Captain Cook's second voyage (1773) to this year's Argo floats. The US National Centers for Environmental Information (NCEI) also hosts the Global Temperature and Salinity Profile Program (GTSPP) Continuously Managed Database (CMD), which provides quality-controlled near-real-time ocean profile data and higher-level quality-controlled temperature and salinity profiles from 1990 to the present. Both databases are used extensively for ocean and climate studies. Synchronization of these two databases will allow easier access and use of comprehensive regional and global ocean profile data sets for ocean and climate studies. Synchronization consists of two distinct phases: 1) a retrospective comparison of data in WOD and GTSPP to ensure that the most comprehensive and highest-quality data set is available to researchers without the need to individually combine and contrast the two datasets, and 2) web services to allow the constantly accruing near-real-time data in the GTSPP CMD and the continuous addition and quality control of historical data in WOD to be made available to researchers together, seamlessly.

  14. Depth-area-duration characteristics of storm rainfall in Texas using Multi-Sensor Precipitation Estimates

    NASA Astrophysics Data System (ADS)

    McEnery, J. A.; Jitkajornwanich, K.

    2012-12-01

    This presentation will describe the methodology and overall system development by which a benchmark dataset of precipitation information has been used to characterize depth-area-duration relations in heavy rain storms occurring over regions of Texas. Over the past two years, project investigators along with the National Weather Service (NWS) West Gulf River Forecast Center (WGRFC) have developed and operated a gateway data system to ingest, store, and disseminate NWS multi-sensor precipitation estimates (MPE). As a pilot project of the Integrated Water Resources Science and Services (IWRSS) initiative, this testbed uses a Structured Query Language (SQL) server to maintain a full archive of current and historic MPE values within the WGRFC service area. These time series values are made available for public access as web services in the standard WaterML format. Having this volume of information maintained in a comprehensive database now allows the use of relational analysis capabilities within SQL to leverage these multi-sensor precipitation values and produce a valuable derivative product. The area of focus for this study is North Texas, using values that originated from the WGRFC, one of three River Forecast Centers currently represented in the holdings of this data system. Over the past two decades, NEXRAD radar has dramatically improved the ability to record rainfall. The resulting hourly MPE values, distributed over an approximately 4 km by 4 km grid, are considered by the NWS to be the "best estimate" of rainfall. The data server provides an accepted standard interface for internet access to the largest time-series dataset of NEXRAD-based MPE values ever assembled. An automated script has been written to search for and extract storms over the 18-year period of record from this massive historical precipitation database. It can extract not only site-specific storms but also duration-specific storms and storms separated by user-defined inter-event periods. A separate storm database has been created to store the selected output. By storing output within tables in a separate database, we can make use of powerful SQL capabilities to perform flexible pattern analysis. Previous efforts have made use of historic data from limited clusters of irregularly spaced physical gauges, with the spatial extent of the observational network a limiting factor. The relatively dense distribution of MPE provides a virtual mesh of observations stretched over the landscape. This work combines a unique hydrologic data resource with programming and database analysis to characterize storm depth-area-duration relationships.
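
    The storm-extraction logic, grouping wet hours into discrete events separated by a user-defined inter-event dry period, can be sketched as follows. The table layout and values are hypothetical stand-ins for the MPE archive.

      # Group a grid cell's wet hours into storm events: a new event starts
      # whenever the dry gap reaches the user-defined inter-event period.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE mpe (cell_id TEXT, ts_hour INTEGER, depth_mm REAL)")
      con.executemany("INSERT INTO mpe VALUES ('c1', ?, ?)",
                      [(0, 2.0), (1, 5.5), (2, 0.0), (9, 1.2), (10, 3.4)])

      MIN_DRY_HOURS = 6  # user-defined inter-event period

      def extract_storms(cell_id):
          rows = con.execute(
              "SELECT ts_hour, depth_mm FROM mpe "
              "WHERE cell_id = ? AND depth_mm > 0 ORDER BY ts_hour",
              (cell_id,)).fetchall()
          storms, current = [], []
          for ts, depth in rows:
              if current and ts - current[-1][0] >= MIN_DRY_HOURS:
                  storms.append(current)
                  current = []
              current.append((ts, depth))
          if current:
              storms.append(current)
          return storms

      for storm in extract_storms("c1"):
          duration = storm[-1][0] - storm[0][0] + 1
          print(f"duration={duration} h  total={sum(d for _, d in storm):.1f} mm")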

  15. Improving retrospective characterization of the food environment for a large region in the United States during a historic time period.

    PubMed

    Auchincloss, Amy H; Moore, Kari A B; Moore, Latetia V; Diez Roux, Ana V

    2012-11-01

    Access to healthy foods has received increasing attention due to the growing prevalence of obesity and diet-related health conditions, yet there are major obstacles to characterizing the local food environment. This study developed a method to retrospectively characterize supermarkets for a single historic year, 2005, in 19 counties in 6 states in the USA, using a supermarket chain-name list and two business databases. Data preparation, merging, overlaps, the added value of various approaches, and differences by census-tract socio-demographic characteristics are described. Agreement between the two food store databases was modest: 63%. Only 55% of the final list of supermarkets were identified by a single business database and selection criteria that included industry classification codes and sales revenue ≥$2 million. The added value of using a supermarket chain-name list and a second business database was the identification of an additional 14% and 30% of supermarkets, respectively. These methods are particularly useful for retrospectively characterizing access to supermarkets during a historic period, when field observations are not feasible and business databases must be used. Copyright © 2012 Elsevier Ltd. All rights reserved.
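
    The selection rule, a grocery industry code plus a revenue floor, with chain-name matches added regardless of reported revenue, can be sketched as below. The records and chain names are invented for illustration (SIC 5411 is the standard grocery-store code).

      # Combine two business-database extracts, then flag supermarkets using an
      # industry code, a $2M sales floor, and a chain-name list (all illustrative).
      import pandas as pd

      db_a = pd.DataFrame({"name": ["FoodMart #12", "QuickStop"],
                           "sic": ["5411", "5411"], "sales_musd": [8.0, 0.4]})
      db_b = pd.DataFrame({"name": ["FoodMart #12", "Corner Grocer"],
                           "sic": ["5411", "5411"], "sales_musd": [7.5, 1.1]})
      chains = ["foodmart", "megagrocer"]  # hypothetical chain-name list

      merged = pd.concat([db_a, db_b]).drop_duplicates(subset="name")
      is_chain = merged["name"].str.lower().str.contains("|".join(chains))
      supermarkets = merged[(merged["sic"] == "5411")
                            & ((merged["sales_musd"] >= 2.0) | is_chain)]
      print(supermarkets)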

  16. Alloplastic temporomandibular joint replacement systems: a systematic review of their history.

    PubMed

    De Meurechy, N; Mommaerts, M Y

    2018-06-01

    This systematic review provides an overview of the historical evolution of the prosthetic temporomandibular joint and addresses the challenges and complications faced by engineers and surgeons, in an effort to shed light on why only a few systems remain available. A better understanding of the history of temporomandibular joint prostheses might also provide insights into the origin of the negative public opinion of the prosthesis, which is based on outdated information. A computerized search using the PubMed Central, ScienceDirect, Wiley Online, Ovid, and Cochrane Library databases was performed following the PRISMA guidelines. Out of 7122 articles identified, 41 met the inclusion criteria for this systematic review. Although several historical reviews have been published previously, none has covered such an extensive time period or has described all designs. Furthermore, besides providing a historical overview, this review discusses the rationale behind the evolution in design and biomaterials, which have largely contributed to the outcomes of the prosthetic systems. Copyright © 2018 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  17. Recording, monitoring and managing the conservation of historic sites: a new application for BGS·SIGMA

    NASA Astrophysics Data System (ADS)

    Tracey, Emily; Smith, Nichola; Lawrie, Ken

    2017-04-01

    The principles and methods of digital data capture can be applied across many scientific and other disciplines, as demonstrated by the use of a custom-modified version of the British Geological Survey's System for Integrated Geoscience Mapping (BGS·SIGMA) to capture data for the conservation of Scottish built heritage. Historic Environment Scotland (HES), an executive agency of the Scottish Government charged with safeguarding the nation's historic environment, is directly responsible for 345 sites of national significance, most of which are built from stone. In common with many other heritage organisations, HES needs a system that can capture, store and present conservation, maintenance and condition indicator information for single or multiple historic sites; this system would then be used to better target and plan effective programmes of maintenance and repair. To meet this need, the British Geological Survey (BGS) has worked with HES to develop an integrated digital site assessment system that provides a refined survey process for stone-built (and other) historic sites. Based on BGS·SIGMA—an integrated workflow underpinned by a geo-spatial platform for data capture and interpretation—the new system is built on top of ESRI's ArcGIS software and underpinned by a relational database. Users can, in the field or in the office, populate custom-built data entry forms to record maintenance issues and repair specifications for architectural elements ranging from individual blocks of stone to entire building elevations. Photographs, sketches, and digital documents can be linked to architectural elements to enhance the usability of the data. Predetermined data fields and supporting dictionaries constrain the input parameters, ensuring a high degree of standardisation in the datasets and therefore enabling highly consistent data extraction and querying. The GIS presentation of the data provides a powerful and versatile planning tool for scheduling works, specifying materials, identifying the skills needed for repairs, and allocating resources more effectively and efficiently. Physical alterations and changes in the overall condition of a single site, or a group of sites, can be monitored accurately over time by repeating the original survey (e.g. every 5 years). Other datasets can be linked to the database, and other geospatially referenced datasets can be superimposed in GIS, adding considerably to the scope and utility of the system. The system can be applied to any geospatially referenced object in a wide range of situations, thus providing many potential applications in conservation, archaeology and other related fields.

  18. Uncertainty in georeferencing current and historic plant locations

    USGS Publications Warehouse

    McEachern, K.; Niessen, K.

    2009-01-01

    With shrinking habitats, weed invasions, and climate change, repeated surveys are becoming increasingly important for rare plant conservation and ecological restoration. We often need to relocate historical sites or provide locations for newly restored sites. Georeferencing is the technique of giving geographic coordinates to the location of a site. Georeferencing has been done historically using verbal descriptions or field maps that accompany voucher collections. New digital technology gives us more exact techniques for mapping and storing location information. Error still exists, however, and even georeferenced locations can be uncertain, especially if error information is not included with the observation. We review the concept of uncertainty in georeferencing and compare several institutional database systems for cataloging error and uncertainty with georeferenced locations. These concepts are widely discussed among geographers, but ecologists and restorationists need to become more aware of issues related to uncertainty to improve our use of spatial information in field studies. © 2009 by the Board of Regents of the University of Wisconsin System.

  19. Use of the Geographic Information System (GIS) in nurseries

    Treesearch

    Brent Olson; Chad Loreth

    2002-01-01

    The use of GIS in nursery operations provides a variety of opportunities. All planning activities can be incorporated into an accessible database. GIS can be used to create ways for employees to access and analyze data. The program can be used for historical record keeping. Use of GIS in planning can improve the efficiency of nursery operations. GIS can easily be used...

  20. Nonlinear analysis of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea

    NASA Astrophysics Data System (ADS)

    Rojo-Garibaldi, Berenice; Salas-de-León, David Alberto; Adela Monreal-Gómez, María; Sánchez-Santillán, Norma Leticia; Salas-Monreal, David

    2018-04-01

    Hurricanes are complex systems that carry large amounts of energy. Their impact often produces natural disasters involving the loss of human lives and of materials and infrastructure valued at billions of US dollars. However, not everything about hurricanes is negative, as hurricanes are the main source of rainwater for the regions where they develop. This study shows a nonlinear analysis of the time series of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea from 1749 to 2012. The hurricane time series was constructed from the North Atlantic basin hurricane database (HURDAT) and published historical information. The hurricane time series provides a unique historical record of information about ocean-atmosphere interactions. The Lyapunov exponent indicated that the system presents chaotic dynamics, and the spectral and nonlinear analyses of the hurricane time series showed chaotic edge behavior. One possible explanation for this chaotic edge is the individual chaotic behavior of hurricanes, either by category or individually regardless of their category, and their behavior on a regular basis.

  1. Digital geologic map and database of the Chesapeake and Ohio Canal National Historical Park and Potomac River corridor, District of Columbia, Virginia, Maryland, and West Virginia

    USGS Publications Warehouse

    Southworth, C. Scott; Brezinski, David K.; Orndorff, Randall C.; Chirico, Peter G.; Lagueux, Kerry M.

    2001-01-01

    The Chesapeake and Ohio (CO) Canal National Historical Park is unique in that it is the only land within the National Park system that crosses five physiographic provinces along a major river. From Georgetown, District of Columbia (D.C.), to Cumberland, Maryland (Md.), the CO Canal provides an opportunity to examine the geologic history of the central Appalachian region and how the canal contributed to the development of this area. The geologic map data cover the 184.5-mile-long park in a 2-mile-wide corridor centered on the Potomac River.

  2. Dangers of Noncritical Use of Historical Plague Data

    PubMed Central

    Roosen, Joris

    2018-01-01

    Researchers have published several articles on historical plague epidemics, drawing on impressive digital databases that contain thousands of recorded outbreaks across Europe over the past several centuries. Through the digitization of preexisting data sets, scholars have unprecedented access to the historical record of plague occurrences. However, although these databases offer new research opportunities, noncritical use and reproduction of preexisting data sets can also limit our understanding of how infectious diseases evolved. Many scholars have performed investigations using Jean-Noël Biraben's data, which contain information on mentions of plague from various kinds of sources, many of which were not cited. When scholars fail to apply source criticism or do not reflect on the content of the data they use, the reliability of their results becomes highly questionable. Researchers using these databases going forward need to verify and restrict content spatially and temporally, and historians should be encouraged to take part in this compilation work.

  3. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Stefan, W; Reeve, D

    2015-06-15

    Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program phantom testing, and the ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database, and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare against previous years and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR-accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.
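
    As a sketch of the kind of ROI analysis such a program performs, the Python below computes a percent-integral-uniformity figure from a circular ROI on a homogeneous phantom image. It is a minimal illustration, not the ACR procedure: robust percentiles stand in for the ACR's 1 cm² high/low ROI means, and the function name and arguments are invented for this example.

        import numpy as np

        def percent_image_uniformity(image, center, radius):
            """Percent integral uniformity, PIU = 100 * (1 - (high-low)/(high+low)),
            over a circular ROI centred on a homogeneous phantom slice."""
            yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
            mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
            vals = image[mask].astype(float)
            high = np.percentile(vals, 99.5)   # proxy for the hottest 1 cm^2 ROI
            low = np.percentile(vals, 0.5)     # proxy for the coldest 1 cm^2 ROI
            return 100.0 * (1.0 - (high - low) / (high + low))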

  4. An Examination of Selected Software Testing Tools: 1992

    DTIC Science & Technology

    1992-12-01

    Figure 27-17. Metrics Manager Database Full Report ... historical test database, the test management and problem reporting tools were examined using the sample test database provided by each supplier. ... track the impact of new methods, organizational structures, and technologies. Metrics Manager is supported by an industry database that allows...

  5. Retrieving Historical Electrorefining Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, Meagan Daniella

    Pyrochemical Operations began at Los Alamos National Laboratory (LANL) during 1962 (1). Electrorefining (ER) has been implemented as a routine process since the 1980s. The process data from the ER operation was recorded but had never been logged in an online database. Without a database, new staff members are hindered in their work by the lack of information. To combat the issue, an Access database was created to collect the historical data. The years from 2000 onward were entered, and queries were created to analyze trends. These trends will aid engineering and operations staff in reaching optimal performance for the startup of the new lines.

  6. Extending the temporal context of ethnobotanical databases: the case study of the Campania region (southern Italy)

    PubMed Central

    De Natale, Antonino; Pezzatti, Gianni Boris; Pollio, Antonino

    2009-01-01

    Background Ethnobotanical studies generally describe the traditional knowledge of a territory according to a "hic et nunc" principle. The need to approach this field by also embedding historical data has been frequently acknowledged. With their long history of civilization, some regions of the Mediterranean basin seem particularly suited to an historical approach. Campania, a region of southern Italy, has been selected for the implementation of a database containing present and past information on plant uses. Methods A relational database has been built on the basis of information gathered from different historical sources, including diaries, travel accounts, and treatises on medicinal plants, written by explorers, botanists, and physicians who travelled in Campania during the last three centuries. Moreover, ethnobotanical uses described in historical herbal collections and in Ancient and Medieval texts from the Mediterranean Region have been included in the database. Results 1672 different uses, ranging from medicinal to alimentary, ceremonial, and veterinary, have been recorded for the 474 species listed in the database. Information is not uniformly spread over the Campanian territory, Sannio being the most studied geographical area and Cilento the least studied. About 50 plants have been continuously used in the last three centuries for the cure of the same affections. A comparison with the uses reported for the same species in Ancient treatises shows that the origin of present ethnomedicine from old learned medical doctrines needs case-by-case confirmation. Conclusion The database is flexible enough to represent a useful tool for researchers who need to store and compare present and previous ethnobotanical uses from Mediterranean countries. PMID:19228384

  7. Development and operations of the astrophysics data system

    NASA Technical Reports Server (NTRS)

    Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)

    2005-01-01

    Abstract service:
    - Continued regular updates of abstracts in the databases, both at SAO and at all mirror sites.
    - Modified loading scripts to accommodate changes in data format (PhyS).
    - Discussed data deliveries with providers to clear up problems with format or other errors (EGU).
    - Continued inclusion of large numbers of historical literature volumes and physics conference volumes xeroxed from the library.
    - Performed systematic fixes on some data sets in the database to account for changes in article numbering (AGU journals).
    - Implemented linking of ADS bibliographic records with multimedia files.
    - Debugged and fixed obscure connection problems with the ADS Korean mirror site which were preventing successful updates of the data holdings.
    - Wrote procedure to parse citation data and characterize an ADS record based on its citation ratios within each database.

  8. Analyzing Historical Primary Source Open Educational Resources: A Blended Pedagogical Approach

    ERIC Educational Resources Information Center

    Oliver, Kevin M.; Purichia, Heather R.

    2018-01-01

    This qualitative case study addresses the need for pedagogical approaches to working with open educational resources (OER). Drawing on a mix of historical thinking heuristics and case analysis approaches, a blended pedagogical strategy and primary source database were designed to build student understanding of historical records with transfer of…

  9. IceVal DatAssistant: An Interactive, Automated Icing Data Management System

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Wright, William B.

    2008-01-01

    As with any scientific endeavor, the foundation of icing research at the NASA Glenn Research Center (GRC) is the data acquired during experimental testing. In the case of the GRC Icing Branch, an important part of this data consists of ice tracings taken following tests carried out in the GRC Icing Research Tunnel (IRT), as well as the associated operational and environmental conditions documented during these tests. Over the years, the large number of experimental runs completed has served to emphasize the need for a consistent strategy for managing this data. To address the situation, the Icing Branch has recently elected to implement the IceVal DatAssistant automated data management system. With the release of this system, all publicly available IRT-generated experimental ice shapes with complete and verifiable conditions have now been compiled into one electronically-searchable database. Simulation software results for the equivalent conditions, generated using the latest version of the LEWICE ice shape prediction code, are likewise included and are linked to the corresponding experimental runs. In addition to this comprehensive database, the IceVal system also includes a graphically-oriented database access utility, which provides reliable and easy access to all data contained in the database. In this paper, the issues surrounding historical icing data management practices are discussed, as well as the anticipated benefits to be achieved as a result of migrating to the new system. A detailed description of the software system features and database content is also provided; and, finally, known issues and plans for future work are presented.

  11. Process identification of the SCR system of coal-fired power plant for de-NOx based on historical operation data.

    PubMed

    Li, Jian; Shi, Raoqiao; Xu, Chuanlong; Wang, Shimin

    2018-05-08

    The selective catalytic reduction (SCR) system, one of the principal flue gas treatment methods employed for NOx emission control in coal-fired power plants, is nonlinear and time-varying, with great inertia and a large time delay. It is difficult for present SCR control systems to achieve satisfactory performance with traditional feedback and feedforward control strategies. Although improved control strategies, such as Smith predictor control and model predictive control, have been proposed for this issue, a well-matched identification model is essentially required to realize superior control of the SCR system. Industrial field experiments are an alternative way to identify the SCR system model in a coal-fired power plant, but they undesirably disturb system operation and are costly in time and manpower. In this paper, a process identification model of the SCR system is proposed and developed by applying the asymptotic method to sufficiently excited data, selected from the original historical operation database of a 350 MW coal-fired power plant according to the condition number of the Fisher information matrix. Numerical simulations are carried out based on the practical historical operation data to evaluate the performance of the proposed model. Results show that the proposed model can efficiently achieve process identification of the SCR system.
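
    The data-selection step can be illustrated with a small sketch. Assuming a simple lagged-input regressor (the paper's actual model structure is richer), the condition number of the sample information matrix flags windows of historical data that are sufficiently excited for identification; all names below are illustrative.

        import numpy as np

        def excitation_condition(u, lags=10):
            """Condition number of the sample information matrix built from
            lagged inputs; lower values indicate better-excited data."""
            N = len(u) - lags
            Phi = np.column_stack([u[i:i + N] for i in range(lags)])
            info = Phi.T @ Phi / N   # sample Fisher-type information matrix
            return np.linalg.cond(info)

        # Windows whose condition number falls below a chosen threshold would
        # be kept and passed to the asymptotic-method identification.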

  12. [Implementation of Oncomelania hupensis monitoring system based on Baidu Map].

    PubMed

    Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang

    2017-10-25

    To construct an Oncomelania hupensis snail monitoring system based on the Baidu Map, basic environmental information about historical and existing snail environments was collected together with the monitoring data on the different kinds of O. hupensis snails, and the O. hupensis snail monitoring system was then built. Geographic Information System (GIS) technology, electronic fence technology, and the Application Program Interface (API) were applied to set up electronic fences around the snail surveillance environments, and the electronic fences were connected to the snail surveillance database. The O. hupensis snail monitoring system based on the Baidu Map was built up, comprising three modules: the O. hupensis Snail Monitoring Environmental Database, the Dynamic Monitoring Platform, and the Electronic Map. Information about monitoring O. hupensis snails can be obtained through a computer or a smartphone. The O. hupensis snail monitoring system based on the Baidu Map is a visual platform for following the process of snail searching and molluscaciding.
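
    The electronic-fence check at the heart of such a system reduces to a point-in-polygon test. The ray-casting sketch below is generic Python, not the Baidu Map API (which supplies its own geometry utilities); the fence is a list of (lon, lat) vertices around a surveillance environment.

        def inside_fence(lon, lat, fence):
            """Ray-casting point-in-polygon test for a geofence."""
            inside = False
            j = len(fence) - 1
            for i in range(len(fence)):
                xi, yi = fence[i]
                xj, yj = fence[j]
                # does the edge (i, j) straddle the horizontal ray at lat?
                if (yi > lat) != (yj > lat) and \
                        lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
                    inside = not inside
                j = i
            return inside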

  13. Historical return on investment and improved quality resulting from development and mining of a hospital laboratory relational database.

    PubMed

    Brimhall, Bradley B; Hall, Timothy E; Walczak, Steven

    2006-01-01

    A hospital laboratory relational database, developed over eight years, has demonstrated significant cost savings and a substantial financial return on investment (ROI). In addition, the database has been used to measurably improve laboratory operations and the quality of patient care.

  14. A Database Evaluation Based on Information Needs of Academic Social Scientists.

    ERIC Educational Resources Information Center

    Buterbaugh, Nancy Toth

    This study evaluates two databases, "Historical Abstracts" and REESWeb, to determine their effectiveness in supporting academic social science research. While many performance evaluations gather quantitative data from isolated query and response transactions, this study is a qualitative evaluation of the databases in the context of…

  15. RadNet Databases and Reports

    EPA Pesticide Factsheets

    EPA’s RadNet data are available for viewing in a searchable database or as PDF reports. Historical and current RadNet monitoring data are used to estimate long-term trends in environmental radiation levels.

  16. Flood trends and river engineering on the Mississippi River system

    USGS Publications Warehouse

    Pinter, N.; Jemberie, A.A.; Remo, J.W.F.; Heine, R.A.; Ickes, B.S.

    2008-01-01

    Along >4000 km of the Mississippi River system, we document that climate, land-use change, and river engineering have contributed to statistically significant increases in flooding over the past 100-150 years. Trends were tested using a database of >8 million hydrological measurements. A geospatial database of historical engineering construction was used to quantify the response of flood levels to each unit of engineering infrastructure. Significant climate- and/or land use-driven increases in flow were detected, but the largest and most pervasive contributors to increased flooding on the Mississippi River system were wing dikes and related navigational structures, followed by progressive levee construction. In the area of the 2008 Upper Mississippi flood, for example, about 2 m of the flood crest is linked to navigational and flood-control engineering. Systemwide, large increases in flood levels were documented at locations and at times of wing-dike and levee construction. Copyright 2008 by the American Geophysical Union.
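
    The abstract does not name the trend test applied to the >8 million measurements; the Mann-Kendall test is a standard choice for hydrological series and stands in below as an illustrative sketch for a single annual-flood record.

        import numpy as np
        from scipy import stats

        def mann_kendall(x):
            """Mann-Kendall trend test (no-ties variance) for one series;
            returns the Z statistic and a two-sided p-value."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var) if s != 0 else 0.0
            return z, 2 * stats.norm.sf(abs(z))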

  17. Specific character of citations in historiography (using the example of Polish history).

    PubMed

    Kolasa, Władysław Marek

    2012-03-01

    The first part of the paper deals with the assessment of international databases in relation to the number of historical publications (representation and relevance in comparison with the model database). The second part is focused on answering the question of whether historiography is governed by bibliometric rules similar to those of the exact sciences or whether it has its own specific character. The empirical basis for this part of the research was a database prepared ad hoc: the Citation Index of the History of Polish Media (CIHPM). Among numerous typically historical features, the main focus was placed on linguistic localism, the specific character of publishing forms, differences in the citing of various sources (contributions and syntheses), and the specific character of authorship (the Lorenz curve and Lotka's law). Slightly more attention was devoted to the half-life indicator and its role in a diachronic study of a scientific field; also, a new indicator (HL14), depicting the distribution of citations younger than the half-life, was introduced. Additionally, comparisons and correlations of selected parameters for the body of historical science (citations, HL14, the Hirsch index, number of publications, volume and other) were also conducted.
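
    The half-life indicator itself is simple to state: it is the median age of cited references at the moment of citing. The sketch below also shows a share-style indicator; reading HL14 as "fraction of citations younger than a 14-year cutoff" is an assumption here, since the abstract does not define it precisely.

        import numpy as np

        def citation_half_life(citing_years, cited_years):
            """Median citation age: half of all citations are younger."""
            ages = np.asarray(citing_years) - np.asarray(cited_years)
            return float(np.median(ages))

        def hl14_style_share(citing_years, cited_years, cutoff=14):
            """Share of citations younger than a cutoff age (the 14-year
            cutoff is an assumption for illustration, not the paper's
            definition)."""
            ages = np.asarray(citing_years) - np.asarray(cited_years)
            return float(np.mean(ages < cutoff))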

  18. Agroclimate.Org: Tools and Information for a Climate Resilient Agriculture in the Southeast USA

    NASA Astrophysics Data System (ADS)

    Fraisse, C.

    2014-12-01

    AgroClimate (http://agroclimate.org) is a web-based system developed to help the agricultural industry in the southeastern USA reduce risks associated with climate variability and change. It includes climate related information and dynamic application tools that interact with a climate and crop database system. Information available includes climate monitoring and forecasts combined with information about crop management practices that help increase the resiliency of the agricultural industry in the region. Recently we have included smartphone apps in the AgroClimate suite of tools, including irrigation management and crop disease alert systems. Decision support tools available in AgroClimate include: (a) Climate risk: expected (probabilistic) and historical climate information and freeze risk; (b) Crop yield risk: expected yield based on soil type, planting date, and basic management practices for selected commodities and historical county yield databases; (c) Crop diseases: disease risk monitoring and forecasting for strawberry and citrus; (d) Crop development: monitoring and forecasting of growing degree-days and chill accumulation; (e) Drought: monitoring and forecasting of selected drought indices; (f) Footprints: carbon and water footprint calculators. The system also provides background information about the main drivers of climate variability and basic information about climate change in the Southeast USA. AgroClimate has been widely used as an educational tool by the Cooperative Extension Services in the region and also by producers. It is now being replicated internationally, with versions implemented in Mozambique and Paraguay.

  19. Compilation of historical water-quality data for selected springs in Texas, by ecoregion

    USGS Publications Warehouse

    Heitmuller, Franklin T.; Williams, Iona P.

    2006-01-01

    Springs are important hydrologic features in Texas. A database of about 2,000 historically documented springs and available spring-flow measurements previously has been compiled and published, but water-quality data remain scattered in published sources. This report by the U.S. Geological Survey, in cooperation with the Texas Parks and Wildlife Department, documents the compilation of data for 232 springs in Texas on the basis of a set of criteria and the development of a water-quality database for the selected springs. The selection of springs for compilation of historical water-quality data in Texas was made using existing digital and hard-copy data, responses to mailed surveys, selection criteria established by various stakeholders, geographic information systems, and digital database queries. Most springs were selected by computing the highest mean spring flows for each Texas level III ecoregion. A brief assessment of the water-quality data for springs in Texas shows that few data are available in the Arizona/New Mexico Mountains, High Plains, East Central Texas Plains, Western Gulf Coastal Plain, and South Central Plains ecoregions. Water-quality data are more abundant for the Chihuahuan Deserts, Edwards Plateau, and Texas Blackland Prairies ecoregions. Selected constituent concentrations in Texas springs, including silica, calcium, magnesium, sodium, potassium, strontium, sulfate, chloride, fluoride, nitrate (nitrogen), dissolved solids, and hardness (as calcium carbonate) are comparatively high in the Chihuahuan Deserts, Southwestern Tablelands, Central Great Plains, and Cross Timbers ecoregions, mostly as a result of subsurface geology. Comparatively low concentrations of selected constituents in Texas springs are associated with the Arizona/New Mexico Mountains, Southern Texas Plains, East Central Texas Plains, and South Central Plains ecoregions.

  20. Environmental Cost Analysis System (ECAS) Status and Compliance Requirements for EM Consolidated Business Center Contracts - 13204

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanford, P.C.; Moe, M.A.; Hombach, W.G.

    2013-07-01

    The Department of Energy (DOE) Office of Environmental Management (EM) has developed a web-accessible database to collect actual cost data from completed EM projects to support cost estimating and analysis. This Environmental Cost Analysis System (ECAS) database was initially deployed in early 2009, containing the cost and parametric data from 77 decommissioning, restoration, and waste management projects completed under the Rocky Flats Closure Project. In subsequent years we have added many more projects to ECAS and now have a total of 280 projects from 8 major DOE sites. This data is now accessible to DOE users through a web-based reporting tool that allows users to tailor report outputs to meet their specific needs. We are using it as a principal resource supporting the EM Consolidated Business Center (EMCBC) and the EM Applied Cost Engineering (ACE) team cost estimating and analysis efforts across the country. The database has received Government Accountability Office review as supporting its recommended improvements in DOE's cost estimating process, as well as review from the DOE Office of Acquisition and Project Management (APM). Moving forward, the EMCBC has developed a Special Contract Requirement clause or 'H-Clause' to be included in all current and future EMCBC procurements, identifying the process that contractors will follow to provide DOE their historical project data in a format compatible with ECAS. Changes to DOE O 413.3B implementation are also in progress to capture historical costs as part of the Critical Decision project closeout process. (authors)

  1. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were to provide a dialog on why classic incident learning systems have been insufficient for patient safety improvement, to discuss failures in treatment verification, and to provide context for the reasons behind these failures and the lessons that can be learned from them. Historically, incident learning in brachytherapy has been performed via database mining, which might include reading event reports and incidents and then incorporating verification procedures to prevent similar incidents. A description of both classic event reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures, based on firsthand knowledge, are presented to evaluate the effectiveness of verification; these failures are described and analyzed by outlining potential pitfalls and problems. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described, including both under-verification and over-verification of various treatment processes. Database mining is an insufficient method for effecting substantial improvements in the practice of brachytherapy. New incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  2. The Rigid Pavement Database: Overview and Data Collection Plan

    DOT National Transportation Integrated Search

    1998-06-01

    The rigid pavement (RP) database contains historical distress data obtained from more than 400 continuously reinforced concrete pavements(CRCP) and jointed concrete pavements (JCP) across the state of Texas. Data collection efforts began in 1974 and ...

  3. Recovery and validation of historical sediment quality data from coastal and estuarine areas: An integrated approach

    USGS Publications Warehouse

    Manheim, F.T.; Buchholtz ten Brink, Marilyn R.; Mecray, E.L.

    1998-01-01

    A comprehensive database of sediment chemistry and environmental parameters has been compiled for Boston Harbor and Massachusetts Bay. This work illustrates methodologies for rescuing and validating sediment data from heterogeneous historical sources. It greatly expands spatial and temporal data coverage of estuarine and coastal sediments. The database contains about 3500 samples containing inorganic chemical, organic, texture and other environmental data dating from 1955 to 1994. Cooperation with local and federal agencies as well as universities was essential in locating and screening documents for the database. More than 80% of references utilized came from sources with limited distribution (gray literature). Task sharing was facilitated by a comprehensive and clearly defined data dictionary for sediments. It also served as a data entry template and flat file format for data processing and as a basis for interpretation and graphical illustration. Standard QA/QC protocols are usually inapplicable to historical sediment data. In this work, outliers and data quality problems were identified by batch screening techniques that also provide visualizations of data relationships and geochemical affinities. No data were excluded, but qualifying comments warn users of problem data. For Boston Harbor, the proportion of irreparable or seriously questioned data was remarkably small (<5%), although concentration values for metals and organic contaminants spanned 3 orders of magnitude for many elements or compounds. Data from the historical database provide alternatives to dated cores for measuring changes in surficial sediment contamination level with time. The data indicate that spatial inhomogeneity in harbor environments can be large with respect to sediment-hosted contaminants. Boston Inner Harbor surficial sediments showed decreases in concentrations of Cu, Hg, and Zn of 40 to 60% over a 17-year period.

  4. The global historical and future economic loss and cost of earthquakes during the production of adaptive worldwide economic fragility functions

    NASA Astrophysics Data System (ADS)

    Daniell, James; Wenzel, Friedemann

    2014-05-01

    Over the past decade, the production of economic indices behind the CATDAT Damaging Earthquakes Database has allowed for the conversion of historical earthquake economic loss and cost events into today's terms using long-term spatio-temporal series of consumer price index (CPI), construction costs, wage indices, and GDP from 1900-2013. As part of the doctoral thesis of Daniell (2014), databases and GIS layers at the country and sub-country level have been produced for population, GDP per capita, and net and gross capital stock (depreciated and non-depreciated) using studies, census information, and the perpetual inventory method. In addition, a detailed study has been undertaken to collect and reproduce as many historical isoseismal maps, macroseismic intensity results and reproductions of earthquakes as possible from the 7208 damaging events in the CATDAT database from 1900 onwards. a) The isoseismal database and population bounds from more than 3000 collected damaging events were compared with the output parameters of GDP and net and gross capital stock per intensity bound and administrative unit, creating a spatial join for analysis. b) The historical costs were divided into shaking/direct ground motion effects and secondary effects costs. The shaking costs were further divided into gross capital stock related and GDP related costs for each administrative unit and intensity bound couplet. c) Costs were then estimated based on the optimisation of the function in terms of costs vs. gross capital stock and costs vs. GDP via regression. Losses were estimated based on net capital stock, looking at the infrastructure age and value at the time of the event. This dataset was then used to develop an economic exposure for each historical earthquake in comparison with the loss recorded in the CATDAT Damaging Earthquakes Database. The production of economic fragility functions for each country was possible using a temporal regression based on the parameters of macroseismic intensity, capital stock estimate, GDP estimate, year, and the combined seismic building index (a created combination of the global seismic code index, building practice factor, building age, and infrastructure vulnerability). The analysis provided three key results: a) The economic fragility functions produced from the 1900-2008 events showed very good correlation with the economic loss and cost of earthquakes from 2009-2013, in real time. This methodology has been extended to other natural disaster types (typhoon, flood, drought). b) Historical earthquake events were reanalysed to check associated historical losses and costs against the expected exposure in terms of intensities. The 1939 Chillan, 1948 Turkmenistan, 1950 Iran, 1972 Managua, 1980 Western Nepal and 1992 Erzincan earthquake events were seen as huge outliers compared with the modelled capital stock and GDP, and thus additional studies were undertaken to check the original loss results. c) A worldwide GIS layer database of capital stock (gross and net), GDP, infrastructure age and economic indices over the period 1900-2013 has been created in conjunction with the CATDAT database in order to define correct economic losses and costs.

  5. PropertyQuest

    Science.gov Websites

    PropertyQuest provides access to a range of site-related information, especially for historic resources. It draws from databases provided by other DC agencies, including the Office of Planning's databases for historic resources, census information, and the boundaries of Chinatown. Information is presented for planning purposes only.

  6. A Relational Encoding of a Conceptual Model with Multiple Temporal Dimensions

    NASA Astrophysics Data System (ADS)

    Gubiani, Donatella; Montanari, Angelo

    The theoretical interest and the practical relevance of a systematic treatment of multiple temporal dimensions is widely recognized in the database and information system communities. Nevertheless, most relational databases have no temporal support at all. A few of them provide a limited support, in terms of temporal data types and predicates, constructors, and functions for the management of time values (borrowed from the SQL standard). One (resp., two) temporal dimensions are supported by historical and transaction-time (resp., bitemporal) databases only. In this paper, we provide a relational encoding of a conceptual model featuring four temporal dimensions, namely, the classical valid and transaction times, plus the event and availability times. We focus our attention on the distinctive technical features of the proposed temporal extension of the relation model. In the last part of the paper, we briefly show how to implement it in a standard DBMS.
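
    A minimal relational sketch of one table carrying all four dimensions is shown below, assuming open periods are encoded with NULL "end" columns; the paper's actual encoding is richer, and SQLite merely stands in for a standard DBMS (the table itself is invented for illustration).

        import sqlite3

        ddl = """
        CREATE TABLE employee_salary (
            emp_id      INTEGER NOT NULL,
            salary      REAL    NOT NULL,
            valid_from  TEXT NOT NULL, valid_to TEXT,  -- when true in reality
            tx_from     TEXT NOT NULL, tx_to    TEXT,  -- when stored in the DB
            event_time  TEXT NOT NULL,                 -- when the generating event occurred
            avail_from  TEXT NOT NULL, avail_to TEXT   -- when the fact was available/known
        );
        """
        conn = sqlite3.connect(":memory:")
        conn.execute(ddl)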

  7. Geodatabase of environmental information for Air Force Plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas, 1990-2004

    USGS Publications Warehouse

    Shah, Sachin D.; Quigley, Sean M.

    2005-01-01

    Air Force Plant 4 (AFP4) and adjacent Naval Air Station-Joint Reserve Base (NAS-JRB) at Fort Worth, Tex., constitute a government-owned, contractor-operated (GOCO) facility that has been in operation since 1942. Contaminants from the facility, primarily volatile organic compounds (VOCs) and metals, have entered the groundwater-flow system through leakage from waste-disposal sites (landfills and pits) and from manufacturing processes (U.S. Air Force, Aeronautical Systems Center, 1995). The U.S. Geological Survey (USGS), in cooperation with the U.S. Air Force (USAF), Aeronautical Systems Center, Environmental Management Directorate (ASC/ENVR), developed a comprehensive database (or geodatabase) of temporal and spatial environmental information associated with the geology, hydrology, and water quality at AFP4 and NAS-JRB. The database of this report provides information about the AFP4 and NAS-JRB study area including sample location names, identification numbers, locations, historical dates, and various measured hydrologic data. This database does not include every sample location at the site, but is limited to an aggregation of selected digital and hardcopy data of the USAF, USGS, and various consultants who have previously or are currently working at the site.

  8. Spatiotemporal historical datasets at micro-level for geocoded individuals in five Swedish parishes, 1813–1914

    PubMed Central

    Hedefalk, Finn; Svensson, Patrick; Harrie, Lars

    2017-01-01

    This paper presents datasets that enable historical longitudinal studies of micro-level geographic factors in a rural setting. These types of datasets are new, as historical demography studies have generally failed to properly include the micro-level geographic factors. Our datasets describe the geography over five Swedish rural parishes, and by linking them to a longitudinal demographic database, we obtain a geocoded population (at the property unit level) for this area for the period 1813–1914. The population is a subset of the Scanian Economic Demographic Database (SEDD). The geographic information includes the following feature types: property units, wetlands, buildings, roads and railroads. The property units and wetlands are stored in object-lifeline time representations (information about the creation, changes and ends of objects is recorded in time), whereas the other feature types are stored as snapshots in time. Thus, the datasets present one of the first opportunities to study historical spatio-temporal patterns at the micro-level. PMID:28398288

  9. Effect of initial conditions of a catchment on seasonal streamflow prediction using ensemble streamflow prediction (ESP) technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar; Zammit, Christian; Hreinsson, Einar; Woods, Ross; Clark, Martyn; Hamlet, Alan

    2013-04-01

    Increased access to water is a key pillar of the New Zealand government's plan for economic growth. Variable climatic conditions, coupled with market drivers and increased demand on water resources, result in critical decisions being made by water managers based on climate and streamflow forecasts. Because many of these decisions have serious economic implications, accurate forecasts of climate and streamflow are of paramount importance (e.g., for irrigated agriculture and electricity generation). New Zealand currently does not have a centralized, comprehensive, and state-of-the-art system in place for providing operational seasonal to interannual streamflow forecasts to guide water resources management decisions. As a pilot effort, we implement and evaluate an experimental ensemble streamflow forecasting system for the Waitaki and Rangitata River basins on New Zealand's South Island using a hydrologic simulation model (TopNet) and the familiar ensemble streamflow prediction (ESP) paradigm for estimating forecast uncertainty. To provide a comprehensive database for evaluation of the forecasting system, a set of retrospective model states simulated by the hydrologic model on the first day of each month was first archived for 1972-2009. Then, using the hydrologic simulation model, each of these historical model states was paired with the retrospective temperature and precipitation time series from each historical water year to create a database of retrospective hindcasts. Using the resulting database, the relative importance of initial state variables (such as soil moisture and snowpack) as fundamental drivers of uncertainty in the forecasts was evaluated for different seasons and lead times. The analysis indicates that the sensitivity of a flow forecast to initial condition uncertainty depends on the hydrological regime and the season of the forecast. However, initial conditions do not have a large impact on seasonal flow uncertainties for snow-dominated catchments. Further analysis indicates that this result remains valid when the hindcast database is conditioned by ENSO classification. As a result, hydrological forecasts based on the ESP technique, where present initial conditions are used with historical forcing data, may be plausible for New Zealand catchments.
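
    The hindcast construction pairs every archived initial state with every historical year of forcing. A schematic Python loop conveys the idea; the names (model.run, the dictionaries) are illustrative and are not TopNet's actual interface.

        def build_hindcasts(model, initial_states, forcings_by_year):
            """ESP hindcasts: one ensemble member per (start state, forcing year)."""
            hindcasts = {}
            for start_date, state in initial_states.items():
                for year, forcing in forcings_by_year.items():
                    # today's catchment state driven by that year's weather
                    hindcasts[(start_date, year)] = model.run(state, forcing)
            return hindcasts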

  10. "Mr. Database" : Jim Gray and the History of Database Technologies.

    PubMed

    Hanwahr, Nils C

    2017-12-01

    Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the development of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g., leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

  11. A k-Vector Approach to Sampling, Interpolation, and Approximation

    NASA Astrophysics Data System (ADS)

    Mortari, Daniele; Rogers, Jonathan

    2013-12-01

    The k-vector search technique is a method designed to perform extremely fast range searching of large databases at computational cost independent of the size of the database. k-vector search algorithms have historically found application in satellite star-tracker navigation systems which index very large star catalogues repeatedly in the process of attitude estimation. Recently, the k-vector search algorithm has been applied to numerous other problem areas including non-uniform random variate sampling, interpolation of 1-D or 2-D tables, nonlinear function inversion, and solution of systems of nonlinear equations. This paper presents algorithms in which the k-vector search technique is used to solve each of these problems in a computationally-efficient manner. In instances where these tasks must be performed repeatedly on a static (or nearly-static) data set, the proposed k-vector-based algorithms offer an extremely fast solution technique that outperforms standard methods.
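
    A compact sketch of the technique, as commonly described in the k-vector literature, is given below; the padding and the one-index widening at the query boundaries are safety choices for this illustration, not necessarily those of the cited algorithms.

        import numpy as np

        def build_kvector(data, pad=1e-9):
            """One-time preprocessing: sort the data and tabulate, for each
            point on a steering line z(j) = m*j + q, how many elements lie
            at or below it."""
            s = np.sort(np.asarray(data, dtype=float))
            n = len(s)
            m = (s[-1] - s[0] + 2 * pad) / (n - 1)
            q = s[0] - pad
            k = np.searchsorted(s, m * np.arange(n) + q, side="right")
            return s, k, m, q

        def range_search(s, k, m, q, lo, hi):
            """Return all elements in [lo, hi]; the cost is two lookups plus
            the output size, independent of len(s)."""
            n = len(s)
            ja = int(np.clip(np.floor((lo - q) / m) - 1, 0, n - 1))  # widen by one
            jb = int(np.clip(np.ceil((hi - q) / m) + 1, 0, n - 1))   # for safety
            cand = s[k[ja]:k[jb]]
            return cand[(cand >= lo) & (cand <= hi)]  # trim boundary extras

    For a static star catalogue, build_kvector runs once; each attitude-estimation query then costs the same regardless of catalogue size, which is the property the star-tracker application exploits.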

  12. Major technology issues in surgical data collection.

    PubMed

    Kirschenbaum, I H

    1995-10-01

    Surgical scheduling and data collection is a field that has a long history as well as a bright future. Historically, surgical cases have always involved some amount of data collection. Surgical cases are scheduled and then reviewed. The classic method, that large black surgical log, actually still exists in many hospitals. In fact, there is nothing new about the recording or reporting of surgical cases. If we only needed to record the information and produce a variety of reports on the data, then modern electronic technology would function as a glorified fast index card box--or, in computer database terms, a simple flat file database. But, this is not the future of technology in surgical case management. This article makes the general case for integrating surgical data systems. Instead of reviewing specific software, it essentially addresses the issues of strategic planning related to this important aspect of medical information systems.

  13. Visualization of historical data for the ATLAS detector controls - DDV

    NASA Astrophysics Data System (ADS)

    Maciejewski, J.; Schlenker, S.

    2017-10-01

    The ATLAS experiment is one of four detectors located at the Large Hadron Collider (LHC) at CERN. Its detector control system (DCS) stores the slow control data acquired within the back-end of distributed WinCC OA applications in an Oracle relational database, which enables the data to be retrieved for future analysis, debugging and detector development. The ATLAS DCS Data Viewer (DDV) is a client-server application providing access to the historical data outside of the experiment network. The server builds optimized SQL queries, retrieves the data from the database and serves it to the clients via HTTP connections. The server also implements protection methods to prevent malicious use of the database. The client is an AJAX-type web application based on the Vaadin framework (built around the Google Web Toolkit, GWT) which gives users the possibility to access the data with ease. The DCS metadata can be selected using a column-tree navigation or a search engine supporting regular expressions. The data is visualized by a selection of output modules such as JavaScript value-over-time plots or a lazy-loading table widget. Additional plugins give the users the possibility to retrieve the data in ROOT format or as an ASCII file. Control system alarms can also be visualized in a dedicated table if necessary. Python mock-up scripts can be generated by the client, allowing the user to query the pythonic DDV server directly, such that users can embed the scripts into more complex analysis programs. Users are also able to store searches and output configurations as XML on the server to share with others via URL or to embed in HTML.

  14. Developing New Rainfall Estimates to Identify the Likelihood of Agricultural Drought in Mesoamerica

    NASA Astrophysics Data System (ADS)

    Pedreros, D. H.; Funk, C. C.; Husak, G. J.; Michaelsen, J.; Peterson, P.; Lasndsfeld, M.; Rowland, J.; Aguilar, L.; Rodriguez, M.

    2012-12-01

    The population of Central America was estimated at ~40 million people in 2009, with 65% in rural areas directly relying on local agricultural production for subsistence, and additional urban populations relying on regional production. Mapping rainfall patterns and values in Central America is a complex task due to the rough topography and the influence of two oceans on either side of this narrow land mass. Characterization of precipitation amounts in both time and space is of great importance for monitoring agricultural food production for food security analysis. With the goal of developing reliable rainfall fields, the Famine Early Warning Systems Network (FEWS NET) has compiled a dense set of historical rainfall stations for Central America through cooperation with meteorological services and global databases. The station database covers the years 1900 to the present, with the highest density between 1970 and 2011. Interpolating the station data by themselves does not provide a reliable result because it ignores the topographical influences which dominate the region. To account for this, climatological rainfall fields were used to support the interpolation of the station data using a modified inverse distance weighting process. By blending the station data with the climatological fields, a historical rainfall database was compiled for 1970-2011 at a 5 km resolution for every five-day interval. This new database opens the door to analyses such as the impact of sea surface temperature on rainfall patterns, changes to the typical dry spell during the rainy season, and characterization of drought frequency and rainfall trends, among others. This study uses the historical database to identify the frequency of agricultural drought in the region and explores possible changes in precipitation patterns during the past 40 years. A threshold of 500 mm of rainfall during the growing season was used to define agricultural drought for maize. This threshold was selected based on assessments of crop conditions from previous seasons and was identified as an amount roughly corresponding to significant crop loss for maize, a major crop in most of the region. Results identify areas in central Honduras and Nicaragua, as well as the Altiplano region in Guatemala, that experienced 15 seasons of agricultural drought for the period May-July during the years 1970-2000. Preliminary results show no clear trend in rainfall, but further investigation is needed to confirm that agricultural drought is not becoming more frequent in this region.
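
    The blending idea can be sketched in a few lines: interpolate station/climatology ratios by inverse-distance weighting, then rescale by the gridded climatology. This is one common way to let a climate field "support" the interpolation; the exact FEWS NET modification is not specified in the abstract, and all names below are illustrative.

        import numpy as np

        def blended_idw(st_xy, st_rain, st_clim, grid_xy, grid_clim, power=2.0):
            """IDW on station/climatology ratios, rescaled by grid climatology.
            st_xy: (n, 2) array of station coordinates; grid_xy: list of (x, y)."""
            ratios = st_rain / np.maximum(st_clim, 0.1)       # guard divide-by-zero
            out = np.empty(len(grid_xy))
            for g, (gx, gy) in enumerate(grid_xy):
                d = np.hypot(st_xy[:, 0] - gx, st_xy[:, 1] - gy)
                w = 1.0 / np.maximum(d, 1e-6) ** power        # inverse-distance weights
                out[g] = grid_clim[g] * np.sum(w * ratios) / np.sum(w)
            return out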

  15. Assessment of Fire Occurrence and Future Fire Potential in Arctic Alaska

    NASA Astrophysics Data System (ADS)

    French, N. H. F.; Jenkins, L. K.; Loboda, T. V.; Bourgeau-Chavez, L. L.; Whitley, M. A.

    2014-12-01

    An analysis of the occurrence of fire in Alaskan tundra was completed using the relatively complete historical record of fire for the region from 1950 to 2013. Spatial fire data for Alaskan tundra regions were obtained from the Alaska Large Fire Database for the region defined from vegetation and ecoregion maps. A detailed presentation of the fire records available for assessing the fire regime of the tundra regions of Alaska, as well as results evaluating fire size, seasonality, and general geographic and temporal trends, is included. Assessment of future fire potential was determined for three future climate scenarios at four locations across the Alaskan tundra using the Canadian Forest Fire Weather Index (FWI). Canadian Earth System Model (CanESM2) weather variables were used for historical (1850-2005) and future (2006-2100) time periods. The database includes 908 fire points and 463 fire polygons within the 482,931 km2 of Alaskan tundra. Based on the polygon database, 25,656 km2 (6,340,000 acres) has burned across the six tundra ecoregions since 1950. Approximately 87% of tundra fires start in June and July across all ecoregions. Combining information from the polygon and point data records, the estimated average fire size in the Alaskan Arctic region is 28.1 km2 (7,070 acres), which is much smaller than in the adjacent boreal forest region, where fires average 203 km2 in high fire years. The largest fire in the database is the Imuruk Basin Fire, which burned 1,680 km2 in 1954 in the Seward Peninsula region (Table 1). Assessment of future fire potential shows that, in comparison with the historical fire record, fire occurrence in Alaskan tundra is expected to increase under all three climate scenarios. Occurrences of high fire weather danger (FWI > 10) are projected to increase in frequency and magnitude in all regions modeled. The changes in fire weather conditions are expected to vary from one region to another in seasonal occurrence as well as in the severity and frequency of high fire weather danger. While the Alaska Large Fire Database represents the best data available for the Alaskan Arctic, and is superior to what is available for many other regions around the world, particularly Arctic regions, these fire records need to be used with some caution due to the mixed origin and minimal validation of the data; this is reviewed in the presentation.

  16. Study of Automatic Image Rectification and Registration of Scanned Historical Aerial Photographs

    NASA Astrophysics Data System (ADS)

    Chen, H. R.; Tseng, Y. H.

    2016-06-01

    Historical aerial photographs directly provide good evidence of past times. The Research Center for Humanities and Social Sciences (RCHSS) of Taiwan's Academia Sinica has collected and scanned numerous historical maps and aerial images of Taiwan and China. Some maps or images have been geo-referenced manually, but most of the historical aerial images have not been registered, since no GPS or IMU data were available in the past to assist with orientation. In our research, we developed an automatic process for matching historical aerial images by SIFT (Scale Invariant Feature Transform) to handle the great quantity of images by computer vision. SIFT is one of the most popular methods for image feature extraction and matching. This algorithm extracts extreme values in scale space into invariant image features, which are robust to changes in rotation, scale, noise, and illumination. We also use RANSAC (Random Sample Consensus) to remove outliers and obtain good conjugate points between photographs. Finally, we manually add control points for registration through least-squares adjustment based on the collinearity equations. In the future, we can use the image feature points of more photographs to build a control image database. Every new image will be treated as a query image. If the feature points of a query image match features in the database, the query image probably overlaps the control images. As the database is updated, more and more query images can be matched and aligned automatically. Other research on multi-period environmental change can then be carried out with these geo-referenced spatio-temporal data.
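
    As a concrete illustration of the SIFT-plus-RANSAC pipeline the authors describe, here is a minimal OpenCV sketch; the file names are placeholders, and the ratio-test threshold and RANSAC tolerance are conventional defaults rather than values from the paper.

    ```python
    # Minimal SIFT + ratio test + RANSAC matching sketch (OpenCV >= 4.4).
    import cv2
    import numpy as np

    img1 = cv2.imread("historical_photo_a.tif", cv2.IMREAD_GRAYSCALE)  # placeholder files
    img2 = cv2.imread("historical_photo_b.tif", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Lowe's ratio test on the two nearest neighbours of each descriptor.
    matcher = cv2.BFMatcher()
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

    # RANSAC removes outliers while estimating a homography between the photos.
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print(f"{int(inlier_mask.sum())} conjugate points kept out of {len(good)} matches")
    ```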

  17. Annual Surveillance Summary: Bacterial Infections in the Military Health System (MHS), 2016

    DTIC Science & Technology

    2017-06-01

    Approved for public release. Distribution is unlimited. The views expressed in this document are those of the authors and do not necessarily reflect ... historic IR to the IR of the current analysis year. c Results are presented to two decimal places to account for low incidence rates. Data Source ... NMCPHC HL7-formatted CHCS microbiology and M2 databases. Prepared by the EpiData Center Department, Navy and Marine Corps Public Health Center, on 21

  18. Estimating economic losses from earthquakes using an empirical approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although the loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
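
    The country-specific multiplicative scaling described above can be illustrated with a toy calculation; all numbers, names, and loss ratios below are invented for illustration and are not the calibrated PAGER values.

    ```python
    # Toy sketch of GDP-based economic exposure scaling, in the spirit of the
    # approach described above. All numbers are placeholders, not PAGER values.
    def economic_exposure(population, gdp_per_capita, alpha):
        """Exposure ~ population x per-capita GDP x country-specific factor alpha,
        where alpha absorbs the gap between annual GDP and exposed built wealth."""
        return population * gdp_per_capita * alpha

    # Population (from a LandScan-like grid) exposed at each shaking intensity.
    pop_by_intensity = {6: 250_000, 7: 90_000, 8: 12_000}   # hypothetical counts
    loss_ratio = {6: 0.001, 7: 0.01, 8: 0.05}               # hypothetical mean loss ratios

    gdp_pc, alpha = 3_000.0, 4.0                            # hypothetical country values
    loss = sum(economic_exposure(pop, gdp_pc, alpha) * loss_ratio[mmi]
               for mmi, pop in pop_by_intensity.items())
    print(f"Estimated loss: ${loss:,.0f}")
    ```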

  19. Kawah Ijen volcanic activity: A review

    USGS Publications Warehouse

    Caudron, Corentin; Syahbana, Devy Kamil; Lecocq, Thomas; van Hinsberg, Vincent; McCausland, Wendy; Triantafyllou, Antoine; Camelbeeck, Thierry; Bernard, Alain; Surono,

    2015-01-01

    Kawah Ijen is a composite volcano located at the easternmost part of Java island in Indonesia and hosts the largest natural acidic lake in the world. We have gathered all available historical reports on Kawah Ijen's activity since 1770 with the purpose of reviewing the temporal evolution of its activity. Most of these observations and studies have been conducted from a geochemical perspective and during sporadic scientific campaigns. Starting in 1991, the seismic activity and a set of volcanic lake parameters became available on a weekly basis. We present a database of those measurements that, combined with the historical reports, allows us to review each eruption or unrest episode that occurred during the last two centuries. Since 2010, the volcanic activity has been monitored by a new multi-disciplinary network, including digital seismic stations and lake level and temperature measurements. This detailed monitoring provides an opportunity for better classifying seismic events and forecasting volcanic unrest at Kawah Ijen, but only with the understanding of the characteristics of this volcanic system gained from the historical review presented here.

  20. Multi-Media and Databases for Historical Enquiry: A Report from the Trenches

    ERIC Educational Resources Information Center

    Hillis, Peter

    2003-01-01

    The Victorian period produced a diverse and rich range of historical source materials including census returns, photographs, film, personal reminiscences, music, cartoons, and posters. Recent changes to the history curriculum emphasise the acquisition of enquiry skills alongside developing knowledge and understanding, which necessitates reference…

  1. Alaska Geochemical Database (AGDB)-Geochemical data for rock, sediment, soil, mineral, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; Bailey, Elizabeth A.; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.

    2011-01-01

    The Alaska Geochemical Database (AGDB) was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This Microsoft Access database serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed in USGS laboratories or, under contracts, in commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects from 1962 to 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy mineral concentrate samples are included in this database. The AGDB includes historical geochemical data originally archived in the USGS Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s and the USGS PLUTO database used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all USGS geochemical data resulted in a significantly improved Alaska geochemical dataset. USGS data that were not previously in the NGDB because the data predate the earliest USGS geochemical databases, or were once excluded for programmatic reasons, are included here in the AGDB and will be added to the NGDB. The AGDB data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB data provided in the linked database may be updated or changed periodically. The data on the DVD and in the data downloads provided with this report are current as of date of publication.

  2. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and for understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of Earth-observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also make suggestions on how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability in damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
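
    The abstract says the events are available through the Earth Engine catalogue; the sketch below shows how such a collection might be queried with the Earth Engine Python API. The collection ID matches the later-published Global Flood Database, but treat it, the band name, and the example coordinates as assumptions rather than details from this talk.

    ```python
    # Sketch of querying a flood-event image collection with the Earth Engine
    # Python API. Collection ID and band name are assumptions based on the
    # later-published Global Flood Database, not taken from this abstract.
    import ee

    ee.Initialize()

    aoi = ee.Geometry.Point([-60.0, -34.0])  # hypothetical point in Argentina

    floods = (ee.ImageCollection("GLOBAL_FLOOD_DB/MODIS_EVENTS/V1")
              .filterDate("2000-01-01", "2017-12-31")
              .filterBounds(aoi))

    print("Events intersecting the point:", floods.size().getInfo())

    # Each image flags flooded pixels; union them into one max-extent layer.
    max_extent = floods.select("flooded").max()
    ```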

  3. The influence of the Bible geographic objects peculiarities on the concept of the spatiotemporal geoinformation system

    NASA Astrophysics Data System (ADS)

    Linsebarth, A.; Moscicka, A.

    2010-01-01

    The article describes the influence of the peculiarities of Bible geographic objects on a spatiotemporal geoinformation system of Bible events. In the proposed concept of this system, special attention is given to the Bible geographic objects and the interrelations between the names of these objects and their locations in geospace. In the Bible, in both the Old and New Testaments, there are hundreds of geographical names, but the selection of these names from the Bible text is not easy. The same names are applied to persons and to geographic objects. The next problem that arises is the classification of the geographical objects, because in several cases the same name is used for towns, mountains, hills, valleys, etc. Another serious problem relates to changes of the names over time. The interrelation between an object's name and its location is also complicated: geographic objects with the same name are located in various places, which should be properly correlated with the Bible text. The above-mentioned peculiarities of Bible geographic objects influenced the concept of the proposed system, which consists of three databases: reference, geographic object, and subject/thematic. The crucial component of this system is the proper architecture of the geographic object database. A very detailed description of this database is presented in the paper. The interrelation between the databases allows Bible readers to connect the Bible text with the geography of the terrain on which the Bible events occurred and, additionally, to access other geographical and historical information related to the geographic objects.

  4. THE ART OF DATA MINING THE MINEFIELDS OF TOXICITY DATABASES TO LINK CHEMISTRY TO BIOLOGY

    EPA Science Inventory

    Toxicity databases have a special role in predictive toxicology, providing ready access to historical information throughout the workflow of discovery, development, and product safety processes in drug development as well as in review by regulatory agencies. To provide accurate i...

  5. Toxicities of oils, dispersants and dispersed oils to algae and aquatic plants: review and database value to resource sustainability

    EPA Science Inventory

    Published toxicity results are reviewed for oils, dispersants and dispersed oils and aquatic plants. The historical phytotoxicity database consists largely of results from a patchwork of research conducted after oil spills to marine waters. Toxicity information is available for ...

  6. Repurposing historical control clinical trial data to provide safety context.

    PubMed

    Bhuyan, Prakash; Desai, Jigar; Louis, Matthew St; Carlsson, Martin; Bowen, Edward; Danielson, Mark; Cantor, Michael N

    2016-02-01

    Billions of dollars spent, millions of subject-hours of clinical trial experience and an abundance of archived study-level data, yet why are historical data underutilized? We propose that historical data can be aggregated to provide safety, background incidence rate and context to improve the evaluation of new medicinal products. Here, we describe the development and application of the eControls database, which is derived from the control arms of studies of licensed products, and discuss the challenges and potential solutions to the proper application of historical data to help interpret product safety. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Correlation Equation of Fault Size, Moment Magnitude, and Height of Tsunami Case Study: Historical Tsunami Database in Sulawesi

    NASA Astrophysics Data System (ADS)

    Julius, Admiral Musa; Pribadi, Sugeng; Muzli, Muzli

    2018-03-01

    Sulawesi, one of the biggest islands in Indonesia, is located at the convergence of two major plates, Eurasia and the Pacific. The NOAA and Novosibirsk Tsunami Laboratory databases show more than 20 tsunamis recorded in Sulawesi since 1820. Based on these data, correlations between tsunami and earthquake parameters need to be determined to verify all past events. The data on magnitudes, fault sizes and tsunami heights used in this study were sourced from the NOAA and Novosibirsk tsunami databases, complemented by the Pacific Tsunami Warning Center (PTWC) catalog. This study aims to find the correlation between moment magnitude, fault size and tsunami height by simple regression. The steps of this research are data collection, processing, and regression analysis. Results show that moment magnitude, fault size and tsunami height are strongly correlated. This analysis is sufficient to support the accuracy of the historical tsunami database for Sulawesi from NOAA, the Novosibirsk Tsunami Laboratory and the PTWC.
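
    As a minimal illustration of the simple-regression step described above, the sketch below fits tsunami height against moment magnitude with scipy; the sample arrays are invented placeholders, not the NOAA/Novosibirsk catalog entries.

    ```python
    # Minimal sketch of the simple-regression step: tsunami height vs. moment
    # magnitude. The arrays are invented placeholders, not catalog values.
    import numpy as np
    from scipy.stats import linregress

    mw = np.array([6.2, 6.8, 7.1, 7.5, 7.9])        # hypothetical moment magnitudes
    height_m = np.array([0.4, 1.1, 1.8, 3.5, 6.0])  # hypothetical tsunami heights (m)

    # Heights grow roughly exponentially with magnitude, so regress log(height).
    fit = linregress(mw, np.log10(height_m))
    print(f"log10(H) = {fit.slope:.2f} * Mw + {fit.intercept:.2f}, r = {fit.rvalue:.2f}")
    ```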

  8. ePORT, NASA's Computer Database Program for System Safety Risk Management Oversight (Electronic Project Online Risk Tool)

    NASA Technical Reports Server (NTRS)

    Johnson, Paul W.

    2008-01-01

    ePORT (electronic Project Online Risk Tool) provides a systematic approach to using an electronic database program to manage program/project risk management processes. This presentation will briefly cover standard risk management procedures, then thoroughly cover NASA's risk management tool called ePORT. The electronic Project Online Risk Tool (ePORT) is a web-based risk management program that provides a common framework to capture and manage risks, independent of a program's/project's size and budget. Covering the full risk management paradigm and providing standardized evaluation criteria for common management reporting, ePORT improves Product Line, Center and Corporate Management insight, simplifies program/project manager reporting, and maintains an archive of data for historical reference.

  9. Introducing GFWED: The Global Fire Weather Database

    NASA Technical Reports Server (NTRS)

    Field, R. D.; Spessa, A. C.; Aziz, N. A.; Camia, A.; Cantin, A.; Carr, R.; de Groot, W. J.; Dowdy, A. J.; Flannigan, M. D.; Manomaiphiboon, K.

    2015-01-01

    The Canadian Forest Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations, beginning in 1980, called the Global Fire WEather Database (GFWED), gridded to a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern Era Retrospective-Analysis for Research and Applications (MERRA), along with two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code (DC) calculations from the gridded data sets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Agreement between the gridded and station-based calculations tended to differ most at low latitudes for strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA DC over the Mato Grosso in Brazil reached unrealistically high values exceeding 1500 during the dry season, but was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation, and they reinforce the need to consider alternative sources of precipitation data. GFWED can be used for analyzing historical relationships between fire weather and fire activity at continental and global scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibrating FWI-based fire prediction models.

  10. Kerman Photovoltaic Power Plant R&D data collection computer system operations and maintenance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosen, P.B.

    1994-06-01

    The Supervisory Control and Data Acquisition (SCADA) system at the Kerman PV Plant monitors 52 analog, 44 status, 13 control, and 4 accumulator data points in real time. A Remote Terminal Unit (RTU) polls 7 peripheral data acquisition units that are distributed throughout the plant once every second, and stores all analog, status, and accumulator points that have changed since the last scan. The R&D computer, which is connected to the SCADA RTU via an RS-232 serial link, polls the RTU once every 5-7 seconds and records any values that have changed since the last scan. A SCADA software package called RealFlex runs on the R&D computer and stores all updated data values taken from the RTU, along with a time-stamp for each, in a historical real-time database. From this database, averages of all analog data points and snapshots of all status points are generated every 10 minutes and appended to a daily file. These files are downloaded via modem by PVUSA/Davis staff every day, and the data are placed into the PVUSA database.
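
    The 10-minute averaging step lends itself to a short sketch; below, pandas resamples an irregular 5-7 second scan record into 10-minute means. The column name and simulated values are invented for illustration.

    ```python
    # Sketch: reduce irregular ~5-7 s SCADA scans to 10-minute averages,
    # mimicking the aggregation step described above. Names are illustrative.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Simulated scan record: irregular timestamps roughly 5-7 s apart.
    gaps = rng.uniform(5, 7, size=1000).cumsum()
    index = pd.Timestamp("1994-06-01") + pd.to_timedelta(gaps, unit="s")
    scans = pd.DataFrame({"array_dc_power_kw": rng.normal(450, 20, size=1000)},
                         index=index)

    # 10-minute averages of analog points, appended to a daily file in practice.
    ten_min = scans.resample("10min").mean()
    print(ten_min.head())
    ```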

  11. Reliability of the Defense Commissary Agency Personnel Property Database.

    DTIC Science & Technology

    2000-02-18

    Departments’ personal property databases. The tests were designed to validate the personal property databases. This report is the second in a series of ... with the completeness of its data, and key data elements were not reliable for estimating the historical costs of real property for the Military ... values of greater than $100,000. However, some of the Military Departments had problems with the completeness of its data, and key data elements

  12. Student Loans: Characteristics of Students and Default Rates at Historically Black Colleges and Universities. Report to Congressional Requesters.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC. Health, Education, and Human Services Div.

    This report to Congress analyzes student loan default rates at historically black colleges and universities (HBCUs), focusing on student characteristics which may predict the likelihood of default. The study examined available student databases for characteristics identified by previous studies as related to level of student loan defaults. Among…

  13. The Significance and Use of Historical Method in Library and Information Science Dissertations, 1984-1999.

    ERIC Educational Resources Information Center

    Thompson, Heather A.

    This study is concerned with the importance of historical method in library and information science research. The research conducted in this study specifically examined library and information science doctoral dissertations written between 1984-1999. The study of the "Digital Dissertations" database found that only eight to seventeen percent of…

  14. Archive and Database as Metaphor: Theorizing the Historical Record

    ERIC Educational Resources Information Center

    Manoff, Marlene

    2010-01-01

    Digital media increase the visibility and presence of the past while also reshaping our sense of history. We have extraordinary access to digital versions of books, journals, film, television, music, art and popular culture from earlier eras. New theoretical formulations of database and archive provide ways to think creatively about these changes…

  15. Listing of Education in Archaeological Programs: The LEAP Clearinghouse, 1989-1989 Summary Report.

    ERIC Educational Resources Information Center

    Knoll, Patricia C., Ed.

    This catalog incorporates information gathered between 1987 and 1989 for inclusion into the National Park Service's Listing of Education in Archaeological Programs (LEAP) computerized database. This database is a listing of federal, state, local and private projects promoting positive public awareness of U.S. archaeology--prehistoric and historic,…

  16. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information, automatically adjusts all required EPCL scripts, and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of the historic events; a database of the power plants; advanced visualization capabilities; and automatic report generation.

  17. A data-driven soft sensor for needle deflection in heterogeneous tissue using just-in-time modelling.

    PubMed

    Rossa, Carlos; Lehmann, Thomas; Sloboda, Ronald; Usmani, Nawaid; Tavakoli, Mahdi

    2017-08-01

    Global modelling has traditionally been the approach taken to estimate needle deflection in soft tissue. In this paper, we propose a new method based on local data-driven modelling of needle deflection. External measurements of needle-tissue interactions are collected from several insertions in ex vivo tissue to form a cloud of data. Inputs to the system are the needle insertion depth, axial rotations, and the forces and torques measured at the needle base by a force sensor. When a new insertion is performed, the just-in-time learning method estimates the model outputs given the current inputs to the needle-tissue system and the historical database. The query is compared to every observation in the database, and each observation is given a weight according to similarity criteria. Only the subset of historical data that is most relevant to the query is selected, and a local linear model is fit to the selected points to estimate the query output. The model outputs the 3D deflection of the needle tip and the needle insertion force. The proposed approach is validated in ex vivo multilayered biological tissue in different needle insertion scenarios. Experimental results in five different case studies indicate an accuracy in predicting needle deflection of 0.81 and 1.24 mm in the horizontal and vertical planes, respectively, and an accuracy of 0.5 N in predicting the needle insertion force over 216 needle insertions.
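
    The just-in-time estimator described above, weighting historical observations by similarity to the query and fitting a local linear model to the most relevant subset, can be sketched as follows; the feature layout, bandwidth, and neighbourhood size are assumptions, not the paper's tuned values.

    ```python
    # Sketch of just-in-time (lazy local) learning: weight historical samples by
    # similarity to the query, keep the most relevant ones, fit a local linear
    # model, and predict the query output. Dimensions/bandwidth are assumptions.
    import numpy as np

    def jit_predict(X_hist, Y_hist, x_query, k=50, bandwidth=1.0):
        """X_hist: (n, d) inputs (depth, rotation, base forces/torques);
        Y_hist: (n, m) outputs (tip deflection components, insertion force)."""
        d = np.linalg.norm(X_hist - x_query, axis=1)       # similarity criterion
        w = np.exp(-(d / bandwidth) ** 2)                  # Gaussian weights
        idx = np.argsort(d)[:k]                            # most relevant subset
        Xs = np.hstack([np.ones((k, 1)), X_hist[idx]])     # add intercept column
        W = np.diag(w[idx])
        # Weighted least squares: beta = (X'WX)^-1 X'WY
        beta = np.linalg.solve(Xs.T @ W @ Xs, Xs.T @ W @ Y_hist[idx])
        return np.concatenate([[1.0], x_query]) @ beta

    # Toy usage with random stand-in data (7 inputs -> 4 outputs).
    rng = np.random.default_rng(1)
    X, Y = rng.normal(size=(500, 7)), rng.normal(size=(500, 4))
    print(jit_predict(X, Y, X[0]))
    ```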

  18. The Use of Intensity Scales In Exploiting Tsunami Historical Databases

    NASA Astrophysics Data System (ADS)

    Barberopoulou, A.; Scheele, F.

    2015-12-01

    Post-disaster assessments for historical tsunami events (>15 years old) are either scarce or contain limited information. In this study, we assess ways to examine tsunami impacts by utilizing data from old events; more importantly, we examine how to best utilize the information contained in tsunami historical databases in order to provide meaningful products that describe the impact of an event. To this end, a tsunami intensity scale was applied to two historical events observed in New Zealand (one local and one distant), in order to utilize the largest possible number of observations in our dataset. This is especially important for countries like New Zealand, where the historical tsunami record is short, going back only to the 19th century, and where instrumental recordings are available only for the most recent events. We found that despite a number of challenges in using intensities (uncertainties partly due to limitations of historical event data), these data, with the help of GIS tools, can be used to produce hazard maps and offer an alternative way to exploit historical tsunami records. Most importantly, the assignment of intensities at each point of observation allows for the utilization of many more observations than if one depends on physical information alone, such as water heights. We hope these results may be used towards developing a well-defined methodology for hazard assessments, refining our knowledge of past tsunami events for which the tsunami sources are largely unknown, and for cases where physical quantities describing the tsunami (e.g. water height, flood depth, run-up) are scarce.

  19. Image management within a PACS

    NASA Astrophysics Data System (ADS)

    Glicksman, Robert A.; Prior, Fred W.; Wilson, Dennis L.

    1993-09-01

    The full benefits of a PACS system cannot be achieved by a departmental system, as films must still be made to service referring physicians and clinics. Therefore, a full hospital PACS must provide workstations throughout the hospital which are connected to the central file server and database, but which present 'clinical' views of radiological data. In contrast to the radiologist, the clinician needs to select examinations from a 'patient list' which presents the results of his/her radiology referrals. The most important data for the clinician is the radiology report, which must be immediately available upon selection of the examination. The images themselves, perhaps with annotations provided by the reading radiologist, must also be available within a few seconds of selection. Furthermore, the ability to display radiologist-selected relevant historical images along with the new examination is necessary in those instances where the radiologist felt that certain historical images were important in the interpretation and diagnosis of the patient. Therefore, views of the new and historical data along clinical lines, conference preparation features, and modality- and body-part-specific selections are also required to successfully implement a full hospital PACS. This paper describes the concepts for image selection and presentation at PACS workstations, both 'diagnostic' workstations within the radiology department and 'clinical' workstations which support the rest of the hospital and outpatient clinics.

  20. Power system modeling and optimization methods vis-a-vis integrated resource planning (IRP)

    NASA Astrophysics Data System (ADS)

    Arsali, Mohammad H.

    1998-12-01

    The state-of-the-art restructuring of power industries is changing the fundamental nature of retail electricity business. As a result, the so-called Integrated Resource Planning (IRP) strategies implemented on electric utilities are also undergoing modifications. Such modifications evolve from the imminent considerations to minimize the revenue requirements and maximize electrical system reliability vis-a-vis capacity-additions (viewed as potential investments). IRP modifications also provide service-design bases to meet the customer needs towards profitability. The purpose of this research as deliberated in this dissertation is to propose procedures for optimal IRP intended to expand generation facilities of a power system over a stretched period of time. Relevant topics addressed in this research towards IRP optimization are as follows: (1) Historical prospective and evolutionary aspects of power system production-costing models and optimization techniques; (2) A survey of major U.S. electric utilities adopting IRP under changing socioeconomic environment; (3) A new technique designated as the Segmentation Method for production-costing via IRP optimization; (4) Construction of a fuzzy relational database of a typical electric power utility system for IRP purposes; (5) A genetic algorithm based approach for IRP optimization using the fuzzy relational database.

  1. A dynamic clinical dental relational database.

    PubMed

    Taylor, D; Naguib, R N G; Boulton, S

    2004-09-01

    The traditional approach to relational database design is based on the logical organization of data into a number of related normalized tables. One assumption is that the nature and structure of the data is known at the design stage. In the case of designing a relational database to store historical dental epidemiological data from individual clinical surveys, the structure of the data is not known until the data is presented for inclusion into the database. This paper addresses the issues concerned with the theoretical design of a clinical dynamic database capable of adapting the internal table structure to accommodate clinical survey data, and presents a prototype database application capable of processing, displaying, and querying the dental data.
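
    A dynamic table structure of the kind described, where the schema is derived from each survey as it arrives rather than fixed at design time, can be sketched with sqlite3; the column handling, field names, and all-TEXT typing below are simplified assumptions.

    ```python
    # Sketch of a "dynamic" relational store: each incoming survey defines its
    # own table structure at load time. Simplified assumption: fields are TEXT.
    import sqlite3

    def load_survey(conn, survey_name, records):
        """records: list of dicts whose keys are unknown until the data arrives."""
        columns = sorted({key for rec in records for key in rec})
        cols_sql = ", ".join(f'"{c}" TEXT' for c in columns)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{survey_name}" ({cols_sql})')
        placeholders = ", ".join("?" for _ in columns)
        rows = [tuple(rec.get(c) for c in columns) for rec in records]
        conn.executemany(f'INSERT INTO "{survey_name}" VALUES ({placeholders})', rows)

    conn = sqlite3.connect(":memory:")
    load_survey(conn, "survey_1999", [
        {"subject_id": "a1", "dmft": "3"},            # hypothetical clinical fields
        {"subject_id": "a2", "dmft": "1", "age": "12"},
    ])
    print(conn.execute('SELECT * FROM "survey_1999"').fetchall())
    ```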

  2. Computerized commodity management system in Thailand and Brazil.

    PubMed

    1984-01-01

    Thailand's National Family Planning Program is testing a computerized contraceptive commodity reporting and management system in 4 provinces with 104 National Family Planning Program (NFPP) reporting entities. Staff in the Brazilian Association of Family Planning Entities (ABEPF) and CPAIMC, a major family planning service agency, have been trained in the use of a computerized commodity distribution management system and are ready to initiate test use. The systems were designed in response to specific commodity management needs of the concerned organizations. Neither distribution program functions as a contraceptive social marketing (CSM) program, but each system reviewed has aspects that are relevant to CSM commodity management needs. Both the Thai and Brazilian systems were designed to be as automatic and user friendly as possible. Both have 3 main databases and perform similar management and reporting functions. Differing program configurations and basic data forms reflect the specific purposes of each system. Databases for the logistics monitoring system in Thailand are the reporting entity (or ID) file; the current month's data file; and the master balance file. The data source is the basic reporting form that also serves as a Request and Issue Voucher for commodities. Editing functions in the program check to see that the current "beginning balance" equals the previous month's ending balance. Indexing functions in the system allow direct access to the records of any reporting entity via the ID number, as well as the sequential processing of records by ID number. 6 reports can be generated: status report by issuing entity; status report by dispensing entity; aggregate status report; out of compliance products report; out of compliance outlets report; and suggested shipment to regional warehouse report. Databases for the distribution management system in Brazil are: the name-ID (client institution) file; the product file; and the data file. The data source is an order form that contains a client code similar to the code used in Thailand. An interrogative data entry program enhances the management function of the system. 8 reports can be individually issued: a status report on back orders by product; a status report on back orders by institution and product; a historical report of year-to-date shipments and value by product; a historical report of year-to-date shipments by client and product; year-to-date payment reports from each client; outstanding invoices by month for the previous 12 months; a product report showing the amount of each product or order with outstanding invoices; and a stock position report.

  3. National Assessment of Oil and Gas Project: Areas of Historical Oil and Gas Exploration and Production in the United States

    USGS Publications Warehouse

    Biewick, Laura

    2008-01-01

    This report contains maps and associated spatial data showing historical oil and gas exploration and production in the United States. Because of the proprietary nature of many oil and gas well databases, the United States was divided into quarter-square-mile cells and the production status of all wells in a given cell was aggregated. Base-map reference data are included, using the U.S. Geological Survey (USGS) National Map, the USGS and American Geological Institute (AGI) Global GIS, and a World Shaded Relief map service from the ESRI Geography Network. A hardcopy map was created to synthesize recorded exploration data from 1859, when the first oil well was drilled in the U.S., to 2005. In addition to the hardcopy map product, the data have been refined and made more accessible through the use of Geographic Information System (GIS) tools. The cell data are included in a GIS database constructed for spatial analysis via the USGS Internet Map Service or by importing the data into GIS software such as ArcGIS. The USGS Internet Map Service provides a number of useful and sophisticated geoprocessing and cartographic functions via an Internet browser. Also included is a video clip of U.S. oil and gas exploration and production through time.

  4. Historical reconstructions of California wildfires vary by data source

    USGS Publications Warehouse

    Syphard, Alexandra D.; Keeley, Jon E.

    2016-01-01

    Historical data are essential for understanding how fire activity responds to different drivers. It is important that the source of data is commensurate with the spatial and temporal scale of the question addressed, but fire history databases are derived from different sources with different restrictions. In California, a frequently used fire history dataset is the State of California Fire and Resource Assessment Program (FRAP) fire history database, which circumscribes fire perimeters at a relatively fine scale. It includes large fires on both state and federal lands but only covers fires that were mapped or had other spatially explicit data. A different database is the state and federal governments’ annual reports of all fires. They are more complete than the FRAP database but are only spatially explicit to the level of county (California Department of Forestry and Fire Protection – Cal Fire) or forest (United States Forest Service – USFS). We found substantial differences between the FRAP database and the annual summaries, with the largest and most consistent discrepancy being in fire frequency. The FRAP database missed the majority of fires and is thus a poor indicator of fire frequency or indicators of ignition sources. The FRAP database is also deficient in area burned, especially before 1950. Even in contemporary records, the huge number of smaller fires not included in the FRAP database account for substantial cumulative differences in area burned. Wildfires in California account for nearly half of the western United States fire suppression budget. Therefore, the conclusions about data discrepancies and the implications for fire research are of broad importance.

  5. Producing a Climate-Quality Database of Global Upper Ocean Profile Temperatures - The IQuOD (International Quality-controlled Ocean Database) Project.

    NASA Astrophysics Data System (ADS)

    Sprintall, J.; Cowley, R.; Palmer, M. D.; Domingues, C. M.; Suzuki, T.; Ishii, M.; Boyer, T.; Goni, G. J.; Gouretski, V. V.; Macdonald, A. M.; Thresher, A.; Good, S. A.; Diggs, S. C.

    2016-02-01

    Historical ocean temperature profile observations provide a critical element for a host of ocean and climate research activities. These include providing initial conditions for seasonal-to-decadal prediction systems, evaluating past variations in sea level and Earth's energy imbalance, ocean state estimation for studying variability and change, and climate model evaluation and development. The International Quality controlled Ocean Database (IQuOD) initiative represents a community effort to create the most globally complete temperature profile dataset, with (intelligent) metadata and assigned uncertainties. With an internationally coordinated effort organized by oceanographers with data and ocean instrumentation expertise, and in close consultation with end users (e.g., climate modelers), the IQuOD initiative will assess and maximize the potential of an irreplaceable collection of ocean temperature observations (tens of millions of profiles collected at a cost of tens of billions of dollars since 1772) to fulfil the demand for a climate-quality global database that can be used with greater confidence in a vast range of climate change related research and services of societal benefit. Progress towards version 1 of the IQuOD database, as well as ongoing and future work, will be presented. More information on IQuOD is available at www.iquod.org.

  6. Alaska Geochemical Database, Version 2.0 (AGDB2)--including “best value” data compilations for rock, sediment, soil, mineral, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.

    2013-01-01

    The Alaska Geochemical Database Version 2.0 (AGDB2) contains new geochemical data compilations in which each geologic material sample has one “best value” determination for each analyzed species, greatly improving speed and efficiency of use. Like the Alaska Geochemical Database (AGDB, http://pubs.usgs.gov/ds/637/) before it, the AGDB2 was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This relational database, created from the Alaska Geochemical Database (AGDB) that was released in 2011, serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables in several different formats describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey personnel and analyzed in U.S. Geological Survey laboratories or, under contracts, in commercial analytical laboratories. These data represent analyses of samples collected as part of various U.S. Geological Survey programs and projects from 1962 through 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy-mineral concentrate samples are included in this database. The AGDB2 includes historical geochemical data originally archived in the U.S. Geological Survey Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s and the U.S. Geological Survey PLUTO database used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all U.S. Geological Survey geochemical data resulted in a significantly improved Alaska geochemical dataset. USGS data that were not previously in the NGDB because the data predate the earliest U.S. Geological Survey geochemical databases, or were once excluded for programmatic reasons, are included here in the AGDB2 and will be added to the NGDB. The AGDB2 data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB2 data provided in the linked database may be updated or changed periodically.

  7. Downscaling climate information for local disease mapping.

    PubMed

    Bernardi, M; Gommes, R; Grieser, J

    2006-06-01

    The study of the impacts of climate on human health requires the interdisciplinary efforts of health professionals, climatologists, biologists, and social scientists to analyze the relationships among physical, biological, ecological, and social systems. Because disease dynamics respond to variations in regional and local climate, climate variability affects every region of the world; diseases are not necessarily limited to specific regions, and vectors may become endemic in other regions. Climate data at the local level are thus essential to evaluate the dynamics of vector-borne disease through health-climate models, and most of the time the climatological databases are not adequate. Climate data at high spatial resolution can be derived by statistical downscaling using historical observations, but the method is limited by the availability of historical data at the local level. Since the 1990s, the statistical interpolation of climate data has been an important priority of the Agrometeorology Group of the Food and Agriculture Organization of the United Nations (FAO), as such data are required for agricultural planning and operational activities at the local level. Since 1995, the date of the first FAO spatial interpolation software for climate data, more advanced applications have been developed, such as SEDI (Satellite Enhanced Data Interpolation) for the downscaling of climate data, and LOCCLIM (Local Climate Estimator) and NEW_LOCCLIM, developed in collaboration with the Deutscher Wetterdienst (German Weather Service), to estimate climatic conditions at locations for which no observations are available. In parallel, an important effort has been made to improve the FAO climate database, which at present includes more than 30,000 stations worldwide, and to expand the database from developing-country coverage to global coverage.

  8. Incident reporting in one UK accident and emergency department.

    PubMed

    Tighe, Catherine M; Woloshynowych, Maria; Brown, Ruth; Wears, Bob; Vincent, Charles

    2006-01-01

    Greater focus is needed on improving patient safety in modern healthcare systems, and the first step to achieving this is to reliably identify the safety issues arising in healthcare. Research has shown the accident and emergency (A&E) department to be a particularly problematic environment where safety is a concern, due to various factors such as the range, nature and urgency of presenting conditions and the high turnover of patients. As in all healthcare environments, clinical incident reporting in A&E is an important tool for detecting safety issues, which can result in identifying solutions, learning from error and enhancing patient safety. This tool must be responsive and flexible to local circumstances and work for the department to support the clinical governance agenda. In this paper, we describe the local processes for reporting and reviewing clinical incidents in one A&E department in a London teaching hospital and report recent changes to the system within the department. We used the historical data recorded on the Trust incident database as a representation of the information that would be available to the department in order to identify the high risk areas. In this paper, we evaluate the internal processes and the information available on the database, and make recommendations to assist the emergency department in their internal processes. These will strengthen the internal review and staff feedback system so that the department can learn from incidents in a consistent manner. The process was reviewed by detailed examination of the centrally held electronic record (Datix database) of all incidents reported in a one-year period. The nature of the incidents and the level and accuracy of information provided in the incident reports were evaluated. There were positive aspects to the established system, including evidence of positive changes made as a result of the reporting process, new initiatives to feed back to staff, and evolution of the programme for reporting and discussing the incidents internally. There appeared to be a mismatch between the recorded events and the category allocated to the incident in the historical record. In addition, the database did not contain complete information for every incident, contributory factors were rarely recorded, and relatively large numbers of incidents were recorded as "other" in the type of incident. There was also observed difficulty in updating the system, as there is at least a month's time lag between the reporting of an incident and the discussion/resolution of issues at the local departmental clinical risk management committee meetings. We used Leape's model for assessing the reporting system as a whole and found the system in the department to be relatively safe, fairly easy to use and moderately effective. Recommendations as a result of this study include the introduction of an electronic reporting system; limiting the number of staff who categorise the incidents, using clear definitions for classifications including a structured framework for contributory factors; and a process that allows incidents to be updated on the database locally after discussion. This research may have implications for the incident reporting process in other specialities as well as in other hospitals.

  9. Clinical decision support of radiotherapy treatment planning: A data-driven machine learning strategy for patient-specific dosimetric decision making.

    PubMed

    Valdes, Gilmer; Simone, Charles B; Chen, Josephine; Lin, Alexander; Yom, Sue S; Pattison, Adam J; Carpenter, Colin M; Solberg, Timothy D

    2017-12-01

    Clinical decision support systems are a growing class of tools with the potential to impact healthcare. This study investigates the construction of a decision support system through which clinicians can efficiently identify which previously approved historical treatment plans are achievable for a new patient, to aid in the selection of therapy. Treatment data were collected for early-stage lung and postoperative oropharyngeal cancers treated using photon (lung and head and neck) and proton (head and neck) radiotherapy. Machine-learning classifiers were constructed using patient-specific feature sets and a library of historical plans. Model accuracy was analyzed using learning curves, and historical treatment plan matching was investigated. The learning curves demonstrate that for these datasets, approximately 45, 60, and 30 patients are needed for a sufficiently accurate classification model for radiotherapy for early-stage lung, postoperative oropharyngeal photon, and postoperative oropharyngeal proton treatments, respectively. The resulting classification model provides a database of previously approved treatment plans that are achievable for a new patient. An exemplary case, highlighting tradeoffs between heart and chest wall dose while holding target dose constant in two historical plans, is provided. We report on the first artificial-intelligence-based clinical decision support system that connects patients to past discrete treatment plans in radiation oncology, and demonstrate for the first time how this tool can enable clinicians to use past decisions to help inform current assessments. Clinicians can be informed of dose tradeoffs between critical structures early in the treatment process, enabling more time to be spent on finding the optimal course of treatment for individual patients. Copyright © 2017. Published by Elsevier B.V.
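
    Learning-curve analysis of the kind the authors describe, checking how many historical plans are needed before classification accuracy plateaus, can be sketched with scikit-learn; the synthetic data and choice of model below are illustrative assumptions, not the study's actual features or classifier.

    ```python
    # Sketch of a learning-curve analysis: how many historical treatment plans
    # are needed for a stable classifier? Synthetic data stands in for plans.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import learning_curve

    # Stand-in for patient-specific feature sets labeled achievable / not.
    X, y = make_classification(n_samples=120, n_features=10, random_state=0)

    sizes, train_scores, val_scores = learning_curve(
        LogisticRegression(max_iter=1000), X, y,
        train_sizes=np.linspace(0.2, 1.0, 5), cv=5)

    for n, score in zip(sizes, val_scores.mean(axis=1)):
        print(f"{n:3d} training plans -> CV accuracy {score:.2f}")
    ```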

  10. Improving Logistics Realism in Command Post Exercises Involving the KC-135A/E/R Aircraft Using a Historical Aircraft Maintenance Database Model

    DTIC Science & Technology

    1990-09-01

    experience in using both the KC-135A/E/R database model and other mathematical models. A statistical analysis of survey ... will be ... statistic. Consequently, differences of opinion among respondents will be amplified. Summary The research methodology provides a sequential set of ... Cost Accounting Directorate (AFLC/ACC). Though used for cost accounting purposes, the VAMOSC system has the capability of cross referencing a WUC

  11. Towards a Selenographic Information System: Apollo 15 Mission Digitization

    NASA Astrophysics Data System (ADS)

    Votava, J. E.; Petro, N. E.

    2012-12-01

    The Apollo missions represent some of the most technically complex and extensively documented explorations ever endeavored by mankind. The surface experiments performed and the lunar samples collected in-situ have helped form our understanding of the Moon's geologic history and the history of our Solar System. Unfortunately, a complication exists in the analysis and accessibility of these large volumes of lunar data and historical Apollo Era documents due to their multiple formats and disconnected web and print locations. Described here is a project to modernize, spatially reference, and link the lunar data into a comprehensive SELENOGRAPHIC INFORMATION SYSTEM, starting with the Apollo 15 mission. Like its terrestrial counter-parts, Geographic Information System (GIS) programs, such as ArcGIS, allow for easy integration, access, analysis, and display of large amounts of spatially-related data. Documentation in this new database includes surface photographs, panoramas, samples and their laboratory studies (major element and rare earth element weight percents), planned and actual vehicle traverses, and field notes. Using high-resolution (<0.25 m/pixel) images from the Lunar Reconnaissance Orbiter Camera (LROC) the rover (LRV) tracks and astronaut surface activities, along with field sketches from the Apollo 15 Preliminary Science Report (Swann, 1972), were digitized and mapped in ArcMap. Point features were created for each documented sample within the Lunar Sample Compendium (Meyer, 2010) and hyperlinked to the appropriate Compendium file (.PDF) at the stable archive site: http://curator.jsc.nasa.gov/lunar/compendium.cfm. Historical Apollo Era photographs and assembled panoramas were included as point features at each station that have been hyperlinked to the Apollo Lunar Surface Journal (ALSJ) online image library. The database has been set up to allow for the easy display of spatial variation of select attributes between samples. Attributes of interest that have data from the Compendium added directly into the database include age (Ga), mass, texture, major oxide elements (weight %), and Th and U (ppm). This project will produce an easily accessible and linked database that can offer technical and scientific information in its spatial context. While it is not possible given the enormous amounts of data, and the small allotment of time, to enter and/or link every detail to its map layer, the links that have been made here direct the user to rich, stable archive websites and web-based databases that are easy to navigate. While this project only created a product for the Apollo 15 mission, it is the model for spatially-referencing the other Apollo missions. Such a comprehensive lunar surface-activities database, a Selenographic Information System, will likely prove invaluable for future lunar studies. References: Meyer, C. (2010), The lunar sample compendium, June 2012 to August 2012, http://curator.jsc.nasa.gov/lunar/compendium.cfm, Astromaterials Res. & Exploration Sci., NASA L. B. Johnson Space Cent., Houston, TX. Swann, G. A. (1972), Preliminary geologic investigation of the Apollo 15 landing site, in Apollo 15 Preliminary Science Report, [NASA SP-289], pp. 5-1 - 5-112, NASA Manned Spacecraft Cent., Washington, D.C.

  12. History-Enriched Spaces for Shared Encounters

    NASA Astrophysics Data System (ADS)

    Konomi, Shin'ichi; Sezaki, Kaoru; Kitsuregawa, Masaru

    We discuss "history-enriched spaces" that use historical data to support shared encounters. We first examine our experiences with DeaiExplorer, a social network display that uses RFID and a historical database to support social interactions at academic conferences. This leads to our discussions on three complementary approaches to addressing the issues of supporting social encounters: (1) embedding historical data in embodied interactions, (2) designing for weakly involved interactions such as social navigation, and (3) designing for privacy. Finally, we briefly describe a preliminary prototype of a proxemics-based awareness tool that considers these approaches.

  13. Environmental Database For Water-Quality Data for the Penobscot River, Maine: Design Documentation and User Guide

    USGS Publications Warehouse

    Giffen, Sarah E.

    2002-01-01

    An environmental database was developed to store water-quality data collected during the 1999 U.S. Geological Survey investigation of the occurrence and distribution of dioxins, furans, and PCBs in the riverbed sediment and fish tissue in the Penobscot River in Maine. The database can be used to store a wide range of detailed information and to perform complex queries on the data it contains. The database also could be used to store data from other historical and any future environmental studies conducted on the Penobscot River and surrounding regions.
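
    A minimal sketch of the store-and-query workflow the report describes, using an in-memory SQLite database; the schema, analytes, and values below are hypothetical, not the USGS database design.

```python
import sqlite3

# Hypothetical two-table layout: sampling sites and analytical results.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE site   (site_id INTEGER PRIMARY KEY, river_mile REAL, medium TEXT);
CREATE TABLE result (site_id INTEGER REFERENCES site(site_id),
                     analyte TEXT, value REAL, units TEXT, year INTEGER);
""")
con.executemany("INSERT INTO site VALUES (?,?,?)",
                [(1, 24.5, "sediment"), (2, 31.0, "fish tissue")])
con.executemany("INSERT INTO result VALUES (?,?,?,?,?)",
                [(1, "PCB", 12.0, "ug/kg", 1999), (2, "PCB", 48.0, "ug/kg", 1999)])

# Example of a cross-table query: mean PCB concentration by sample medium.
for row in con.execute("""
    SELECT s.medium, AVG(r.value)
    FROM result r JOIN site s USING (site_id)
    WHERE r.analyte = 'PCB' GROUP BY s.medium"""):
    print(row)
```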

  14. THE ART OF DATA MINING THE MINEFIELDS OF TOXICITY ...

    EPA Pesticide Factsheets

    Toxicity databases have a special role in predictive toxicology, providing ready access to historical information throughout the workflow of discovery, development, and product safety processes in drug development as well as in review by regulatory agencies. To provide accurate information within a hypothesis-building environment, the content of the databases needs to be rigorously modeled using standards and controlled vocabulary. The utilitarian purposes of databases vary widely, ranging from a source for (Q)SAR datasets for modelers to a basis for

  15. In Brief: Online database for instantaneous streamflow data

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2007-11-01

    Access to U.S. Geological Survey (USGS) historical instantaneous streamflow discharge data, dating from around 1990, is now available online through the Instantaneous Data Archive (IDA), the USGS announced on 14 November. In this new system, users can find streamflow information reported at the time intervals at which it is collected, typically 15-minute to hourly intervals. Although instantaneous data have been available for many years, they were not accessible through the Internet. Robert Hirsch, USGS Associate Director of Water, said, "A user-friendly archive of historical instantaneous streamflow data is important to many different users for such things as floodplain mapping, flood modeling, and estimating pollutant transport." The site currently has about 1.5 billion instantaneous data values from 5500 stream gages in 26 states. The number of states and stream gages with data will continue to increase, according to the USGS. For more information, visit the Web site: http://ida.water.usgs.gov/ida/.
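
    To illustrate why interval-resolved data matter, the sketch below aggregates a toy 15-minute record to hourly means while keeping the within-hour peak, which daily-mean products would smooth away. The values are made up; this is not IDA output or an IDA API.

```python
import pandas as pd

# Toy instantaneous record at 15-minute intervals; discharge values invented.
idx = pd.date_range("1999-06-01", periods=8, freq="15min")
q = pd.Series([120, 118, 125, 140, 160, 155, 150, 148], index=idx,
              name="discharge_cfs")

# Aggregate the 15-minute instantaneous values to hourly mean and peak.
hourly = q.resample("1h").agg(["mean", "max"])
print(hourly)
```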

  16. The PREDICTS database: a global database of how local terrestrial biodiversity responds to human impacts

    PubMed Central

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Senior, Rebecca A; Bennett, Dominic J; Booth, Hollie; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; White, Hannah J; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Ancrenaz, Marc; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Báldi, András; Banks, John E; Barlow, Jos; Batáry, Péter; Bates, Adam J; Bayne, Erin M; Beja, Pedro; Berg, Åke; Berry, Nicholas J; Bicknell, Jake E; Bihn, Jochen H; Böhning-Gaese, Katrin; Boekhout, Teun; Boutin, Céline; Bouyer, Jérémy; Brearley, Francis Q; Brito, Isabel; Brunet, Jörg; Buczkowski, Grzegorz; Buscardo, Erika; Cabra-García, Jimmy; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Carrijo, Tiago F; Carvalho, Anelena L; Castro, Helena; Castro-Luna, Alejandro A; Cerda, Rolando; Cerezo, Alexis; Chauvat, Matthieu; Clarke, Frank M; Cleary, Daniel F R; Connop, Stuart P; D'Aniello, Biagio; da Silva, Pedro Giovâni; Darvill, Ben; Dauber, Jens; Dejean, Alain; Diekötter, Tim; Dominguez-Haydar, Yamileth; Dormann, Carsten F; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Elek, Zoltán; Entling, Martin H; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Ficetola, Gentile F; Filgueiras, Bruno K C; Fonte, Steven J; Fraser, Lauchlan H; Fukuda, Daisuke; Furlani, Dario; Ganzhorn, Jörg U; Garden, Jenni G; Gheler-Costa, Carla; Giordani, Paolo; Giordano, Simonetta; Gottschalk, Marco S; Goulson, Dave; Gove, Aaron D; Grogan, James; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hawes, Joseph E; Hébert, Christian; Helden, Alvin J; Henden, John-André; Hernández, Lionel; Herzog, Felix; Higuera-Diaz, Diego; Hilje, Branko; Horgan, Finbarr G; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Jonsell, Mats; Jung, Thomas S; Kapoor, Vena; Kati, Vassiliki; Katovai, Eric; Kessler, Michael; Knop, Eva; Kolb, Annette; Kőrösi, Ádám; Lachat, Thibault; Lantschner, Victoria; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Letcher, Susan G; Littlewood, Nick A; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Marin-Spiotta, Erika; Marshall, E J P; Martínez, Eliana; Mayfield, Margaret M; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Naidoo, Robin; Nakamura, Akihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario A; Neuschulz, Eike L; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Nöske, Nicole M; O'Dea, Niall; Oduro, William; Ofori-Boateng, Caleb; Oke, Chris O; Osgathorpe, Lynne M; Paritsis, Juan; Parra-H, Alejandro; Pelegrin, Nicolás; Peres, Carlos A; Persson, Anna S; Petanidou, Theodora; Phalan, Ben; Philips, T Keith; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Ribeiro, Danilo B; Richardson, Barbara A; Richardson, Michael J; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rosselli, Loreta; Rossiter, Stephen J; Roulston, T'ai H; Rousseau, Laurent; 
Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Samnegård, Ulrika; Schüepp, Christof; Schweiger, Oliver; Sedlock, Jodi L; Shahabuddin, Ghazala; Sheil, Douglas; Silva, Fernando A B; Slade, Eleanor M; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Stout, Jane C; Struebig, Matthew J; Sung, Yik-Hei; Threlfall, Caragh G; Tonietto, Rebecca; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Vanbergen, Adam J; Vassilev, Kiril; Verboven, Hans A F; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Walker, Tony R; Wang, Yanping; Watling, James I; Wells, Konstans; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Woodcock, Ben A; Yu, Douglas W; Zaitsev, Andrey S; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    2014-01-01

    Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species’ threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that support computation of a range of biodiversity indicators, is necessary to enable better understanding of historical declines and to project – and avert – future declines. We describe and assess a new database of more than 1.6 million samples from 78 countries representing over 28,000 species, collated from existing spatial comparisons of local-scale biodiversity exposed to different intensities and types of anthropogenic pressures, from terrestrial sites around the world. The database contains measurements taken in 208 (of 814) ecoregions, 13 (of 14) biomes, 25 (of 35) biodiversity hotspots and 16 (of 17) megadiverse countries. The database contains more than 1% of the total number of all species described, and more than 1% of the described species within many taxonomic groups – including flowering plants, gymnosperms, birds, mammals, reptiles, amphibians, beetles, lepidopterans and hymenopterans. The dataset, which is still being added to, is therefore already considerably larger and more representative than those used by previous quantitative models of biodiversity trends and responses. The database is being assembled as part of the PREDICTS project (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems – http://www.predicts.org.uk). We make site-level summary data available alongside this article. The full database will be publicly available in 2015. PMID:25558364

  17. The PREDICTS database: a global database of how local terrestrial biodiversity responds to human impacts.

    PubMed

    Hudson, Lawrence N; Newbold, Tim; Contu, Sara; Hill, Samantha L L; Lysenko, Igor; De Palma, Adriana; Phillips, Helen R P; Senior, Rebecca A; Bennett, Dominic J; Booth, Hollie; Choimes, Argyrios; Correia, David L P; Day, Julie; Echeverría-Londoño, Susy; Garon, Morgan; Harrison, Michelle L K; Ingram, Daniel J; Jung, Martin; Kemp, Victoria; Kirkpatrick, Lucinda; Martin, Callum D; Pan, Yuan; White, Hannah J; Aben, Job; Abrahamczyk, Stefan; Adum, Gilbert B; Aguilar-Barquero, Virginia; Aizen, Marcelo A; Ancrenaz, Marc; Arbeláez-Cortés, Enrique; Armbrecht, Inge; Azhar, Badrul; Azpiroz, Adrián B; Baeten, Lander; Báldi, András; Banks, John E; Barlow, Jos; Batáry, Péter; Bates, Adam J; Bayne, Erin M; Beja, Pedro; Berg, Åke; Berry, Nicholas J; Bicknell, Jake E; Bihn, Jochen H; Böhning-Gaese, Katrin; Boekhout, Teun; Boutin, Céline; Bouyer, Jérémy; Brearley, Francis Q; Brito, Isabel; Brunet, Jörg; Buczkowski, Grzegorz; Buscardo, Erika; Cabra-García, Jimmy; Calviño-Cancela, María; Cameron, Sydney A; Cancello, Eliana M; Carrijo, Tiago F; Carvalho, Anelena L; Castro, Helena; Castro-Luna, Alejandro A; Cerda, Rolando; Cerezo, Alexis; Chauvat, Matthieu; Clarke, Frank M; Cleary, Daniel F R; Connop, Stuart P; D'Aniello, Biagio; da Silva, Pedro Giovâni; Darvill, Ben; Dauber, Jens; Dejean, Alain; Diekötter, Tim; Dominguez-Haydar, Yamileth; Dormann, Carsten F; Dumont, Bertrand; Dures, Simon G; Dynesius, Mats; Edenius, Lars; Elek, Zoltán; Entling, Martin H; Farwig, Nina; Fayle, Tom M; Felicioli, Antonio; Felton, Annika M; Ficetola, Gentile F; Filgueiras, Bruno K C; Fonte, Steven J; Fraser, Lauchlan H; Fukuda, Daisuke; Furlani, Dario; Ganzhorn, Jörg U; Garden, Jenni G; Gheler-Costa, Carla; Giordani, Paolo; Giordano, Simonetta; Gottschalk, Marco S; Goulson, Dave; Gove, Aaron D; Grogan, James; Hanley, Mick E; Hanson, Thor; Hashim, Nor R; Hawes, Joseph E; Hébert, Christian; Helden, Alvin J; Henden, John-André; Hernández, Lionel; Herzog, Felix; Higuera-Diaz, Diego; Hilje, Branko; Horgan, Finbarr G; Horváth, Roland; Hylander, Kristoffer; Isaacs-Cubides, Paola; Ishitani, Masahiro; Jacobs, Carmen T; Jaramillo, Víctor J; Jauker, Birgit; Jonsell, Mats; Jung, Thomas S; Kapoor, Vena; Kati, Vassiliki; Katovai, Eric; Kessler, Michael; Knop, Eva; Kolb, Annette; Kőrösi, Ádám; Lachat, Thibault; Lantschner, Victoria; Le Féon, Violette; LeBuhn, Gretchen; Légaré, Jean-Philippe; Letcher, Susan G; Littlewood, Nick A; López-Quintero, Carlos A; Louhaichi, Mounir; Lövei, Gabor L; Lucas-Borja, Manuel Esteban; Luja, Victor H; Maeto, Kaoru; Magura, Tibor; Mallari, Neil Aldrin; Marin-Spiotta, Erika; Marshall, E J P; Martínez, Eliana; Mayfield, Margaret M; Mikusinski, Grzegorz; Milder, Jeffrey C; Miller, James R; Morales, Carolina L; Muchane, Mary N; Muchane, Muchai; Naidoo, Robin; Nakamura, Akihiro; Naoe, Shoji; Nates-Parra, Guiomar; Navarrete Gutierrez, Dario A; Neuschulz, Eike L; Noreika, Norbertas; Norfolk, Olivia; Noriega, Jorge Ari; Nöske, Nicole M; O'Dea, Niall; Oduro, William; Ofori-Boateng, Caleb; Oke, Chris O; Osgathorpe, Lynne M; Paritsis, Juan; Parra-H, Alejandro; Pelegrin, Nicolás; Peres, Carlos A; Persson, Anna S; Petanidou, Theodora; Phalan, Ben; Philips, T Keith; Poveda, Katja; Power, Eileen F; Presley, Steven J; Proença, Vânia; Quaranta, Marino; Quintero, Carolina; Redpath-Downing, Nicola A; Reid, J Leighton; Reis, Yana T; Ribeiro, Danilo B; Richardson, Barbara A; Richardson, Michael J; Robles, Carolina A; Römbke, Jörg; Romero-Duque, Luz Piedad; Rosselli, Loreta; Rossiter, Stephen J; Roulston, T'ai H; Rousseau, Laurent; 
Sadler, Jonathan P; Sáfián, Szabolcs; Saldaña-Vázquez, Romeo A; Samnegård, Ulrika; Schüepp, Christof; Schweiger, Oliver; Sedlock, Jodi L; Shahabuddin, Ghazala; Sheil, Douglas; Silva, Fernando A B; Slade, Eleanor M; Smith-Pardo, Allan H; Sodhi, Navjot S; Somarriba, Eduardo J; Sosa, Ramón A; Stout, Jane C; Struebig, Matthew J; Sung, Yik-Hei; Threlfall, Caragh G; Tonietto, Rebecca; Tóthmérész, Béla; Tscharntke, Teja; Turner, Edgar C; Tylianakis, Jason M; Vanbergen, Adam J; Vassilev, Kiril; Verboven, Hans A F; Vergara, Carlos H; Vergara, Pablo M; Verhulst, Jort; Walker, Tony R; Wang, Yanping; Watling, James I; Wells, Konstans; Williams, Christopher D; Willig, Michael R; Woinarski, John C Z; Wolf, Jan H D; Woodcock, Ben A; Yu, Douglas W; Zaitsev, Andrey S; Collen, Ben; Ewers, Rob M; Mace, Georgina M; Purves, Drew W; Scharlemann, Jörn P W; Purvis, Andy

    2014-12-01

    Biodiversity continues to decline in the face of increasing anthropogenic pressures such as habitat destruction, exploitation, pollution and introduction of alien species. Existing global databases of species' threat status or population time series are dominated by charismatic species. The collation of datasets with broad taxonomic and biogeographic extents, and that support computation of a range of biodiversity indicators, is necessary to enable better understanding of historical declines and to project - and avert - future declines. We describe and assess a new database of more than 1.6 million samples from 78 countries representing over 28,000 species, collated from existing spatial comparisons of local-scale biodiversity exposed to different intensities and types of anthropogenic pressures, from terrestrial sites around the world. The database contains measurements taken in 208 (of 814) ecoregions, 13 (of 14) biomes, 25 (of 35) biodiversity hotspots and 16 (of 17) megadiverse countries. The database contains more than 1% of the total number of all species described, and more than 1% of the described species within many taxonomic groups - including flowering plants, gymnosperms, birds, mammals, reptiles, amphibians, beetles, lepidopterans and hymenopterans. The dataset, which is still being added to, is therefore already considerably larger and more representative than those used by previous quantitative models of biodiversity trends and responses. The database is being assembled as part of the PREDICTS project (Projecting Responses of Ecological Diversity In Changing Terrestrial Systems - http://www.predicts.org.uk). We make site-level summary data available alongside this article. The full database will be publicly available in 2015.
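
    As a toy illustration of the kind of site-level biodiversity indicators such a database supports, the sketch below computes species richness and total abundance per site and compares them across land-use classes. Column names and values are invented for illustration, not the PREDICTS schema.

```python
import pandas as pd

# Toy site-level records in the spirit of local-scale spatial comparisons.
df = pd.DataFrame({
    "site":      ["A", "A", "B", "B", "B"],
    "land_use":  ["primary", "primary", "cropland", "cropland", "cropland"],
    "species":   ["sp1", "sp2", "sp1", "sp3", "sp4"],
    "abundance": [10, 4, 2, 7, 1],
})

# Two simple indicators: species richness and total abundance per site,
# grouped by land-use class for comparison across pressure types.
summary = (df.groupby(["land_use", "site"])
             .agg(richness=("species", "nunique"),
                  total_abundance=("abundance", "sum")))
print(summary)
```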

  18. SeaWiFS technical report series. Volume 20: The SeaWiFS bio-optical archive and storage system (SeaBASS), part 1

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Mcclain, Charles R.; Firestone, James K.; Westphal, Todd L.; Yeh, Eueng-Nan; Ge, Yuntao; Firestone, Elaine R.

    1994-01-01

    This document provides an overview of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Bio-Optical Archive and Storage System (SeaBASS), which will serve as a repository for numerous data sets of interest to the SeaWiFS Science Team and other approved investigators in the oceanographic community. The data collected will be those data sets suitable for the development and evaluation of bio-optical algorithms, which include results from SeaWiFS Intercalibration Round-Robin Experiments (SIRREXs), prelaunch characterization of the SeaWiFS instrument by its manufacturer, Hughes/Santa Barbara Research Center (SBRC), Marine Optical Characterization Experiment (MOCE) cruises, Marine Optical Buoy (MOBY) deployments and refurbishments, and field studies of other scientists outside of NASA. The primary goal of the data system is to provide a simple mechanism for querying the available archive and requesting specific items, while assuring that the data are made available only to authorized users. The design, construction, and maintenance of SeaBASS is the responsibility of the SeaWiFS Calibration and Validation Team (CVT). This report is concerned with documenting the execution of this task by the CVT and consists of a series of chapters detailing the various data sets involved. The topics presented are as follows: 1) overview of the SeaBASS file architecture, 2) the bio-optical data system, 3) the historical pigment database, 4) the SIRREX database, and 5) the SBRC database.
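
    A minimal sketch of the query-the-archive idea: filter a small index of archive entries by data type and start date. The index entries, field names, and query function are hypothetical illustrations, not the actual SeaBASS file architecture or interface.

```python
from datetime import date

# Hypothetical archive index; dataset names echo the abstract, fields invented.
index = [
    {"dataset": "SIRREX-2", "type": "calibration", "start": date(1993, 6, 1)},
    {"dataset": "MOCE-1",   "type": "bio-optical", "start": date(1992, 9, 1)},
    {"dataset": "pigments", "type": "historical",  "start": date(1978, 1, 1)},
]

def query(entries, data_type=None, after=None):
    """Return archive entries matching optional type and start-date filters."""
    hits = entries
    if data_type is not None:
        hits = [e for e in hits if e["type"] == data_type]
    if after is not None:
        hits = [e for e in hits if e["start"] >= after]
    return hits

print(query(index, data_type="bio-optical"))
```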

  19. Development of a Global Fire Weather Database

    NASA Technical Reports Server (NTRS)

    Field, R. D.; Spessa, A. C.; Aziz, N. A.; Camia, A.; Cantin, A.; Carr, R.; de Groot, W. J.; Dowdy, A. J.; Flannigan, M. D.; Manomaiphiboon, K.; hide

    2015-01-01

    The Canadian Forest Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations, beginning in 1980, called the Global Fire WEather Database (GFWED), gridded to a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA), along with two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code calculations from the gridded data sets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Agreement between gridded and station-based calculations tended to be weakest at low latitudes for strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA-based DC over the Mato Grosso in Brazil reached unrealistically high values exceeding 1,500 during the dry season, but was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation, and they reinforce the need to consider alternative sources of precipitation data. GFWED can be used for analyzing historical relationships between fire weather and fire activity at continental and global scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibration of FWI-based fire prediction models.
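
    For readers unfamiliar with the Drought Code (DC) mentioned above, the sketch below implements the standard daily DC update as given in Van Wagner (1987): a rain phase that recharges a deep moisture reservoir, followed by a drying phase driven by temperature and a monthly day-length factor. It is a didactic sketch; consult the original FWI System documentation before any operational use.

```python
import math

# Monthly day-length factors Lf for ~46N, per the standard FWI tables.
DAY_LENGTH_FACTOR = [-1.6, -1.6, -1.6, 0.9, 3.8, 5.8,
                     6.4, 5.0, 2.4, 0.4, -1.6, -1.6]

def drought_code(dc_prev, temp_c, rain_mm, month):
    """One daily Drought Code update (Van Wagner 1987 equations)."""
    dc = dc_prev
    if rain_mm > 2.8:                       # rain phase
        rw = 0.83 * rain_mm - 1.27          # effective rainfall
        q = 800.0 * math.exp(-dc / 400.0)   # moisture equivalent of DC
        q += 3.937 * rw
        dc = max(0.0, 400.0 * math.log(800.0 / q))
    pe = 0.36 * (temp_c + 2.8) + DAY_LENGTH_FACTOR[month - 1]
    return dc + 0.5 * max(0.0, pe)          # drying phase

# A hot, rain-free stretch drives DC steadily upward.
dc = 200.0
for _ in range(30):
    dc = drought_code(dc, temp_c=32.0, rain_mm=0.0, month=8)
print(round(dc, 1))
```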

  20. Military, Charter, Unreported Domestic Traffic and General Aviation 1976, 1984, 1992, and 2015 Emission Scenarios

    NASA Technical Reports Server (NTRS)

    Mortlock, Alan; VanAlstyne, Richard

    1998-01-01

    The report describes the development of databases estimating aircraft engine exhaust emissions for the years 1976 and 1984 from global operations of Military, Charter, historic Soviet and Chinese, Unreported Domestic traffic, and General Aviation (GA). These databases were developed under the National Aeronautics and Space Administration's (NASA) Advanced Subsonic Assessment (AST). McDonnell Douglas Corporation (MDC), now part of the Boeing Company, had previously estimated engine exhaust emissions databases for the baseline year of 1992 and a 2015 forecast year scenario. Since their original creation (Ward, 1994 and Metwally, 1995), revised technology algorithms have been developed. Additionally, GA databases have been created, and all past MDC emission inventories have been updated to reflect the new technology algorithms. Revised data (Baughcum, 1996 and Baughcum, 1997) for the scheduled inventories have been used in this report to provide a comparison of the total aviation emission forecasts from various components. Global results for two historic years (1976 and 1984), a baseline year (1992), and a forecast year (2015) are presented. Since engine emissions are directly related to fuel usage, an overview of individual aviation annual global fuel use for each inventory component is also given in this report.

  1. The Qatar National Historic Environment Record: a Platform for the Development of a Fully-Integrated Cultural Heritage Management Application

    NASA Astrophysics Data System (ADS)

    Cuttler, R. T. H.; Tonner, T. W. W.; Al-Naimi, F. A.; Dingwall, L. M.; Al-Hemaidi, N.

    2013-07-01

    The development of the Qatar National Historic Environment Record (QNHER) by the Qatar Museums Authority and the University of Birmingham in 2008 was based on a customised, bilingual Access database and ArcGIS. While both platforms are stable and well supported, neither was designed for the documentation and retrieval of cultural heritage data. As a result, it was decided to develop a custom application using Open Source code. The core module of this application is now complete and is orientated towards the storage and retrieval of geospatial heritage data for the curation of heritage assets. Based on MIDAS Heritage data standards and regionally relevant thesauri, it is a truly bilingual system. Significant attention has been paid to the user interface, which is user-friendly and intuitive. Based on a suite of web services and accessed through a web browser, the system makes full use of internet resources such as Google Maps and Bing Maps. The application avoids long-term vendor "tie-ins" and, as a fully integrated data management system, is now an important tool for both cultural resource managers and heritage researchers in Qatar.

  2. Using OPC and HL7 Standards to Incorporate an Industrial Big Data Historian in a Health IT Environment.

    PubMed

    Cruz, Márcio Freire; Cavalcante, Carlos Arthur Mattos Teixeira; Sá Barretto, Sérgio Torres

    2018-05-30

    Health Level Seven (HL7) is one of the standards most used to centralize data from different vital sign monitoring systems. This solution significantly limits the data available for historical analysis, because it typically uses databases that are not effective in storing large volumes of data. In industry, a specialized Big Data historian, known as a Process Information Management System (PIMS), solves this problem. This work proposes the same solution to overcome the restriction on storing vital sign data. The PIMS needs a compatible communication standard to allow storage, and the one most commonly used is OLE for Process Control (OPC). This paper presents an HL7-OPC server that permits communication between vital sign monitoring systems and a PIMS, thus allowing the storage of long historical series of vital signs. In addition, it reviews local and cloud-based big medical data research, followed by an analysis of the PIMS in a health IT environment. It then describes the architecture of the HL7 and OPC standards. Finally, it presents the HL7-OPC server and a sequence of tests that demonstrated its operation and performance.
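
    To make the bridging idea concrete, the sketch below extracts vital-sign observations from the OBX segments of a (simplified, non-conformant) HL7 v2 ORU message, producing (tag, value, unit) samples of the kind a historian would ingest. The message content is invented, and no OPC library is used; writing the samples to an actual OPC server is out of scope here.

```python
# Simplified HL7 v2 ORU message: segments separated by carriage returns,
# fields by '|'. The observation codes and values are illustrative.
msg = (
    "MSH|^~\\&|MONITOR|ICU|HISTORIAN|HOSP|20180530120000||ORU^R01|1|P|2.5\r"
    "OBX|1|NM|8867-4^Heart rate^LN||72|/min\r"
    "OBX|2|NM|2708-6^Oxygen saturation^LN||97|%\r"
)

def parse_obx(message):
    """Yield (code, value, unit) from the OBX segments of an HL7 v2 message."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields and fields[0] == "OBX":
            code = fields[3].split("^")[1]           # OBX-3: observation name
            yield code, float(fields[5]), fields[6]  # OBX-5 value, OBX-6 units

for tag, value, unit in parse_obx(msg):
    print(f"historian tag {tag!r}: {value} {unit}")
```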

  3. Conjunctive patches subspace learning with side information for collaborative image retrieval.

    PubMed

    Zhang, Lining; Wang, Lipo; Lin, Weisi

    2012-08-01

    Content-Based Image Retrieval (CBIR) has attracted substantial attention during the past few years for its potential practical applications to image management. A variety of Relevance Feedback (RF) schemes have been designed to bridge the semantic gap between the low-level visual features and the high-level semantic concepts for an image retrieval task. Various Collaborative Image Retrieval (CIR) schemes aim to utilize the user historical feedback log data, with similar and dissimilar pairwise constraints, to improve the performance of a CBIR system. However, existing subspace learning approaches with explicit label information cannot be applied to a CIR task, although subspace learning techniques play a key role in various computer vision tasks, e.g., face recognition and image classification. In this paper, we propose a novel subspace learning framework, Conjunctive Patches Subspace Learning (CPSL) with side information, for learning an effective semantic subspace by exploiting the user historical feedback log data for a CIR task. CPSL can effectively integrate the discriminative information of labeled log images, the geometrical information of labeled log images, and the weakly similar information of unlabeled images to learn a reliable subspace. We formally formulate this as a constrained optimization problem and then present a new subspace learning technique to exploit the user historical feedback log data. Extensive experiments on both synthetic data sets and a real-world image database demonstrate the effectiveness of the proposed scheme in improving the performance of a CBIR system by exploiting the user historical feedback log data.
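
    To give a feel for subspace learning from pairwise feedback-log constraints, the sketch below projects features so that dissimilar pairs spread out more than similar pairs, via a simple spectral relaxation over pair-scatter matrices. This is a generic toy, not the authors' CPSL formulation; all data and pair lists are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))          # 100 images, 16 low-level features
similar = [(0, 1), (2, 3), (4, 5)]      # pairs judged relevant together
dissimilar = [(0, 9), (1, 8), (3, 7)]   # pairs judged unrelated

def pair_scatter(X, pairs):
    """Scatter matrix of the difference vectors of the given pairs."""
    d = np.array([X[i] - X[j] for i, j in pairs])
    return d.T @ d

S_s = pair_scatter(X, similar)
S_d = pair_scatter(X, dissimilar)

# Top eigenvectors of (S_d - S_s): directions that separate dissimilar pairs
# more than similar ones; keep the top k as the learned subspace.
vals, vecs = np.linalg.eigh(S_d - S_s)
W = vecs[:, np.argsort(vals)[::-1][:4]]  # 16-D -> 4-D projection
Z = X @ W
print(Z.shape)  # (100, 4)
```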

  4. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. The paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
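
    A toy Monte Carlo sketch of the idea: apply a maturity-dependent growth distribution to a single-point baseline mass and read off percentiles. The lognormal parameters and phase labels are invented for illustration and are not CADRe-derived values.

```python
import numpy as np

rng = np.random.default_rng(1)
tbe_kg = 450.0  # hypothetical single-point baseline instrument mass

# Assumed lognormal growth spread: wider early in the lifecycle, narrower late.
growth_sigma_by_phase = {"concept": 0.25, "PDR": 0.15, "CDR": 0.07}

for phase, sigma in growth_sigma_by_phase.items():
    growth = rng.lognormal(mean=0.0, sigma=sigma, size=100_000)
    masses = tbe_kg * growth
    p50, p80 = np.percentile(masses, [50, 80])
    print(f"{phase:>7}: median {p50:6.1f} kg, 80th percentile {p80:6.1f} kg")
```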

  5. Global Earthquake and Volcanic Eruption Economic losses and costs from 1900-2014: 115 years of the CATDAT database - Trends, Normalisation and Visualisation

    NASA Astrophysics Data System (ADS)

    Daniell, James; Skapski, Jens-Udo; Vervaeck, Armand; Wenzel, Friedemann; Schaefer, Andreas

    2015-04-01

    Over the past 12 years, an in-depth database has been constructed for socio-economic losses from earthquakes and volcanoes. The effects of earthquakes and volcanic eruptions have been documented in many databases; however, many errors and incorrect details are often encountered. To combat this, the database was formed with socioeconomic checks of GDP, capital stock, population and other elements, as well as providing upper and lower bounds to each available event loss. The definition of economic losses within the CATDAT Damaging Earthquakes Database (Daniell et al., 2011a) as of v6.1 has been redefined to provide three options of natural disaster loss pricing, including reconstruction cost, replacement cost and actual loss, in order to better define the impact of historical disasters. Similarly for volcanoes as for earthquakes, a reassessment has been undertaken looking at the historical net and gross capital stock and GDP at the time of the event, including the depreciated stock, in order to calculate the actual loss. A normalisation has then been undertaken using updated population, GDP and capital stock. The difference between depreciated and gross capital can be removed from the historical loss estimates, which have all been calculated without taking depreciation of the building stock into account. The culmination of time series from 1900-2014 of net and gross capital stock, GDP, direct economic loss data, use of detailed studies of infrastructure age, and existing damage surveys has allowed the first estimate of this nature. The death tolls in earthquakes from 1900-2014 are presented in various forms, showing around 2.32 million deaths due to earthquakes (with a range of 2.18 to 2.63 million), around 59% due to masonry buildings and 28% from secondary effects. For the volcanic eruption database, a death toll of around 98,000 (with a range of around 83,000 to 107,000) is seen from 1900-2014. The application of VSL life costing to death and injury tolls from historic events is discussed. The CATDAT socioeconomic databases of parameters such as disaggregated population, GDP, capital stock, building typologies, food security and inter-country export interactions are used to create a current exposure view of the world. The potential for losses globally is discussed with a re-creation of each damaging event since 1900, with well in excess of 10 trillion USD in normalised losses being seen from the 115 years of events. Potential worst-case events for volcanoes and earthquakes around the globe are discussed in terms of their potential for damage and huge economic loss today, and over the next century using SSP projections adjusted on a country basis, including inter-country effects.
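
    A minimal sketch of the normalisation idea: scale a historical direct loss to present-day exposure using ratios of per-capita wealth (GDP or capital stock) and population. The numbers below are placeholders, not CATDAT values, and the formula is a common simple normalisation, not necessarily the exact CATDAT procedure.

```python
def normalise_loss(loss_then, gdp_then, gdp_now, pop_then, pop_now):
    """Normalise a historical loss to today's wealth and population."""
    per_capita_wealth_ratio = (gdp_now / pop_now) / (gdp_then / pop_then)
    return loss_then * per_capita_wealth_ratio * (pop_now / pop_then)

# e.g. a 1 billion USD loss in a year with half today's population and a
# quarter of today's per-capita GDP normalises to 8 billion USD today.
print(normalise_loss(1e9, gdp_then=0.125e12, gdp_now=1e12,
                     pop_then=5e6, pop_now=10e6))
```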

  6. The National Landslide Database and GIS for Great Britain: construction, development, data acquisition, application and communication

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Dashwood, Claire; Freeborough, Katy

    2014-05-01

    The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and crowd-sourcing information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP and data collected for the National Landslide Database is used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.

  7. Public Health's Approach to Systemic Racism: a Systematic Literature Review.

    PubMed

    Castle, Billie; Wendel, Monica; Kerr, Jelani; Brooms, Derrick; Rollins, Aaron

    2018-05-04

    Recently, public health has acknowledged racism as a social determinant of health. Much evidence exists on the impact of individual-level racism and discrimination, with little to no examination of racism from the standpoint of systems and structures. The purpose of this systematic literature review is to analyze the extent to which public health currently addresses systemic racism in the published literature. Utilizing the PRISMA guidelines, this review searches three widely used databases to examine the published literature covering the topic as well as implications for future research and practice. A total of 85 articles were included in the review analysis after meeting the study criteria. Across numerous articles, the terms racism and systemic racism are largely absent. A critical need exists for an examination of the historical impact of systemic racism on the social determinants of health and the health of marginalized populations.

  8. Impact of storms on coastlines: preparing for the future without forgetting the past? Examples from European coastlines using a Storm Impact Database

    NASA Astrophysics Data System (ADS)

    Ciavola, Paolo; Garnier, Emmanuel; Ferreira, Oscar; Spencer, Thomas; Armaroli, Clara

    2017-04-01

    Severe storms have historically affected many European coastlines, but the impact of each storm has been evaluated in different ways in different countries, often using local socio-economic impact criteria (e.g. loss of lives and damage to properties). Although the Xynthia (2010) storm, on the Atlantic coast of France, was the largest coastal disaster of the last 50 years, similar events have previously impacted Europe. The 1953 storm surge in the southern North Sea resulted in over 2000 deaths and extensive flooding, and was the catalyst for post-WWII improvements in flood defences and storm early warning systems. On a longer timescale, the very extreme storm of 1634 AD re-configured Wadden Sea coastlines and was accompanied by thousands of deaths. Establishing patterns of coastal risk and vulnerability is greatly helped by the use of historical sources, as these allow the development of more complete time series of storm events and their impacts. The work presented was supported by the EU RISC-KIT (Resilience-Increasing Strategies for Coasts - toolKIT) Project. RISC-KIT (http://www.risckit.eu/np4/home.html) is an EU FP7 Collaborative project that has developed methods, tools and management approaches to reduce risk and increase resilience to low-frequency, high-impact hydro-meteorological events in the coastal zone. These products will enhance forecasting, prediction and early warning capabilities, improve the assessment of long-term coastal risk and optimize the mix of prevention, mitigation and preparedness measures. We analyse historical large-scale events that occurred from the Middle Ages to the 1960s at the case study sites of the North Norfolk Coast (UK), the Charente-Maritime and Vendée coast (France), the Cinque Terre-Liguria (Italy), the Emilia-Romagna coast (Italy), and the Ria Formosa coast (Portugal). The work presented here uses a database of events built by the project, examining records for the last 300 years, including the characteristics of the storms as well as recorded losses. Finally, lessons learned will be presented on the interaction between DRR elements such as prevention, resilience, mitigation and preparedness. The project's database is publicly available (http://risckit.cloudapp.net/risckit/#/).

  9. Extending Glacier Monitoring into the Little Ice Age and Beyond

    NASA Astrophysics Data System (ADS)

    Nussbaumer, S. U.; Gärtner-Roer, I.; Zemp, M.; Zumbühl, H. J.; Masiokas, M. H.; Espizua, L. E.; Pitte, P.

    2011-12-01

    Glaciers are among the best natural proxies of climatic changes and, as such, a key variable within the international climate observing system. The worldwide monitoring of glacier distribution and fluctuations has been internationally coordinated for more than a century. Direct measurements of seasonal and annual glacier mass balance are available for the past six decades. Regular observations of glacier front variations have been carried out since the late 19th century. Information on glacier fluctuations before the onset of regular in situ measurements has to be reconstructed from moraines, historical evidence, and a wide range of dating methods. The majority of the corresponding data are not available to the scientific community, which challenges the reproducibility and direct comparison of results. Here, we present a first approach towards the standardization of reconstructed Holocene glacier front variations as well as the integration of the corresponding data series into the database of the World Glacier Monitoring Service (www.wgms.ch), within the framework of the Global Terrestrial Network for Glaciers (www.gtn-g.org). The concept for the integration of these reconstructed front variations into the relational glacier database of the WGMS was jointly elaborated and tested by experts of both fields (natural and historical sciences), based on reconstruction series of 15 glaciers in Europe (western/central Alps and southern Norway) and 9 in southern South America. The reconstructed front variation series extend the direct measurements of the 20th century by two centuries in Norway and by four in the Alps and in South America. The storage of the records within the international glacier databases guarantees the long-term availability of the data series and increases the visibility of the scientific research which - in historical glaciology - is often the work of a lifetime. The standardized collection of reconstructed glacier front variations from southern Norway, the western Alps and the southern Andes allows a direct comparison between different glaciers. It is a first step towards a worldwide compilation and free dissemination of Holocene glacier fluctuation series within the internationally coordinated glacier monitoring.

  10. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Tonsina area, Valdez Quadrangle, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 128 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Tonsina area in the Chugach Mountains, Valdez quadrangle, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado, through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published here as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  11. Cooperative organic mine avoidance path planning

    NASA Astrophysics Data System (ADS)

    McCubbin, Christopher B.; Piatko, Christine D.; Peterson, Adam V.; Donnald, Creighton R.; Cohen, David

    2005-06-01

    The JHU/APL Path Planning team has developed path planning techniques to look for paths that balance the utility and risk associated with different routes through a minefield. Extending previous years' efforts, we investigated real-world Naval mine avoidance requirements and developed a tactical decision aid (TDA) that satisfies those requirements. APL has developed new mine path planning techniques using graph-based and genetic algorithms, which quickly produce near-minimum-risk paths for complicated fitness functions incorporating risk, path length, ship kinematics, and naval doctrine. The TDA user interface, a Java Swing application that obtains data via CORBA interfaces to path planning databases, allows the operator to explore a fusion of historic and in situ minefield data, control the path planner, and display the planning results. To provide a context for the minefield data, the user interface also renders data from the Digital Nautical Chart database, a database created by the National Geospatial-Intelligence Agency containing charts of the world's ports and coastal regions. This TDA has been developed in conjunction with the COMID (Cooperative Organic Mine Defense) system. This paper presents a description of the algorithms, architecture, and application produced.
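
    As a minimal illustration of graph-based minimum-risk planning, the sketch below runs Dijkstra's algorithm over a gridded field where edge cost blends transit distance with a per-cell risk estimate. The risk values and weighting are invented; the actual APL planner's fitness functions (ship kinematics, doctrine) are far richer.

```python
import heapq

# Hypothetical per-cell risk field; higher values mark likely mine areas.
risk = [
    [0.0, 0.2, 0.9, 0.1],
    [0.1, 0.8, 0.9, 0.1],
    [0.0, 0.1, 0.2, 0.0],
]
ALPHA = 10.0  # relative weight of risk versus path length

def plan(start, goal):
    """Dijkstra over the grid; cost per step = 1 + ALPHA * cell risk."""
    rows, cols = len(risk), len(risk[0])
    dist, heap, parent = {start: 0.0}, [(0.0, start)], {}
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 1.0 + ALPHA * risk[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    parent[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = parent[node]
    return [start] + path[::-1]

print(plan((0, 0), (0, 3)))  # routes around the high-risk cells
```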

  12. Updated Palaeotsunami Database for Aotearoa/New Zealand

    NASA Astrophysics Data System (ADS)

    Gadsby, M. R.; Goff, J. R.; King, D. N.; Robbins, J.; Duesing, U.; Franz, T.; Borrero, J. C.; Watkins, A.

    2016-12-01

    The updated configuration, design, and implementation of a national palaeotsunami (prehistoric tsunami) database for Aotearoa/New Zealand (A/NZ) is near completion. This tool enables correlation of events along different stretches of the NZ coastline, provides information on the frequency and extent of local, regional and distant-source tsunamis, and delivers detailed information on the science and proxies used to identify the deposits. In A/NZ a plethora of data, scientific research and experience surrounds palaeotsunami deposits, but much of this information has been difficult to locate, has variable reporting standards, and lacked quality assurance. The original database was created by Professor James Goff while working at the National Institute of Water & Atmospheric Research in A/NZ, but has subsequently been updated during his tenure at the University of New South Wales. The updating and establishment of the national database was funded by the Ministry of Civil Defence and Emergency Management (MCDEM), led by Environment Canterbury Regional Council, and supported by all 16 regions of A/NZ's local government. Creation of a single database has consolidated a wide range of published and unpublished research contributions from many science providers on palaeotsunamis in A/NZ. The information is now easily accessible and quality assured and allows examination of the frequency, extent and correlation of events. This provides authoritative scientific support for coastal-marine planning and risk management. The database will complement the GNS New Zealand Historical Database, and contributes to a heightened public awareness of tsunami by being a "one-stop-shop" for information on past tsunami impacts. There is scope for this to become an international database, enabling the Pacific-wide correlation of large events, as well as identifying smaller regional ones. The Australian research community has already expressed an interest, and the database is also compatible with a similar one currently under development in Japan. Expressions of interest in collaborating with the A/NZ team to expand the database are invited from other Pacific nations.

  13. Detection and measurement of total ozone from stellar spectra: Paper 2. Historic data from 1935-1942

    NASA Astrophysics Data System (ADS)

    Griffin, R. E. M.

    2005-10-01

    Atmospheric ozone columns are derived from historic stellar spectra observed between 1935 and 1942 at Mount Wilson Observatory, California. Comparisons with contemporary measurements in the Arosa database show a generally close correspondence. The results of the analysis indicate that astronomy's archives hold considerable potential for investigating the natural levels of ozone and its variability during the decades prior to anthropogenic interference.

  14. The microcomputer scientific software series 9: user's guide to Geo-CLM: geostatistical interpolation of the historical climatic record in the Lake States.

    Treesearch

    Margaret R. Holdaway

    1994-01-01

    Describes Geo-CLM, a computer application (for Mac or DOS) whose primary aim is to perform multiple kriging runs to interpolate the historic climatic record at research plots in the Lake States. It is an exploration and analysis tool. Additional capabilities include climatic databases, a flexible test mode, cross validation, lat/long conversion, English/metric units,...
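
    For readers unfamiliar with kriging, the sketch below performs a toy ordinary-kriging interpolation of a climate value at one target point from four stations, under an assumed exponential covariance model. It is a generic textbook sketch, not the Geo-CLM implementation; station locations, values, and variogram parameters are invented.

```python
import numpy as np

stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([10.0, 12.0, 11.0, 14.0])      # e.g. mean July temp, C
target = np.array([0.4, 0.5])

def cov(h, sill=1.0, range_=1.5):
    """Assumed exponential covariance model."""
    return sill * np.exp(-h / range_)

# Ordinary kriging system: [[C, 1], [1^T, 0]] [w; mu] = [c0; 1],
# where the extra row/column enforces that the weights sum to one.
n = len(stations)
d = np.linalg.norm(stations[:, None, :] - stations[None, :, :], axis=2)
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(d)
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = cov(np.linalg.norm(stations - target, axis=1))
w = np.linalg.solve(A, b)[:n]                    # kriging weights

print(round(values @ w, 2))                      # interpolated estimate
```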

  15. User’s manual to update the National Wildlife Refuge System Water Quality Information System (WQIS)

    USGS Publications Warehouse

    Chojnacki, Kimberly A.; Vishy, Chad J.; Hinck, Jo Ellen; Finger, Susan E.; Higgins, Michael J.; Kilbride, Kevin

    2013-01-01

    National Wildlife Refuges may have impaired water quality resulting from historic and current land uses, upstream sources, and aerial pollutant deposition. National Wildlife Refuge staff have limited time available to identify and evaluate potential water quality issues. As a result, water quality–related issues may not be resolved until a problem has already arisen. The National Wildlife Refuge System Water Quality Information System (WQIS) is a relational database developed for use by U.S. Fish and Wildlife Service staff to identify existing water quality issues on refuges in the United States. The WQIS database relies on a geospatial overlay analysis of data layers for ownership, streams and water quality. The WQIS provides summary statistics of 303(d) impaired waters and total maximum daily loads for the National Wildlife Refuge System at the national, regional, and refuge level. The WQIS allows U.S. Fish and Wildlife Service staff to be proactive in addressing water quality issues by identifying and understanding the current extent and nature of 303(d) impaired waters and subsequent total maximum daily loads. Water quality data are updated bi-annually, making it necessary to refresh the WQIS to maintain up-to-date information. This manual outlines the steps necessary to update the data and reports in the WQIS.
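
    A sketch of the kind of overlay-summary query the WQIS provides: counts of 303(d)-listed impairments and associated TMDLs per refuge. The schema and rows below are hypothetical illustrations, not the actual WQIS design.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE refuge     (refuge_id INTEGER PRIMARY KEY, name TEXT, region INTEGER);
CREATE TABLE impairment (refuge_id INTEGER REFERENCES refuge(refuge_id),
                         pollutant TEXT, has_tmdl INTEGER);
""")
con.executemany("INSERT INTO refuge VALUES (?,?,?)",
                [(1, "Example NWR", 6), (2, "Sample NWR", 6)])
con.executemany("INSERT INTO impairment VALUES (?,?,?)",
                [(1, "mercury", 1), (1, "nutrients", 0), (2, "sediment", 0)])

# Refuge-level summary: number of 303(d) listings and how many have TMDLs.
for row in con.execute("""
    SELECT r.name, COUNT(*) AS listings, SUM(i.has_tmdl) AS with_tmdl
    FROM impairment i JOIN refuge r USING (refuge_id)
    GROUP BY r.name ORDER BY listings DESC"""):
    print(row)
```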

  16. Optimization of analytical laboratory work using computer networking and databasing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide for online historical results. This system has greatly reduced the work required to provide analysis results as well as improving the quality of the work performed.

  17. A web-based platform for virtual screening.

    PubMed

    Watson, Paul; Verdonk, Marcel; Hartshorn, Michael J

    2003-09-01

    A fully integrated, web-based virtual screening platform has been developed to allow rapid virtual screening of large numbers of compounds. ORACLE is used to store information at all stages of the process. The system includes ATLAS, a large database of historical compounds from high throughput screening (HTS) chemical suppliers, containing over 3.1 million unique compounds with their associated physicochemical properties (ClogP, MW, etc.). The database can be screened using a web-based interface to produce compound subsets for virtual screening or virtual library (VL) enumeration. In order to carry out the latter task within ORACLE, a reaction data cartridge has been developed. Virtual libraries can be enumerated rapidly using the web-based interface to the cartridge. The compound subsets can be seamlessly submitted for virtual screening experiments, and the results can be viewed via another web-based interface allowing ad hoc querying of the virtual screening data stored in ORACLE.
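
    A small sketch of producing a compound subset by filtering on stored physicochemical properties, the kind of screen the abstract describes; the thresholds, column names, and rows are illustrative, not the ATLAS schema.

```python
import pandas as pd

# Toy compound table with stored properties; values invented.
atlas = pd.DataFrame({
    "compound_id": ["C1", "C2", "C3"],
    "clogp":       [2.1, 6.3, 3.7],
    "mw":          [310.0, 580.0, 442.0],
})

# Select a drug-like subset for virtual screening (illustrative cutoffs).
subset = atlas.query("clogp <= 5 and mw <= 500")
print(subset["compound_id"].tolist())   # ['C1', 'C3']
```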

  18. Rhode Island Water Supply System Management Plan Database (WSSMP-Version 1.0)

    USGS Publications Warehouse

    Granato, Gregory E.

    2004-01-01

    In Rhode Island, the availability of water of sufficient quality and quantity to meet current and future environmental and economic needs is vital to life and the State's economy. Water suppliers, the Rhode Island Water Resources Board (RIWRB), and other State agencies responsible for water resources in Rhode Island need information about available resources, the water-supply infrastructure, and water use patterns. These decision makers need historical, current, and future water-resource information. In 1997, the State of Rhode Island formalized a system of Water Supply System Management Plans (WSSMPs) to characterize and document relevant water-supply information. All major water suppliers (those that obtain, transport, purchase, or sell more than 50 million gallons of water per year) are required to prepare, maintain, and carry out WSSMPs. An electronic database for this WSSMP information has been deemed necessary by the RIWRB for water suppliers and State agencies to consistently document, maintain, and interpret the information in these plans. Availability of WSSMP data in standard formats will allow water suppliers and State agencies to improve the understanding of water-supply systems and to plan for future needs or water-supply emergencies. In 2002, however, the Rhode Island General Assembly passed a law that classifies some of the WSSMP information as confidential to protect the water-supply infrastructure from potential terrorist threats. Therefore the WSSMP database was designed for an implementation method that will balance security concerns with the information needs of the RIWRB, suppliers, other State agencies, and the public. A WSSMP database was developed by the U.S. Geological Survey in cooperation with the RIWRB. The database was designed to catalog WSSMP information in a format that would accommodate synthesis of current and future information about Rhode Island's water-supply infrastructure. This report documents the design and implementation of the WSSMP database. All WSSMP information in the database is, ultimately, linked to the individual water suppliers and to a WSSMP 'cycle' (which is currently a 5-year planning cycle for compiling WSSMP information). The database file contains 172 tables - 47 data tables, 61 association tables, 61 domain tables, and 3 example import-link tables. This database is currently implemented in the Microsoft Access database software because it is widely used within and outside of government and is familiar to many existing and potential customers. Design documentation facilitates current use and potential modification for future use of the database. Information within the structure of the WSSMP database file (WSSMPv01.mdb), a data dictionary file (WSSMPDD1.pdf), a detailed database-design diagram (WSSMPPL1.pdf), and this database-design report (OFR2004-1231.pdf) documents the design of the database. This report includes a discussion of each WSSMP data structure with an accompanying database-design diagram. Appendix 1 of this report is an index of the diagrams in the report and on the plate; this index is organized by table name in alphabetical order. Each of these products is included in digital format on the enclosed CD-ROM to facilitate use or modification of the database.
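
    To make the data/association/domain table pattern concrete, the sketch below shows a domain table constraining vocabulary and an association table linking suppliers to plan cycles, queried with a join. All table and column names are illustrative only, not the documented WSSMP schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE supplier       (supplier_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE cycle          (cycle_id INTEGER PRIMARY KEY, years TEXT);
CREATE TABLE d_source_type  (code TEXT PRIMARY KEY);             -- domain
CREATE TABLE supplier_cycle (                                    -- association
    supplier_id INTEGER REFERENCES supplier(supplier_id),
    cycle_id    INTEGER REFERENCES cycle(cycle_id),
    source_type TEXT    REFERENCES d_source_type(code),
    mgd REAL);                                                   -- data value
""")
con.execute("INSERT INTO d_source_type VALUES ('surface'), ('ground')")
con.execute("INSERT INTO supplier VALUES (1, 'Example Water District')")
con.execute("INSERT INTO cycle VALUES (1, '2000-2004')")
con.execute("INSERT INTO supplier_cycle VALUES (1, 1, 'surface', 4.2)")

print(con.execute("""SELECT s.name, c.years, sc.source_type, sc.mgd
                     FROM supplier_cycle sc
                     JOIN supplier s USING (supplier_id)
                     JOIN cycle c USING (cycle_id)""").fetchall())
```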

  19. [Historical, social and cultural aspects of the deaf population].

    PubMed

    Duarte, Soraya Bianca Reis; Chaveiro, Neuma; Freitas, Adriana Ribeiro de; Barbosa, Maria Alves; Porto, Celmo Celeno; Fleck, Marcelo Pio de Almeida

    2013-10-01

    This work recovers, contextualizes and characterizes the social, historical and cultural aspects of the deaf community that uses the Brazilian Sign Language, focusing on the social and anthropological model. The scope of this study was to conduct a bibliographical review of scientific textbooks and articles available in the Virtual Health Library, irrespective of the date of publication. 102 articles and 53 books were located, of which 33 textbooks and 26 articles (four from the Lilacs database and 22 from the Medline database) constituted the sample. Today, in contrast with the past, there are laws that guarantee the right to communication and attendance by means of the Brazilian Sign Language. The repercussion, acceptance and inclusion in health policies of the decrees enshrined in Brazilian law remain a major priority.

  20. The U.S. Geological Survey Monthly Water Balance Model Futures Portal

    USGS Publications Warehouse

    Bock, Andrew R.; Hay, Lauren E.; Markstrom, Steven L.; Emmerich, Christopher; Talbert, Marian

    2017-05-03

    The U.S. Geological Survey Monthly Water Balance Model Futures Portal (https://my.usgs.gov/mows/) is a user-friendly interface that summarizes monthly historical and simulated future conditions for seven hydrologic and meteorological variables (actual evapotranspiration, potential evapotranspiration, precipitation, runoff, snow water equivalent, atmospheric temperature, and streamflow) at locations across the conterminous United States (CONUS). The estimates of these hydrologic and meteorological variables were derived using a Monthly Water Balance Model (MWBM), a modular system that simulates monthly estimates of components of the hydrologic cycle using monthly precipitation and atmospheric temperature inputs. Precipitation and atmospheric temperature from 222 climate datasets spanning historical conditions (1952 through 2005) and simulated future conditions (2020 through 2099) were summarized for hydrographic features and used to drive the MWBM for the CONUS. The MWBM input and output variables were organized into an open-access database. An Open Geospatial Consortium, Inc., Web Feature Service allows the querying and identification of hydrographic features across the CONUS. To connect the Web Feature Service to the open-access database, a user interface, the Monthly Water Balance Model Futures Portal, was developed to allow the dynamic generation of summary files and plots based on plot type, geographic location, specific climate datasets, period of record, MWBM variable, and other options. Both the plots and the data files are made available to the user for download.
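
    A toy monthly water-balance step in the spirit of a modular MWBM: precipitation fills soil storage, evapotranspiration is limited by available water, and the excess becomes runoff. The bucket capacity and monthly inputs are invented for illustration, not the USGS model's parameters or structure.

```python
def mwbm_step(storage, precip_mm, pet_mm, capacity_mm=150.0):
    """Advance soil storage one month; return (new_storage, aet, runoff)."""
    water = storage + precip_mm
    aet = min(pet_mm, water)             # actual ET limited by supply
    water -= aet
    runoff = max(0.0, water - capacity_mm)
    return water - runoff, aet, runoff

storage = 75.0
for month, (p, pet) in enumerate([(120, 30), (90, 60), (20, 110)], start=1):
    storage, aet, runoff = mwbm_step(storage, p, pet)
    print(f"month {month}: storage={storage:6.1f} aet={aet:5.1f} runoff={runoff:5.1f}")
```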

  1. An archival examination of environment and disease in eastern Africa in recent history

    NASA Astrophysics Data System (ADS)

    Larsen, L.

    2012-04-01

    In order to better understand present interactions between climate and infectious disease incidence it is important to examine the history of disease outbreaks and burdens, and their likely links with the environment. This paper will present research that is currently being undertaken on the identification and mapping of historic incidences of malaria, schistosomiasis and Rift Valley fever (RVF) in eastern Africa in relation to possible environmental, social, economic and political contributing factors. The research covers the past one hundred years or so and primarily draws on a range of archival documentary sources located in the region and the former imperial centres. The paper will discuss the methodologies employed in the building of a comprehensive historical database. The research is part of a larger EU FP7-funded project which aims to map, examine and anticipate the future risks of the three diseases in eastern Africa in response to environmental change. The paper will outline how the construction of such a historic database allows the contextualization of current climate-disease relationships and can thus contribute to discussions on the effects of changing climate on future disease trends.

  2. Biogeochemical-Argo: achievements, challenges for the future and potential synergies with other components of ocean observation systems

    NASA Astrophysics Data System (ADS)

    Claustre, Hervé; Johnson, Ken

    2017-04-01

    The recently launched Biogeochemical-Argo (BGC-Argo) program aims at developing a global network of biogeochemical sensors on Argo profiling floats for acquiring long-term, high-quality time series of oceanic properties. BGC-Argo is particularly well poised to address a number of challenges in ocean science (e.g. hypoxia, carbon uptake, ocean acidification, the biological carbon pump and phytoplankton communities), topics that are difficult, if not impossible, to address with our present observing assets. Presently six variables are considered core BGC-Argo variables (O2, NO3, pH, Chla, suspended particles and downwelling irradiance). Historically, BGC-Argo was initiated through small-scale "showcase" projects that progressively scaled up into regional case studies essentially addressing key biological-pump-related questions in specific regions (e.g. subtropical gyres, North Atlantic, Southern Ocean). Now BGC-Argo is transitioning towards a global and sustained observation system thanks to progressive international coordination of national contributions and to increasingly mature and efficient data management and distribution systems. In this presentation, we will highlight a variety of results derived from BGC-Argo observations, encompassing a wide range of topics related to ocean biogeochemistry. Challenges for the future and the long-term sustainability of the system will be addressed, in particular with respect to maintaining a high-quality and interoperable dataset over the long term. Part of this can be achieved through tight interaction with programs (e.g. GO-SHIP) and their historical databases, which should constitute a cornerstone for assessing data quality. The interplay between the BGC-Argo and GLODAPv2 databases will be highlighted as an example in this context. Furthermore, we will illustrate the potential synergies between synoptically measured surface satellite quantities and their vertically resolved (BGC-Argo) counterparts in the development of 3D biogeochemical products.

  3. Geospatial database for regional environmental assessment of central Colorado.

    USGS Publications Warehouse

    Church, Stan E.; San Juan, Carma A.; Fey, David L.; Schmidt, Travis S.; Klein, Terry L.; DeWitt, Ed H.; Wanty, Richard B.; Verplanck, Philip L.; Mitchell, Katharine A.; Adams, Monique G.; Choate, LaDonna M.; Todorov, Todor I.; Rockwell, Barnaby W.; McEachron, Luke; Anthony, Michael W.

    2012-01-01

    In conjunction with the future planning needs of the U.S. Department of Agriculture, Forest Service, the U.S. Geological Survey conducted a detailed environmental assessment of the effects of historical mining on Forest Service lands in central Colorado. Stream sediment, macroinvertebrate, and various filtered and unfiltered water quality samples were collected during low-flow conditions over a four-year period (2004–2007). This report summarizes the sampling strategy, data collection, and analyses performed on these samples. The data are presented in Geographic Information System, Microsoft Excel, and comma-delimited formats. Reports on data interpretation are being prepared separately.

  4. Digital mining claim density map for federal lands in Wyoming: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Wyoming as of March 1997. Mining claim data are earth science information deemed relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill sites, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database, with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  5. Digital mining claim density map for federal lands in Colorado: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Colorado as of March 1997. Mining claim data are earth science information deemed relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill sites, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database, with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  6. Digital mining claim density map for federal lands in Washington: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Washington as of March 1997. Mining claim data are earth science information deemed relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill sites, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database, with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  7. Open Access to Geophysical Data

    NASA Astrophysics Data System (ADS)

    Sergeyeva, Nataliya A.; Zabarinskaya, Ludmila P.

    2017-04-01

    The Russian World Data Centers (WDCs) for Solar-Terrestrial Physics and Solid Earth Physics, hosted by the Geophysical Center of the Russian Academy of Sciences, are Regular Members of the ICSU World Data System. Guided by the principles of the WDS Constitution and the WDS Data Sharing Principles, the WDCs provide full and open access to data, long-term data stewardship, compliance with agreed-upon data standards and conventions, and mechanisms to facilitate and improve access to data. Historical and current geophysical data on different media, in the form of digital datasets, analog records, map collections and descriptions, are stored and collected in the Centers. The WDCs regularly add new data to their repositories and databases and keep them up to date. The WDCs now focus on four new projects, aimed at: increasing the amount of data available online through retrospective data collection and digital preservation; creating a modern system for registering and publishing data with digital object identifier (DOI) assignment, and promoting a culture of data citation; creating databases in place of file systems for more convenient access to data; and participating in the WDS Metadata Catalogue and Data Portal by creating metadata for the WDCs' information resources.

  8. The State of the Art of the Zebrafish Model for Toxicology and Toxicologic Pathology Research—Advantages and Current Limitations

    PubMed Central

    Spitsbergen, Jan M.; Kent, Michael L.

    2007-01-01

    The zebrafish (Danio rerio) is now the pre-eminent vertebrate model system for clarification of the roles of specific genes and signaling pathways in development. The zebrafish genome will be completely sequenced within the next 1–2 years. Together with the substantial historical database regarding basic developmental biology, toxicology, and gene transfer, the rich foundation of molecular genetic and genomic data makes zebrafish a powerful model system for clarifying mechanisms in toxicity. In contrast to the highly advanced knowledge base on molecular developmental genetics in zebrafish, our database regarding infectious and noninfectious diseases and pathologic lesions in zebrafish lags far behind the information available on most other domestic mammalian and avian species, particularly rodents. Currently, minimal data are available regarding spontaneous neoplasm rates or spontaneous aging lesions in any of the commonly used wild-type or mutant lines of zebrafish. Therefore, to fully utilize the potential of zebrafish as an animal model for understanding human development, disease, and toxicology we must greatly advance our knowledge on zebrafish diseases and pathology. PMID:12597434

  9. VEMAP Phase 2 bioclimatic database. I. Gridded historical (20th century) climate for modeling ecosystem dynamics across the conterminous USA

    USGS Publications Warehouse

    Kittel, T.G.F.; Rosenbloom, N.A.; Royle, J. Andrew; Daly, Christopher; Gibson, W.P.; Fisher, H.H.; Thornton, P.; Yates, D.N.; Aulenbach, S.; Kaufman, C.; McKeown, R.; Bachelet, D.; Schimel, D.S.; Neilson, R.; Lenihan, J.; Drapek, R.; Ojima, D.S.; Parton, W.J.; Melillo, J.M.; Kicklighter, D.W.; Tian, H.; McGuire, A.D.; Sykes, M.T.; Smith, B.; Cowling, S.; Hickler, T.; Prentice, I.C.; Running, S.; Hibbard, K.A.; Post, W.M.; King, A.W.; Smith, T.; Rizzo, B.; Woodward, F.I.

    2004-01-01

    Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the Vegetation/Ecosystem Modeling and Analysis Project (VEMAP), a biogeochemical and dynamic vegetation model intercomparison. The dataset covers the period 1895-1993 on a 0.5° latitude/longitude grid. Climate is represented at both monthly and daily timesteps. Variables are: precipitation, minimum and maximum temperature, total incident solar radiation, daylight-period irradiance, vapor pressure, and daylight-period relative humidity. The dataset was derived from US Historical Climate Network (HCN), cooperative network, and snowpack telemetry (SNOTEL) monthly precipitation and mean minimum and maximum temperature station data. We employed techniques that rely on geostatistical and physical relationships to create the temporally and spatially complete dataset. We developed a local kriging prediction model to infill discontinuous and limited-length station records based on the spatial autocorrelation structure of climate anomalies. A spatial interpolation model (PRISM) that accounts for physiographic controls was used to grid the infilled monthly station data. We implemented a stochastic weather generator (modified WGEN) to disaggregate the gridded monthly series to dailies. Radiation and humidity variables were estimated from the dailies using a physically based empirical surface climate model (MTCLIM3). Derived datasets include a 100 yr model spin-up climate and a historical Palmer Drought Severity Index (PDSI) dataset. The VEMAP dataset exhibits statistically significant trends in temperature, precipitation, solar radiation, vapor pressure, and PDSI for US National Assessment regions. The historical climate and companion datasets are available online at data archive centers. © Inter-Research 2004.
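    The monthly-to-daily disaggregation step follows the WGEN family of stochastic weather generators. As a rough illustration of the idea only (not the modified WGEN actually used for VEMAP), the sketch below draws daily wet/dry occurrence from a two-state Markov chain and splits a monthly precipitation total across the wet days; the transition probabilities and gamma-shaped weights are assumptions chosen for illustration.

        # Illustrative sketch of Markov-chain disaggregation of a monthly
        # precipitation total to daily values (a WGEN-like idea, simplified).
        # The transition probabilities are hypothetical; a real generator
        # fits them per station or grid cell and month.
        import numpy as np

        rng = np.random.default_rng(42)

        def disaggregate_month(total_mm, n_days=30,
                               p_wet_given_dry=0.25, p_wet_given_wet=0.6):
            # 1. Simulate the wet/dry occurrence chain.
            wet = np.zeros(n_days, dtype=bool)
            state = False
            for d in range(n_days):
                p = p_wet_given_wet if state else p_wet_given_dry
                state = rng.random() < p
                wet[d] = state
            if not wet.any():
                wet[rng.integers(n_days)] = True  # force one wet day
            # 2. Split the monthly total over wet days with gamma weights.
            weights = rng.gamma(shape=0.75, scale=1.0, size=int(wet.sum()))
            daily = np.zeros(n_days)
            daily[wet] = total_mm * weights / weights.sum()
            return daily

        print(disaggregate_month(85.0).round(1))  # daily mm summing to 85.0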

  10. Development of an operational African Drought Monitor prototype

    NASA Astrophysics Data System (ADS)

    Chaney, N.; Sheffield, J.; Wood, E. F.; Lettenmaier, D. P.

    2011-12-01

    Droughts have severe economic, environmental, and social impacts. However, timely detection and monitoring can minimize these effects. Based on previous drought monitoring over the continental US, a drought monitor has been developed for Africa. Monitoring drought in data-sparse regions such as Africa is difficult due to a lack of historical or real-time observational data at high spatial and temporal resolution. As a result, a land surface model is used to estimate hydrologic variables, which serve as surrogate observations for monitoring drought. The drought monitoring system consists of two stages: the first is to create long-term historical background simulations against which current conditions can be compared; the second is the real-time estimation of current hydrological conditions, which results in an estimated drought index value. For the first stage, a hybrid meteorological forcing dataset was created that assimilates reanalysis and observational datasets from 1950 up to real time. Furthermore, the land surface model (currently the VIC land surface model) was recalibrated against spatially disaggregated runoff fields derived from over 500 GRDC stream gauge measurements over Africa. The final result is a retrospective database from 1950 to real time of soil moisture, evapotranspiration, river discharge at the GRDC gauged sites, and other variables, at 1/4-degree spatial resolution and daily temporal resolution. These observation-forced simulations are analyzed to detect and track historical drought events according to a drought index that is calculated from the soil moisture fields and river discharge relative to their seasonal climatology. The real-time monitoring requires the use of remotely sensed and weather-model analysis estimates of hydrological model forcings. For the current system, NOAA's Global Forecast System (GFS) is used along with remotely sensed precipitation from the NASA TMPA system. The historical archive of these data is evaluated against the dataset used to create the background simulations. Real-time adjustments are used to preserve consistency between the historical and real-time data. The drought monitor will be presented together with the web interface that has been developed for the scientific community to access and retrieve the data products. This system will be deployed for operational use at AGRHYMET in Niamey, Niger before the end of 2011.
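    A drought index of the kind described, ranking current conditions against their seasonal climatology, is commonly expressed as a percentile of the historical distribution for the same calendar period. The sketch below is a generic illustration of that calculation, not the monitor's actual code; the array sizes, values, and the 20th-percentile threshold are assumptions.

        # Generic sketch of a percentile-based drought index: rank today's
        # soil moisture against the 1950-onward climatology for the same
        # month. Shapes and thresholds are illustrative assumptions.
        import numpy as np

        def drought_percentile(historical, current_value):
            """historical: 1-D soil moisture values for one grid cell and
            one calendar month across climatology years; current: scalar."""
            rank = np.sum(historical <= current_value)
            return 100.0 * rank / (historical.size + 1)

        rng = np.random.default_rng(0)
        july_climatology = rng.normal(loc=0.30, scale=0.05, size=56)
        index = drought_percentile(july_climatology, current_value=0.21)
        print(f"soil moisture percentile: {index:.1f}")
        if index < 20.0:   # example drought threshold (assumption)
            print("cell flagged as drought")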

  11. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Brissebrat, Guillaume; Fleury, Laurence; Boichard, Jean-Luc; Cloché, Sophie; Eymard, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim; Asencio, Nicole; Favot, Florence; Roussot, Odile

    2013-04-01

    The AMMA information system aims at expediting the communication of data and scientific results inside the AMMA community and beyond. It has already been adopted as the data management system by several projects and is meant to become a reference information system about the West Africa area for the whole scientific community. The AMMA database and the associated online tools have been developed and are managed by two French teams (IPSL Database Centre, Palaiseau and OMP Data Service, Toulouse). The complete system has been fully duplicated and is operated by the AGRHYMET Regional Centre in Niamey, Niger. The AMMA database contains a wide variety of datasets: - about 250 local observation datasets, covering geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health...). They come from either operational networks or scientific experiments, and include historical data in West Africa from 1850; - 1350 outputs of a socio-economic questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Database users can access all the data using either the portal http://database.amma-international.org or http://amma.agrhymet.ne/amma-data. Different modules are available. The complete catalogue enables users to access metadata (i.e. information about the datasets) that are compliant with international standards (ISO19115, INSPIRE...). Registration pages enable users to read and sign the data and publication policy, and to apply for a database user account. The data access interface enables users to easily build a data extraction request by selecting various criteria such as location, time, and parameters. At present, the AMMA database counts more than 740 registered users and processes about 80 data requests every month. In order to monitor day-to-day meteorological and environmental information over West Africa, some quick-look and report display websites have been developed. They met the operational needs of the observational teams during the AMMA 2006 (http://aoc.amma-international.org) and FENNEC 2011 (http://fenoc.sedoo.fr) campaigns, but they also enable scientific teams to share physical indices along the monsoon season (http://misva.sedoo.fr from 2011). A collaborative WIKINDX tool has been set online in order to manage scientific publications and communications of interest to AMMA (http://biblio.amma-international.org). The bibliographic database now counts about 1200 references and is the most exhaustive document collection about the African monsoon available to all. Every scientist is invited to make use of the different AMMA online tools and data. Scientists or project leaders who have data management needs for existing or future datasets over West Africa are welcome to use the AMMA database framework and to contact ammaAdmin@sedoo.fr.

  12. Historical earthquakes studies in Eastern Siberia: State-of-the-art and plans for future

    NASA Astrophysics Data System (ADS)

    Radziminovich, Ya. B.; Shchetnikov, A. A.

    2013-01-01

    Many problems in investigating the historical seismicity of East Siberia remain unsolved. These problems relate in particular to the quality and reliability of data sources, the completeness of parametric earthquake catalogues, and the precision and transparency of estimates of the main parameters of historical earthquakes. The main purpose of this paper is to review the current status of studies of historical seismicity in Eastern Siberia, together with an analysis of existing macroseismic and parametric earthquake catalogues. We also attempt to identify the main shortcomings of existing catalogues and to clarify the reasons for their appearance in light of the history of seismic observations in Eastern Siberia. Contentious issues in the earthquake catalogues are considered using the example of three strong historical earthquakes that are important for assessing seismic hazard in the region. In particular, it was found that, due to a technical error, the parameters of the large M = 7.7 earthquake of 1742 were transferred from the regional catalogue to the worldwide database with incorrect epicenter coordinates. The way stereotypes concerning active tectonics influence the localization of an epicenter is shown by the example of the strong M = 6.4 earthquake of 1814. The effect of insufficient use of primary data sources on the completeness of earthquake catalogues is illustrated by the example of the strong M = 7.0 event of 1859. This analysis of the state of the art of historical earthquake studies in Eastern Siberia leads us to propose the following activities for the near future: (1) compilation of a database including initial descriptions of macroseismic effects with reference to their place and time of occurrence; (2) parameterization of the maximum possible (magnitude-unlimited) number of historical earthquakes on the basis of all available data; (3) compilation of an improved version of the parametric historical earthquake catalogue for East Siberia with detailed consideration of each event and distinct logic schemes for data interpretation. We therefore conclude that a large-scale revision of the historical earthquake catalogues for the study area is necessary.

  13. The plant phenological online database (PPODB): an online database for long-term phenological data.

    PubMed

    Dierenbach, Jonas; Badeck, Franz-W; Schaber, Jörg

    2013-09-01

    We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) a flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951 reaching back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de.

  14. Quantifying Data Quality for Clinical Trials Using Electronic Data Capture

    PubMed Central

    Nahm, Meredith L.; Pieper, Carl F.; Cunningham, Maureen M.

    2008-01-01

    Background Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. Methods and Principal Findings The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. Conclusions Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks. PMID:18725958
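    The underlying metric is simple arithmetic: discrepancies found divided by fields audited, scaled to errors per 10,000 fields, with a binomial confidence interval around the estimate. The sketch below reproduces that calculation for illustrative counts; the tallies are hypothetical, not the trial network's data.

        # Sketch: errors per 10,000 fields with a normal-approximation
        # binomial CI. The counts below are illustrative audit tallies,
        # not data from the NIDA CTN trials.
        import math

        errors, fields = 43, 30000          # hypothetical audit tallies
        p = errors / fields
        rate_per_10k = 10000 * p

        se = math.sqrt(p * (1 - p) / fields)
        lo, hi = 10000 * (p - 1.96 * se), 10000 * (p + 1.96 * se)
        print(f"{rate_per_10k:.1f} errors per 10,000 fields "
              f"(95% CI {lo:.1f}-{hi:.1f})")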

  15. La Aplicacion de las Bases de Datos al Estudio Historico del Espanol (The Application of Databases to the Historical Study of Spanish).

    ERIC Educational Resources Information Center

    Nadal, Gloria Claveria; Lancis, Carlos Sanchez

    1997-01-01

    Notes that the application of databases to the study of the history of a language is a method that allows for substantial improvement in investigative quality. Illustrates this with the example of the application of this method to two studies of the history of Spanish developed in the Language and Information Seminar of the Independent University…

  16. Stochastic Model for the Vocabulary Growth in Natural Languages

    NASA Astrophysics Data System (ADS)

    Gerlach, Martin; Altmann, Eduardo G.

    2013-04-01

    We propose a stochastic model for the number of different words in a given database which incorporates the dependence on the database size and historical changes. The main feature of our model is the existence of two different classes of words: (i) a finite number of core words, which have higher frequency and do not affect the probability of a new word to be used, and (ii) the remaining virtually infinite number of noncore words, which have lower frequency and, once used, reduce the probability of a new word to be used in the future. Our model relies on a careful analysis of the Google Ngram database of books published in the last centuries, and its main consequence is the generalization of Zipf’s and Heaps’ law to two scaling regimes. We confirm that these generalizations yield the best simple description of the data among generic descriptive models and that the two free parameters depend only on the language but not on the database. From the point of view of our model, the main change on historical time scales is the composition of the specific words included in the finite list of core words, which we observe to decay exponentially in time with a rate of approximately 30 words per year for English.
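    To make the two-class mechanism concrete, the following sketch simulates a toy version of such a model: core words are treated as a fixed lexicon, while each newly used noncore word lowers the probability that the next token introduces another new word, producing Heaps-like sublinear vocabulary growth. The core size and decay exponent are illustrative assumptions, not the fitted parameters of the paper.

        # Toy simulation of a two-class vocabulary model: a fixed core
        # lexicon plus an open noncore lexicon whose growth slows as it
        # is used. Parameter values are illustrative assumptions.
        import random

        random.seed(1)

        N_CORE = 1000        # finite core vocabulary (assumed saturated)
        GAMMA = 0.6          # decay exponent for the new-word probability

        def simulate(n_tokens):
            noncore_used = 0
            vocab_size = []
            for _ in range(n_tokens):
                # New-word probability falls as the noncore lexicon grows.
                p_new = 1.0 / (1.0 + noncore_used) ** GAMMA
                if random.random() < p_new:
                    noncore_used += 1
                vocab_size.append(N_CORE + noncore_used)
            return vocab_size

        growth = simulate(100_000)
        for t in (100, 1_000, 10_000, 100_000):
            print(t, growth[t - 1])   # vocabulary size vs. database size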

  17. 2006 Compilation of Alaska Gravity Data and Historical Reports

    USGS Publications Warehouse

    Saltus, Richard W.; Brown, Philip J.; Morin, Robert L.; Hill, Patricia L.

    2008-01-01

    Gravity anomalies provide fundamental geophysical information about Earth structure and dynamics. To increase geologic and geodynamic understanding of Alaska, the U.S. Geological Survey (USGS) has collected and processed Alaska gravity data for the past 50 years. This report introduces and describes an integrated, State-wide gravity database and provides accompanying gravity calculation tools to assist in its application. Additional information includes gravity base station descriptions and digital scans of historical USGS reports. The gravity calculation tools enable the user to reduce new gravity data in a consistent manner for combination with the existing database. This database has sufficient resolution to define the regional gravity anomalies of Alaska. Interpretation of regional gravity anomalies in parts of the State are hampered by the lack of local isostatic compensation in both southern and northern Alaska. However, when filtered appropriately, the Alaska gravity data show regional features having geologic significance. These features include gravity lows caused by low-density rocks of Cenozoic basins, flysch belts, and felsic intrusions, as well as many gravity highs associated with high-density mafic and ultramafic complexes.

  18. Introducing the Global Fire WEather Database (GFWED)

    NASA Astrophysics Data System (ADS)

    Field, R. D.

    2015-12-01

    The Canadian Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations, beginning in 1980, called the Global Fire WEather Database (GFWED), gridded to a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA), along with two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code (DC) calculations from the gridded datasets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Agreement between the gridded and station-based calculations tended to be weakest at low latitudes for strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA-based DC over the Mato Grosso in Brazil reached unrealistically high values, exceeding DC = 1500 during the dry season, whereas it was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation and reinforce the need to consider alternative sources of precipitation data. GFWED is being used by researchers around the world for analyzing historical relationships between fire weather and fire activity at large scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibrating FWI-based fire prediction models. These applications will be discussed. More information on GFWED can be found at http://data.giss.nasa.gov/impacts/gfwed/
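    For reference, the Drought Code discussed above is updated daily from noon temperature and 24-hour rainfall. The sketch below follows the standard published Van Wagner (1987) formulation of the FWI System's DC, with the usual Northern Hemisphere monthly day-length factors; the starting value and sample weather are made up.

        # Daily Drought Code (DC) update following the standard Van Wagner
        # (1987) formulation of the FWI System; day-length factors are the
        # usual monthly Northern Hemisphere values.
        import math

        DAY_LENGTH_FACTOR = [-1.6, -1.6, -1.6, 0.9, 3.8, 5.8,
                             6.4, 5.0, 2.4, 0.4, -1.6, -1.6]  # Jan..Dec

        def update_dc(dc_prev, temp_c, rain_mm, month):
            dc = dc_prev
            if rain_mm > 2.8:                      # rain phase
                rw = 0.83 * rain_mm - 1.27         # effective rainfall
                q = 800.0 * math.exp(-dc / 400.0)  # moisture equivalent
                qr = q + 3.937 * rw
                dc = max(400.0 * math.log(800.0 / qr), 0.0)
            # drying phase: potential evapotranspiration
            v = max(0.36 * (max(temp_c, -2.8) + 2.8)
                    + DAY_LENGTH_FACTOR[month - 1], 0.0)
            return dc + 0.5 * v

        dc = 15.0                                  # assumed starting value
        for t, r in [(24.0, 0.0), (27.0, 5.0), (30.0, 0.0)]:
            dc = update_dc(dc, t, r, month=7)
        print(f"DC after three July days: {dc:.1f}")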

  19. The SSABLE system - Automated archive, catalog, browse and distribution of satellite data in near-real time

    NASA Technical Reports Server (NTRS)

    Simpson, James J.; Harkins, Daniel N.

    1993-01-01

    Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archive, browse, ordering, and distribution of satellite data based upon the X Window System, high-bandwidth networks, and digital image rendering techniques. SSABLE automatically constructs relational database queries against archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near-real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine that satisfies the following three requirements: 1) it has a bitmapped display (monochrome or better); 2) it is running the X Window System; and 3) it is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia are typically 20-25 s.
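    The automatically constructed queries amount to standard relational selections on time and bounding-box predicates. The sketch below illustrates the pattern with a hypothetical scene table in SQLite; SSABLE's actual schema and database system are not described in the abstract.

        # Sketch of the kind of relational query SSABLE constructs:
        # select archived scenes by time window and geographic bounding
        # box. The table and column names are hypothetical.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE scenes (
            scene_id TEXT, acquired_utc TEXT,
            lat_min REAL, lat_max REAL, lon_min REAL, lon_max REAL)""")
        conn.execute("INSERT INTO scenes VALUES ('A1', '1992-06-01T20:14', "
                     "30.0, 35.0, -125.0, -118.0)")

        query = """
        SELECT scene_id, acquired_utc FROM scenes
        WHERE acquired_utc BETWEEN ? AND ?
          AND lat_max >= ? AND lat_min <= ?     -- bbox overlap test
          AND lon_max >= ? AND lon_min <= ?
        """
        rows = conn.execute(query, ("1992-01-01", "1992-12-31",
                                    32.0, 34.0, -122.0, -120.0)).fetchall()
        print(rows)   # matching scenes ready for browse-image rendering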

  20. Costs of landslides and floods in XX Century in a Calabrian town starting from the data stored in the Historical Archive of IRPI (Cosenza)

    NASA Astrophysics Data System (ADS)

    Giampa', Vincenzo; Pasqua, A. Aurora; Petrucci, Olga

    2015-04-01

    The paper first presents the historical archive of the Cosenza IRPI Section and the historical database that has been built from the data contained in it. An application of these data to Catanzaro, the town that is the administrative center of the Calabria region (Southern Italy), is then presented. The gathering of historical data on past floods and landslides at the Cosenza IRPI Section began in 1996 and is still in progress. In 2005, donations from regional and municipal Public Works offices greatly increased the documental corpus and required more incisive classification and management, which led us to organize the documents into a true historical archive. Documents were sorted according to the municipalities they concern. In this way, a set of documents, maps and images is available for each of the 409 municipalities of Calabria. The collected documents mainly concern damage caused by the occurrence, since the nineteenth century, of phenomena such as floods, flash floods and landslides triggered by extreme meteorological events, as well as damage caused by strong earthquakes. At the beginning of 2014, the central office of IRPI (Perugia) funded a project aimed at the digitization of the archive and its subsequent publication on a web platform. In this paper, the procedure adopted to build the archive and implement the database is described. The elaboration of the historical series of data on the town of Catanzaro, which has been frequently damaged by rainfall-induced landslides and floods, is then presented. Based on the documents coming from the archive of the Ministry of Public Works and stored in our Historical Archive, an assessment of the costs of damage to the houses of this town during the twentieth century has been performed. The research identified the types of most damaging phenomena, the municipal sectors most frequently damaged, and the evolution of damaged areas throughout the years in line with increasing urbanization.

  1. Current Development at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2005-12-01

    Over the past year, the SCEDC completed or neared completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface to complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply the information required to generate dataless SEED and COSMOS V0 volumes and will allow stations to be added to the system with a minimal but incomplete set of information, using predefined defaults that can easily be updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering moment magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real time, as well as post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0 and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download, and then retrieve a zipped file containing the results. Scanned images of paper records for M > 3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

  2. DCS Data Viewer, an Application that Accesses ATLAS DCS Historical Data

    NASA Astrophysics Data System (ADS)

    Tsarouchas, C.; Schlenker, S.; Dimitrov, G.; Jahn, G.

    2014-06-01

    The ATLAS experiment at CERN is one of the four Large Hadron Collider experiments. The Detector Control System (DCS) of ATLAS is responsible for the supervision of the detector equipment, the reading of operational parameters, the propagation of alarms and the archiving of important operational data in a relational database (DB). DCS Data Viewer (DDV) is an application that provides access to the ATLAS DCS historical data through a web interface. Its design is structured using a client-server architecture. The Python server connects to the DB and fetches the data using optimized SQL requests. It communicates with the outside world by accepting HTTP requests, and it can be used standalone. The client is an AJAX (Asynchronous JavaScript and XML) interactive web application developed under the Google Web Toolkit (GWT) framework. Its web interface is user-friendly, platform and browser independent. The selection of metadata is done via a column-tree view or with a powerful search engine. The final visualization of the data is done using Java applets or JavaScript applications as plugins. The default output is a value-over-time chart, but other types of outputs, such as tables, ASCII or ROOT files, are supported too. Excessive access or malicious use of the database is prevented by a dedicated protection mechanism, allowing the tool to be exposed to hundreds of inexperienced users. The current configuration of the client and of the outputs can be saved in an XML file. Protection against web security attacks is foreseen and authentication constraints have been taken into account, allowing the tool to be exposed to hundreds of users worldwide. Due to its flexible interface and its generic and modular approach, DDV could easily be used for other experiment control systems.

  3. L5 TM radiometric recalibration procedure using the internal calibration trends from the NLAPS trending database

    USGS Publications Warehouse

    Chander, G.; Haque, Md. O.; Micijevic, E.; Barsi, J.A.

    2008-01-01

    From the Landsat program's inception in 1972 to the present, the earth science user community has benefited from a historical record of remotely sensed data. The multispectral data from the Landsat 5 (L5) Thematic Mapper (TM) sensor provide the backbone for this extensive archive. Historically, the radiometric calibration procedure for this imagery used the instrument's response to the Internal Calibrator (IC) on a scene-by-scene basis to determine the gain and offset for each detector. The IC system degraded with time, causing radiometric calibration errors of up to 20 percent. In May 2003, the National Landsat Archive Production System (NLAPS) was updated to use a gain model, rather than the scene-acquisition-specific IC gains, to calibrate TM data processed in the United States. Further modification of the gain model was performed in 2007. L5 TM data that were processed using the IC prior to the calibration update do not benefit from the recent calibration revisions. A procedure has been developed to give users the ability to recalibrate their existing Level-1 products. The best recalibration results are obtained if the work order report that was originally included in the standard data product delivery is available. However, many users may not have the original work order report. In such cases, the IC gain look-up table that was generated from the radiometric gain trends recorded in the NLAPS database can be used for recalibration. This paper discusses the procedure for recalibrating L5 TM data when the work order report originally used in processing is not available. A companion paper discusses the generation of the NLAPS IC gain and bias look-up tables required to perform the recalibration.
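    Recalibrating an existing Level-1 product is, at its core, a linear substitution: undo the old gain/bias and apply the revised ones. The sketch below shows that generic arithmetic; the coefficient values are illustrative placeholders, not entries from the NLAPS look-up tables.

        # Generic linear recalibration sketch: convert Level-1 DN back to
        # radiance with the old (IC-derived) gain/bias, then forward with
        # revised gain-model coefficients. Coefficients are illustrative.
        import numpy as np

        def recalibrate(dn, gain_old, bias_old, gain_new, bias_new):
            radiance = (dn - bias_old) / gain_old  # undo old calibration
            return radiance * gain_new + bias_new  # apply revised one

        dn_band1 = np.array([52, 87, 140], dtype=float)
        dn_fixed = recalibrate(dn_band1,
                               gain_old=1.18, bias_old=2.5,  # hypothetical
                               gain_new=1.25, bias_new=2.3)  # hypothetical
        print(dn_fixed.round(2))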

  4. Historic distribution of Common Loons in Wisconsin in relation to changes in lake characteristics and surrounding land use

    USGS Publications Warehouse

    Kenow, Kevin P.; Garrison, Paul J.; Fox, Timothy J.; Meyer, Michael W.

    2013-01-01

    A study was conducted to evaluate changes in water quality and surrounding land use for lakes that are south of the current breeding range of Common Loons in Wisconsin but that historically supported breeding loons. Museum collection records and published accounts were examined to identify lakes in southern Wisconsin with a documented history of loon nesting activity. Historical and recent water quality data were obtained from state and USEPA databases for the former loon nesting lakes that were identified, and paleolimnological data were acquired for these lakes from sediment cores, which were used to infer historical total phosphorus concentrations from diatom assemblages. U.S. General Land Office notes and maps from the original land survey conducted in Wisconsin during 1832-1866 and the National Land Cover Database 2006 were used to assess land use changes that occurred within the drainage basins of former loon nesting lakes. Our results indicate that the landscape of southern Wisconsin has changed dramatically since Common Loons last nested in the region. A number of factors have likely contributed to the decreased appeal of southern Wisconsin lakes to breeding Common Loons, including changes to water quality, altered trophic status resulting from nutrient enrichment, and reductions in suitable nesting habitat stemming from shoreline development and altered water levels. Increased nutrient and sediment inputs from agricultural and developed areas likely contributed to a reduction in habitat quality.

  5. A scale self-adapting segmentation approach and knowledge transfer for automatically updating land use/cover change databases using high spatial resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo

    2018-07-01

    Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially for coastal areas, which connect the land and coast and tend to change frequently. Many object-based change detection methods have been proposed, especially methods combining historical LUCC with HSRI. However, the scale parameter(s) used to segment the serial temporal images, which directly determine the average object size, are hard to choose without expert intervention. The samples transferred from historical LUCC likewise need expert intervention to avoid insufficient or wrong samples. With respect to choosing the scale parameter(s), a Scale Self-Adapting Segmentation (SSAS) approach, based on exponential sampling of a scale parameter and location of the local maximum of a weighted local variance, was proposed to solve the scale selection problem when segmenting images constrained by LUCC for detecting changes. With respect to transferring samples, Knowledge Transfer (KT), in which a classifier trained on historical images with LUCC is applied to the classification of updated images, was also proposed. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images effectively without expert intervention, and (2) KT can likewise reach the maximum sample-transfer accuracy without expert intervention. The strategy SSAS + KT would be a good choice when the temporal historical image and LUCC match and the historical and updated images are obtained from the same source.
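    As a schematic of the scale-search idea (exponential sampling of the scale parameter, then locating a local maximum of an area-weighted local variance), the sketch below uses scikit-image's Felzenszwalb segmentation as a stand-in for the authors' segmenter; all parameter values are illustrative assumptions.

        # Schematic of the SSAS scale search: sample the segmentation
        # scale exponentially, compute the area-weighted local variance
        # of the resulting objects, and report local maxima. Felzenszwalb
        # stands in for the authors' segmenter; values are illustrative.
        import numpy as np
        from skimage.color import rgb2gray
        from skimage.data import astronaut
        from skimage.segmentation import felzenszwalb

        image = rgb2gray(astronaut()[::4, ::4])   # small test image

        def weighted_local_variance(img, labels):
            flat_img, flat_lab = img.ravel(), labels.ravel()
            counts = np.bincount(flat_lab).astype(float)
            means = np.bincount(flat_lab, weights=flat_img) / counts
            sq = np.bincount(flat_lab, weights=flat_img ** 2) / counts
            var = sq - means ** 2                  # per-object variance
            return float(np.sum(counts / flat_img.size * var))

        scales = [2.0 ** k for k in range(2, 9)]   # exponential sampling
        wlv = [weighted_local_variance(image, felzenszwalb(image, scale=s))
               for s in scales]
        print(list(zip(scales, [round(v, 5) for v in wlv])))

        # a local maximum of the curve marks a candidate segmentation scale
        for i in range(1, len(wlv) - 1):
            if wlv[i] > wlv[i - 1] and wlv[i] > wlv[i + 1]:
                print("candidate scale:", scales[i])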

  6. Development and testing of a database of NIH research funding of AAPM members: A report from the AAPM Working Group for the Development of a Research Database (WGDRD).

    PubMed

    Whelan, Brendan; Moros, Eduardo G; Fahrig, Rebecca; Deye, James; Yi, Thomas; Woodward, Michael; Keall, Paul; Siewerdsen, Jeff H

    2017-04-01

    To produce and maintain a database of National Institutes of Health (NIH) funding of the American Association of Physicists in Medicine (AAPM) members, to perform a top-level analysis of these data, and to make these data (hereafter referred to as the AAPM research database) available for the use of the AAPM and its members. NIH-funded research dating back to 1985 is available for public download through the NIH exporter website, and AAPM membership information dating back to 2002 was supplied by the AAPM. To link these two sources of data, a data mining algorithm was developed in Matlab. The false-positive rate was manually estimated based on a random sample of 100 records, and the false-negative rate was assessed by comparing against 99 member-supplied PI_ID numbers. The AAPM research database was queried to produce an analysis of trends and demographics in research funding dating from 2002 to 2015. A total of 566 PI_ID numbers were matched to AAPM members. False-positive and -negative rates were respectively 4% (95% CI: 1-10%, N = 100) and 10% (95% CI: 5-18%, N = 99). Based on analysis of the AAPM research database, in 2015 the NIH awarded $USD 110M to members of the AAPM. The four NIH institutes which historically awarded the most funding to AAPM members were the National Cancer Institute, National Institute of Biomedical Imaging and Bioengineering, National Heart Lung and Blood Institute, and National Institute of Neurological Disorders and Stroke. In 2015, over 85% of the total NIH research funding awarded to AAPM members was via these institutes, representing 1.1% of their combined budget. In the same year, 2.0% of AAPM members received NIH funding for a total of $116M, which is lower than the historic mean of $120M (in 2015 USD). A database of NIH-funded research awarded to AAPM members has been developed and tested using a data mining approach, and a top-level analysis of funding trends has been performed. Current funding of AAPM members is lower than the historic mean. The database will be maintained by members of the Working group for the development of a research database (WGDRD) on an annual basis, and is available to the AAPM, its committees, working groups, and members for download through the AAPM electronic content website. A wide range of questions regarding financial and demographic funding trends can be addressed by these data. This report has been approved for publication by the AAPM Science Council. © 2017 American Association of Physicists in Medicine.

  7. Benefit Assessment for Metroplex Tactical Runway Configuration Management (mTRCM) in a Simulated Environment

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Robbins, Steven W.; Fenbert, James W.; Hartman, Christopher L.

    2015-01-01

    The System-Oriented Runway Management (SORM) concept is a collection of capabilities focused on more efficient use of runways while considering all of the factors that affect runway use. Tactical Runway Configuration Management (TRCM), one of the SORM capabilities, provides runway configuration and runway usage recommendations and monitors the active runway configuration for suitability given existing factors. This report focuses on the metroplex environment, with two or more proximate airports having arrival and departure operations that are highly interdependent. The myriad factors that affect metroplex operations require consideration in arriving at runway configurations that collectively best serve the system as a whole. To assess the metroplex TRCM (mTRCM) benefit, the performance metrics must be compared with actual historical operations. The historical configuration schedules can be viewed as the schedules produced by subject matter experts (SMEs), and are therefore referred to as the SMEs' schedules. These schedules were obtained from the FAA's Aviation System Performance Metrics (ASPM) database, the most representative information available regarding runway configuration selection by SMEs. This report presents a benefit assessment of total delay, transit time, and throughput efficiency (TE) using the mTRCM algorithm at representative volumes for today's traffic at the New York metroplex (N90).

  8. Data Curation for the Exploitation of Large Earth Observation Products Databases - The MEA system

    NASA Astrophysics Data System (ADS)

    Mantovani, Simone; Natali, Stefano; Barboni, Damiano; Cavicchi, Mario; Della Vecchia, Andrea

    2014-05-01

    National Space Agencies, under the umbrella of the European Space Agency, are working intensively to handle Big Data and to provide solutions for managing and exploiting the related knowledge (metadata, software tools and services). The continuously increasing amount of long-term and historic data in Earth observation (EO) facilities in the form of online datasets and archives, the upcoming satellite observation platforms that will generate an impressive amount of new data, and the new EU approach to data distribution policy make it necessary to address technologies for the long-term management of these datasets, including their consolidation, preservation, distribution, continuation and curation across multiple missions. The management of long EO data time series from continuing or historic missions, with more than 20 years of data available already today, requires technical solutions and technologies that differ considerably from those exploited by existing systems. Several tools, both open source and commercial, already provide technologies to handle data and metadata preparation, access and visualization via OGC standard interfaces. This study describes the Multi-sensor Evolution Analysis (MEA) system and the Data Curation concept as approached and implemented within the ASIM and EarthServer projects, funded by the European Space Agency and the European Commission, respectively.

  9. Fossil-Fuel CO2 Emissions Database and Exploration System

    NASA Astrophysics Data System (ADS)

    Krassovski, M.; Boden, T.

    2012-04-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production each year at global, regional, and national spatial scales. These estimates are vital to climate change research given the strong evidence suggesting that fossil-fuel emissions are responsible for unprecedented levels of carbon dioxide (CO2) in the atmosphere. The CDIAC fossil-fuel emissions time series are based largely on annual energy statistics published for all nations by the United Nations (UN). Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751, before the Industrial Revolution. From these core fossil-fuel CO2 emission time series, CDIAC has developed a number of additional data products to satisfy modeling needs and to address other questions aimed at improving our understanding of the global carbon cycle budget. For example, CDIAC also produces a time series of gridded fossil-fuel CO2 emission estimates and isotopic (e.g., 13C) emission estimates. The gridded data are generated using the methodology described in Andres et al. (2011) and provide monthly and annual estimates for 1751-2008 at 1° latitude by 1° longitude resolution. These gridded emission estimates are being used in the latest IPCC Scientific Assessment (AR4). Isotopic estimates are possible thanks to detailed information for individual nations regarding the carbon content of select fuels (e.g., the carbon signature of natural gas from Russia). CDIAC has recently developed a relational database to house these baseline emission estimates and associated derived products, and a web-based interface to help users worldwide query these data holdings. Users can identify, explore and download desired CDIAC fossil-fuel CO2 emissions data. This presentation introduces the architecture and design of the new relational database and web interface, summarizes the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System, and highlights future plans for expansion of the relational database and interface.

  10. Bibliographic Resources for the Historian of Astronomy

    NASA Astrophysics Data System (ADS)

    Corbin, B. G.

    1999-12-01

    Many large library collections now have online bibliographic catalogs on the web. These provide many hidden resources for the historian of astronomy. Special searching techniques will allow the historian to scan bibliographic records of hundreds of entries relating to biographies of astronomers, collected works of astronomers, ancient and medieval astronomy and many other historical subjects. Abstract databases such as the Astrophysics Data System and ARIBIB are also adding much historical bibliographic information. ARIBIB will eventually contain scanned images of the Astronomischer Jahresbericht containing bibliographic entries for all literature of astronomy from 1899 to 1968 and Astronomy and Astrophysics Abstracts from 1969 to present. Commercial services such as UnCover and FirstSearch provide a means of reaching bibliographic entries for journal and book literature in the history of astronomy which were not easily located in the past. A broad overview of these collections and services will be given, and searching techniques for finding "hidden" bibliographic data will be presented. Web page addresses will be given for all sources covered.

  11. Reusable Rocket Engine Operability Modeling and Analysis

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Komar, D. R.

    1998-01-01

    This paper describes the methodology, model, input data, and analysis results of a reusable launch vehicle engine operability study conducted with the goal of supporting design from an operations perspective. Paralleling performance analyses in schedule and method, this requires the use of metrics in a validated operations model useful for design, sensitivity, and trade studies. Operations analysis in this view is one of several design functions. An operations concept was developed for a given engine concept, and the predicted operations and maintenance processes were incorporated into simulation models. Historical operations data at a level of detail suitable to the model objectives were collected, analyzed, and formatted for use with the models; the simulations were run; and the results were collected and presented. The input data included scheduled and unscheduled timeline and resource information collected into a Space Transportation System (STS) Space Shuttle Main Engine (SSME) historical launch operations database. The results reflect the importance not only of reliable hardware but also of improvements to operations and corrective maintenance processes.

  12. Development of the X-33 Aerodynamic Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Cobleigh, Brent R.

    1998-01-01

    An aerodynamic uncertainty model for the X-33 single-stage-to-orbit demonstrator aircraft has been developed at NASA Dryden Flight Research Center. The model is based on comparisons of historical flight test estimates to preflight wind-tunnel and analysis code predictions of vehicle aerodynamics documented during six lifting-body aircraft and the Space Shuttle Orbiter flight programs. The lifting-body and Orbiter data were used to define an appropriate uncertainty magnitude in the subsonic and supersonic flight regions, and the Orbiter data were used to extend the database to hypersonic Mach numbers. The uncertainty data consist of increments or percentage variations in the important aerodynamic coefficients and derivatives as a function of Mach number along a nominal trajectory. The uncertainty models will be used to perform linear analysis of the X-33 flight control system and Monte Carlo mission simulation studies. Because the X-33 aerodynamic uncertainty model was developed exclusively using historical data rather than X-33 specific characteristics, the model may be useful for other lifting-body studies.

  13. Using Pattern Recognition and Discriminance Analysis to Predict Critical Events in Large Signal Databases

    NASA Astrophysics Data System (ADS)

    Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang

    2009-09-01

    Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available for classifying the criticality of the process is represented by the historical signal database together with the current measurement. This paper presents an approach to detecting and predicting critical events based on pattern recognition and discriminance analysis.
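    As a generic illustration of this kind of classifier, the sketch below trains scikit-learn's linear discriminant analysis, standing in for the paper's discriminance analysis, on labeled historical signal windows and scores a new measurement; the features and labels are synthetic.

        # Generic sketch: classify signal windows as critical/normal with
        # linear discriminant analysis, standing in for the paper's
        # method. Data are synthetic; real features would be extracted
        # from the historical signal database.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(7)

        # Feature vectors per window, e.g. (mean, variance, trend slope).
        normal = rng.normal([0.0, 1.0, 0.0], 0.3, size=(200, 3))
        critical = rng.normal([1.5, 2.0, 0.8], 0.3, size=(40, 3))

        X = np.vstack([normal, critical])
        y = np.array([0] * len(normal) + [1] * len(critical))

        clf = LinearDiscriminantAnalysis().fit(X, y)

        new_window = np.array([[1.2, 1.8, 0.6]])   # current measurement
        print("P(critical) =", clf.predict_proba(new_window)[0, 1].round(3))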

  14. 36 CFR § 1256.24 - How long may access to some records be denied?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE ACCESS TO RECORDS AND DONATED HISTORICAL MATERIALS... of the records in which you are interested available. In the case of electronic structured databases...

  15. Characterizing Mega-Earthquake Related Tsunami on Subduction Zones without Large Historical Events

    NASA Astrophysics Data System (ADS)

    Williams, C. R.; Lee, R.; Astill, S.; Farahani, R.; Wilson, P. S.; Mohammed, F.

    2014-12-01

    Due to recent large tsunami events (e.g., Chile 2010 and Japan 2011), the insurance industry is acutely aware of the importance of managing its exposure to tsunami risk. There are currently few tools available to help establish policies for managing and pricing tsunami risk globally. As a starting point and to help address this issue, Risk Management Solutions Inc. (RMS) is developing a global suite of tsunami inundation footprints. This dataset will include representations of historical events as well as a series of M9 scenarios on subduction zones that have not historically generated mega-earthquakes. The latter set is included to address concerns about the completeness of the historical record for mega-earthquakes, a concern stemming from the fact that the 2011 Tohoku, Japan, earthquake was considerably larger than anything observed in the historical record for that region. Characterizing the source and rupture pattern for subduction zones without historical events is a poorly constrained process. In many cases, the subduction zones can be segmented based on changes in the characteristics of the subducting slab or major ridge systems. For this project, the unit sources from the NOAA propagation database are utilized to leverage the basin-wide modeling included in that dataset. The length of the rupture is characterized based on subduction zone segmentation, and the slip per unit source can be determined from the event magnitude (i.e., M9) and moment balancing. As these events have not occurred historically, there is little to constrain the slip distribution. Sensitivity tests on the potential rupture pattern have been undertaken comparing uniform slip to higher-shallow-slip and tapered slip models. Subduction zones examined include the Makran Trench, the Lesser Antilles, and the Hikurangi Trench. The ultimate goal is to create a series of tsunami footprints to help insurers understand their exposures at risk of tsunami inundation around the world.
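
    The moment-balancing step can be illustrated with a short calculation. Assuming the standard moment-magnitude relation and the NOAA convention of 100 km × 50 km unit sources, the sketch below solves for the uniform slip that balances a target M9 moment over a segmented rupture; the rigidity value is a typical assumption, not a figure from the study.

```python
# Illustrative moment-balancing calculation: given a target magnitude and a
# segmented rupture built from unit sources, solve for the uniform slip.
# Unit-source dimensions follow the NOAA propagation database convention of
# 100 km x 50 km subfaults; the rigidity value is a typical assumption.
MU = 4.0e10               # rigidity, Pa (assumed)
UNIT_AREA = 100e3 * 50e3  # one unit source, m^2

def moment_from_mw(mw: float) -> float:
    """Seismic moment in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.1)

def uniform_slip(mw: float, n_unit_sources: int) -> float:
    """Uniform slip (m) so that mu * area * slip balances the target moment."""
    return moment_from_mw(mw) / (MU * UNIT_AREA * n_unit_sources)

# Example: an M9.0 scenario spanning 10 unit sources (1000 km x 50 km rupture).
print(f"{uniform_slip(9.0, 10):.1f} m of uniform slip")
```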

  16. Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation

    NASA Astrophysics Data System (ADS)

    Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.

    2012-02-01

    In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
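
    The refinement loop that dynamic taxonomies support can be sketched in a few lines: after each user selection, only facet values that still match at least one record are offered as further choices. The records and facets below are invented examples, not the Institute's schema.

```python
from collections import Counter

# Minimal sketch of the dynamic-taxonomy (faceted) interaction: after each
# selection, only facet values that still match at least one record are
# offered for further refinement. Records and facets are invented examples.
records = [
    {"type": "flood", "river": "Po", "century": "19th"},
    {"type": "landslide", "river": None, "century": "20th"},
    {"type": "flood", "river": "Tanaro", "century": "20th"},
    {"type": "muddy-debris flow", "river": None, "century": "20th"},
]

def refine(records, **selected):
    """Keep records matching every selected facet value."""
    return [r for r in records if all(r.get(k) == v for k, v in selected.items())]

def available_facets(records, facet):
    """Count remaining values of a facet, i.e., the choices shown to the user."""
    return Counter(r[facet] for r in records if r.get(facet) is not None)

step1 = refine(records, type="flood")
print(available_facets(step1, "century"))  # Counter({'19th': 1, '20th': 1})
```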

  17. The Mason Water Data Information System (MWDIS): Enabling data sharing and discovery at George Mason University

    NASA Astrophysics Data System (ADS)

    Ferreira, C.; Da Silva, A. L.; Nunes, A.; Haddad, J.; Lawler, S.

    2014-12-01

    Enabling effective data use and re-use in scientific investigations relies heavily not only on data availability but also on efficient data sharing and discovery. The CUAHSI-led Hydrologic Information System (HIS) and its supporting products have paved the way to efficient data sharing and discovery in the hydrological sciences. Based on the CUAHSI-HIS framework concepts for hydrologic data sharing, we developed a system devoted to the George Mason University scientific community to support university-wide data sharing and discovery, as well as real-time data access for situational awareness during extreme events. The internet-based system provides an interface where researchers input data collected from measurement stations and present them to the public in the form of charts, tables, maps, and documents. The system is developed in ASP.NET MVC 4, uses Microsoft SQL Server 2008 R2 as its database management system, and is hosted on Amazon Web Services. Currently the system supports the Mason Watershed Project, providing historical hydrological, atmospheric, and water quality data for the campus watershed and real-time flood conditions on campus. The system is also a gateway to an unprecedented collection of data on hurricane storm surge hydrodynamics in coastal wetlands of the Chesapeake Bay, providing access not only to historical data but also to recent storms such as Hurricane Arthur. Future work includes coupling the system to a real-time flood alert system on campus and, beyond providing data on the World Wide Web, fostering and providing a venue for interdisciplinary collaboration among water scientists in the region.

  18. New public dataset for spotting patterns in medieval document images

    NASA Astrophysics Data System (ADS)

    En, Sovann; Nicolas, Stéphane; Petitjean, Caroline; Jurie, Frédéric; Heutte, Laurent

    2017-01-01

    With advances in technology, a large part of our cultural heritage is becoming digitally available. In particular, in the field of historical document image analysis, there is now a growing need for indexing and data mining tools that allow us to spot and retrieve the occurrences of an object of interest, called a pattern, in a large database of document images. Patterns may present some variability in terms of color, shape, or context, making the spotting of patterns a challenging task. Pattern spotting is a relatively new field of research, still hampered by the lack of available annotated resources. We present a new publicly available dataset named DocExplore dedicated to spotting patterns in historical document images. The dataset contains 1500 images and 1464 queries, and allows the evaluation of two tasks: image retrieval and pattern localization. A standardized benchmark protocol along with ad hoc metrics is provided for a fair comparison of the submitted approaches. We also provide first results obtained with our baseline system on this new dataset, which show that there is room for improvement and should encourage researchers in the document image analysis community to design new systems and submit improved results.
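
    Pattern localization in such benchmarks is typically scored by overlap with the annotated region; the exact DocExplore metrics are defined in the paper, so the sketch below shows only a generic intersection-over-union check with an assumed 0.5 threshold.

```python
# Sketch of a pattern-localization check of the kind such benchmarks use:
# a detection counts as correct when its intersection-over-union (IoU) with
# the annotated box exceeds a threshold. Boxes are (x, y, width, height).
def iou(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

ground_truth = (120, 80, 60, 40)  # annotated pattern occurrence
detection = (130, 85, 55, 42)     # system output
score = iou(ground_truth, detection)
print(f"IoU = {score:.2f}, hit: {score >= 0.5}")
```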

  19. Health and Wellness Technology Use by Historically Underserved Health Consumers: Systematic Review

    PubMed Central

    Perchonok, Jennifer

    2012-01-01

    Background The implementation of health technology is a national priority in the United States and widely discussed in the literature. However, literature about the use of this technology by historically underserved populations is limited. Information on culturally informed health and wellness technology and the use of these technologies to reduce health disparities facing historically underserved populations in the United States is sparse in the literature. Objective To examine ways in which technology is being used by historically underserved populations to decrease health disparities through facilitating or improving health care access and health and wellness outcomes. Methods We conducted a systematic review in four library databases (PubMed, PsycINFO, Web of Science, and Engineering Village) to investigate the use of technology by historically underserved populations. Search strings consisted of three topics (eg, technology, historically underserved populations, and health). Results A total of 424 search phrases applied in the four databases returned 16,108 papers. After review, 125 papers met the selection criteria. Within the selected papers, 30 types of technology, 19 historically underserved groups, and 23 health issues were discussed. Further, almost half of the papers (62 papers) examined the use of technology to create effective and culturally informed interventions or educational tools. Finally, 12 evaluation techniques were used to assess the technology. Conclusions While the reviewed studies show how technology can be used to positively affect the health of historically underserved populations, the technology must be tailored toward the intended population, as personally relevant and contextually situated health technology is more likely than broader technology to create behavior changes. Social media, cell phones, and videotapes are types of technology that should be used more often in the future. Further, culturally informed health information technology should be used more for chronic diseases and disease management, as it is an innovative way to provide holistic care and reminders to otherwise underserved populations. Additionally, design processes should be stated regularly so that best practices can be created. Finally, the evaluation process should be standardized to create a benchmark for culturally informed health information technology. PMID:22652979

  20. Analysis of workplace compliance measurements of asbestos by the U.S. Occupational Safety and Health Administration (1984-2011).

    PubMed

    Cowan, Dallas M; Cheng, Thales J; Ground, Matthew; Sahmel, Jennifer; Varughese, Allysha; Madl, Amy K

    2015-08-01

    The United States Occupational Safety and Health Administration (OSHA) maintains the Chemical Exposure Health Data (CEHD) and the Integrated Management Information System (IMIS) databases, which contain quantitative and qualitative data resulting from compliance inspections conducted from 1984 to 2011. This analysis aimed to evaluate trends in workplace asbestos concentrations over time and across industries by combining the samples from these two databases. From 1984 to 2011, personal air samples ranged from 0.001 to 175 f/cc. Asbestos compliance sampling data associated with the construction, automotive repair, manufacturing, and chemical/petroleum/rubber industries included measurements in excess of 10 f/cc, and were above the permissible exposure limit from 2001 to 2011. The utility of combining the databases was limited by the completeness and accuracy of the data recorded; in this analysis, 40% of the data overlapped between the two databases. Other limitations included sampling bias associated with compliance sampling and errors arising from user-entered data. A clear decreasing trend in both airborne fiber concentrations and the number of asbestos samples collected parallels historically decreasing trends in the consumption of asbestos and declining mesothelioma incidence rates. Although air sampling data indicated that airborne fiber exposure potential was high (>10 f/cc for short- and long-term samples) in some industries (e.g., construction, manufacturing), airborne concentrations have declined significantly over the past 30 years. Recommendations for improving the existing OSHA exposure databases are provided. Copyright © 2015. Published by Elsevier Inc.
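
    A minimal sketch of the combination step might look as follows, assuming hypothetical column names: records from the two databases are concatenated, overlapping entries are deduplicated, and a simple per-decade summary exposes the trend.

```python
import pandas as pd

# Sketch of the database-combination step, with invented column names:
# CEHD holds quantitative results, IMIS the inspection context, and
# overlapping records must be deduplicated before trend analysis.
cehd = pd.DataFrame({
    "inspection_id": [1, 2, 3],
    "year": [1990, 2001, 2010],
    "fibers_per_cc": [0.50, 12.0, 0.05],
})
imis = pd.DataFrame({
    "inspection_id": [2, 3, 4],
    "year": [2001, 2010, 2011],
    "fibers_per_cc": [12.0, 0.05, 0.01],
    "industry": ["construction", "manufacturing", "automotive repair"],
})

combined = (
    pd.concat([cehd, imis], ignore_index=True)
      .drop_duplicates(subset=["inspection_id", "year", "fibers_per_cc"])
)
# Simple trend summary: median measured concentration per decade.
combined["decade"] = (combined["year"] // 10) * 10
print(combined.groupby("decade")["fibers_per_cc"].median())
```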

  1. Vascular knowledge in medieval times was the turning point for the humanistic trend.

    PubMed

    Ducasse, E; Speziale, F; Baste, J C; Midy, D

    2006-06-01

    Knowledge of the history of our surgical specialty may broaden our viewpoint for everyday practice. We illustrate the scientific progress made in medieval times relevant to the vascular system and blood circulation, progress made despite prevailing religious and philosophical dogma. We located all articles concerning vascular knowledge and historical reviews in databases such as MEDLINE, EMBASE, and the Database of Abstracts of Reviews of Effects (DARE). We also explored the register database of the French National Library, the French Medical Inter-University Library (BIUM), the Italian National Library, and the French and Italian libraries in the Vatican. All data were collected and analysed in chronological order. Medieval vascular knowledge was inherited from Greek sources via Byzantine and Arabic writings, with the first challenges to the accepted vascular schema emanating from an Arab physician in the 13th century. Dissection was forbidden, and clerical rules instilled a fear of blood. Major contributions to scientific progress in the vascular field in medieval times came from Ibn al-Nafis and Harvey. Vascular specialists today may feel proud to recall that once religious dogma declined in early medieval times, vascular anatomic and physiological discoveries led the way to scientific progress.

  2. Web application and database modeling of traffic impact analysis using Google Maps

    NASA Astrophysics Data System (ADS)

    Yulianto, Budi; Setiono

    2017-06-01

    Traffic impact analysis (TIA) is a traffic study that aims at identifying the impact of traffic generated by development or change in land use. In addition to identifying the traffic impact, TIA is also equipped with mitigation measures to minimize the arising traffic impact. TIA has become increasingly important since it was defined in legislation as one of the requirements for a building permit application. The legislation has encouraged a number of TIA studies in various cities in Indonesia, including Surakarta. For that reason, it is necessary to study the development of TIA by adopting the concept of Transportation Impact Control (TIC) in the implementation of the TIA standard document and multimodal modeling. This includes TIA standardization for technical guidelines, the database, and inspection, by providing TIA checklists, monitoring, and evaluation. The research was undertaken by collecting historical junction data, modeling the data as a relational database, and building a web user interface with Google Maps libraries for CRUD (Create, Read, Update, and Delete) operations on the TIA data. The result is a system that provides information supporting the improvement and revision of existing TIA documents, making them more transparent, reliable, and credible.

  3. Newspaper archives + text mining = rich sources of historical geo-spatial data

    NASA Astrophysics Data System (ADS)

    Yzaguirre, A.; Smit, M.; Warren, R.

    2016-04-01

    Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
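
    A toy version of the extraction idea is sketched below: scan article text for flood keywords and pull out nearby date and place mentions. A production pipeline would use trained named-entity recognition and geocoding rather than regular expressions; the patterns and the sample sentence are illustrative only.

```python
import re

# Toy sketch of the extraction idea: scan article text for flood keywords
# and pull out nearby date and place mentions. The regexes and the sample
# sentence are invented for illustration.
FLOOD_TERMS = re.compile(r"\b(flood(?:ing|ed)?|inundat\w+|high water)\b", re.I)
DATE = re.compile(r"\b(?:\d{1,2}\s+)?(?:January|February|March|April|May|June|July|"
                  r"August|September|October|November|December)\s+\d{4}\b")

article = ("Heavy rains caused severe flooding along the Saint John River "
           "in April 1973, forcing hundreds from their homes in Fredericton.")

if FLOOD_TERMS.search(article):
    event = {
        "dates": DATE.findall(article),
        # Crude stand-in for place-name recognition: capitalized phrases.
        "capitalized_phrases": re.findall(r"(?:[A-Z][a-z]+\s?){2,}", article),
    }
    print(event)
```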

  4. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment; its goal is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database, which resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines to extend the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.

  5. Local Community Verification of Coastal Erosion Risks in the Arctic: Insights from Alaska's North Slope

    NASA Astrophysics Data System (ADS)

    Brady, M.

    2016-12-01

    During his historic trip to Alaska in 2015, U.S. President Barack Obama announced a collaborative effort to update maps of the Arctic region in anticipation of increased maritime access and resource development and to support climate resilience. Included in this effort is the development of an Arctic-wide satellite-based digital elevation model (DEM) to provide a baseline for monitoring landscape change such as coastal erosion. Focusing on Alaska's North Slope, an objective of this study is to transform emerging Arctic environmental spatial data products, including the new DEM, into information that can support local-level planning and decision-making in the face of extreme coastal erosion and related environmental threats. In pursuit of this, four workshops were held in 2016 in three North Slope villages highly exposed to coastal erosion. The first workshop, with approximately 10 managers in Barrow, solicited feedback on an erosion risk database developed in a previous research stage and installed on the North Slope's planning Web portal. The database includes a physical risk indicator based on factors such as historical erosion and the effects of sea ice loss, summarized at asset locations. After a demonstration of the database, participants discussed usability aspects such as data reliability. The focus of the mapping workshops in Barrow and the two smaller villages of Wainwright and Kaktovik was to verify and expand the risk database by interactively mapping erosion observations and community asset impacts. Using coded stickers and paper maps of the shoreline showing USGS erosion rates, a total of 50 participants provided feedback on erosion data accuracy. Approximately 25 of the 50 participants were elders and hunters, who also provided in-depth community risk information. The workshop with managers confirmed the physical risk factors used in the risk database and revealed that the information may be relied upon to support some development decisions and to better engage developers about erosion risks. Results from the three mapping workshops revealed that most participants agree that the USGS data are consistent with their observations. In-depth contributions from elders and hunters also confirmed the need to monitor the loss of specific assets, including hunting grounds and historic places, and the associated community impacts.

  6. Drug Prices and Emergency Department Mentions for Cocaine and Heroin

    PubMed Central

    Caulkins, Jonathan P.

    2001-01-01

    Objectives. In this report, the author illustrates the historic relation between retail drug prices and emergency department mentions for cocaine and heroin. Methods. Price series based on the Drug Enforcement Administration's System to Retrieve Information From Drug Evidence database were correlated with data on emergency department mentions from the Drug Abuse Warning Network for cocaine (1978–1996) and heroin (1981–1996). Results. A simple model in which emergency department mentions are driven by only prices explains more than 95% of the variation in emergency department mentions. Conclusions. Fluctuations in prices are an important determinant of adverse health outcomes associated with drugs. PMID:11527779
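
    The single-predictor model described can be sketched as a log-log regression of mentions on price, where the slope is the price elasticity. All numbers below are invented for illustration, not DAWN or STRIDE values.

```python
import numpy as np

# Sketch of the kind of single-predictor model described: regress emergency
# department mentions on retail price. All numbers are invented.
price = np.array([220.0, 180.0, 150.0, 120.0, 100.0, 80.0])   # $/pure gram
mentions = np.array([40_000, 55_000, 70_000, 95_000, 115_000, 150_000])

# Fit log(mentions) = a + b*log(price); b is the price elasticity.
b, a = np.polyfit(np.log(price), np.log(mentions), 1)
pred = np.exp(a + b * np.log(price))
r2 = 1 - ((mentions - pred) ** 2).sum() / ((mentions - mentions.mean()) ** 2).sum()
print(f"elasticity = {b:.2f}, R^2 = {r2:.3f}")
```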

  7. Central Colorado Assessment Project (CCAP)-Geochemical data for rock, sediment, soil, and concentrate sample media

    USGS Publications Warehouse

    Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.

    2010-01-01

    This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. In addition, USGS data that have been excluded previously from the NGDB because the data predate earliest USGS geochemical databases, or were once excluded for programmatic reasons, have been included in the CCAP Geochemical Database and are planned to be added to the NGDB.

  8. Deriving spatial patterns from a novel database of volcanic rock geochemistry in the Virunga Volcanic Province, East African Rift

    NASA Astrophysics Data System (ADS)

    Poppe, Sam; Barette, Florian; Smets, Benoît; Benbakkar, Mhammed; Kervyn, Matthieu

    2016-04-01

    The Virunga Volcanic Province (VVP) is situated within the western branch of the East African Rift. The geochemistry and petrology of its volcanic products have been studied extensively, but in a fragmented manner. These products represent a unique collection of silica-undersaturated, ultra-alkaline and ultra-potassic compositions, displaying marked geochemical variations over the area occupied by the VVP. We present a novel spatially explicit database of existing whole-rock geochemical analyses of the VVP volcanics, compiled from international publications, (post-)colonial scientific reports, and PhD theses. In the database, a total of 703 geochemical analyses of whole-rock samples collected from the 1950s until recently have been characterised with a geographical location, eruption source location, analytical results, and uncertainty estimates for each of these categories. Comparative box plots and Kruskal-Wallis H tests on subsets of analyses with contrasting ages or analytical methods suggest that the overall database accuracy is consistent. We demonstrate how statistical techniques such as Principal Component Analysis (PCA) and subsequent cluster analysis allow the identification of clusters of samples with similar major-element compositions. The spatial patterns represented by the contrasting clusters show that both of the historically active volcanoes represent compositional clusters which can be identified based on their contrasting silica and alkali contents. Furthermore, two sample clusters are interpreted to represent the most primitive, deep magma source within the VVP, distinct from the shallow magma reservoirs that feed the eight dominant large volcanoes. The samples from these two clusters systematically originate from locations which (1) are distal compared to the eight large volcanoes and (2) mostly coincide with the surface expressions of rift faults or NE-SW-oriented inherited Precambrian structures which were reactivated during rifting. The lava from the Mugogo eruption of 1957 belongs to these primitive clusters and is the only one known to have erupted outside the current rift valley in historical times. We thus infer a distributed hazard of vent-opening susceptibility in addition to the susceptibility associated with the main Virunga edifices. This study suggests that the statistical analysis of such a geochemical database may help in understanding complex volcanic plumbing systems and the spatial distribution of volcanic hazards in active and poorly known volcanic areas such as the Virunga Volcanic Province.
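
    The PCA-plus-clustering workflow reads naturally as a few lines of analysis code. The sketch below uses synthetic stand-ins for the major-element table (columns such as SiO2, Al2O3, MgO, K2O in wt%); it is not the authors' script.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Sketch of the PCA-plus-clustering workflow on major-element data. The
# arrays below stand in for the 703-sample VVP table; values are synthetic.
rng = np.random.default_rng(7)
primitive = rng.normal([44, 11, 10, 3], 1.0, size=(40, 4))  # low SiO2, high MgO
evolved = rng.normal([52, 16, 4, 6], 1.0, size=(60, 4))     # higher SiO2/alkalis
X = np.vstack([primitive, evolved])

# Standardize, project onto two principal components, then cluster.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# Cluster membership can then be mapped back to sample coordinates to look
# for spatial patterns (e.g., distal vents vs. the large central edifices).
print(np.bincount(labels))
```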

  9. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Zane Hills, Hughes and Shungnak quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 105 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Zane Hills area in the Hughes and Shungnak quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  10. Reanalysis of historical U.S. Geological Survey sediment samples for geochemical data from the western part of the Wrangellia terrane, Anchorage, Gulkana, Healy, Mt. Hayes, Nabesna, and Talkeetna Mountains quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Azain, Jaime S.; Granitto, Matthew

    2014-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. For the geochemical part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 1,682 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from an area covering the western half of the Wrangellia Terrane in the Anchorage, Gulkana, Healy, Mt. Hayes, Nabesna, and Talkeetna Mountains quadrangles of south-central Alaska (fig. 1). USGS was responsible for sample retrieval from the Denver warehouse through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  11. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Kougarok area, Bendeleben and Teller quadrangles, Seward Peninsula, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 302 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Kougarok River drainage as well as smaller adjacent drainages in the Bendeleben and Teller quadrangles, Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  12. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Haines area, Juneau and Skagway quadrangles, southeast Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally some samples were also chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  13. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the northeastern Alaska Range, Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 670 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the northeastern Alaska Range, in the Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  14. What could you do with 400 years of biological history on African Americans? Evaluating the potential scientific benefit of systematic studies of dental and skeletal materials on African Americans from the 17th through 20th centuries

    PubMed Central

    Jackson, Latifa; Cross, Christopher; Clarke, Cameron

    2016-01-01

    Objectives How important is it to be able to reconstruct the lives of a highly diverse, historically recent macroethnic group over the course of 400 years? How many insights into human evolutionary biology and disease susceptibilities could be gained, even with this relatively recent window into the past? In this article, we explore the potential ramifications of a newly constructed dataset of Four Centuries of African American Biological Variation (4Cs). Methods This article provides initial lists of digitized variables formatted as SQL tables for the 17th and 18th century samples and for the 19th and 20th century samples. Results This database is dynamic and new information is added yearly. The database provides novel opportunities for significant insights into the past biological history of this group and three case study applications are detailed for comparative computational systems biology studies of (1) hypertension, (2) the oral microbiome, and (3) mental health disorders. Conclusions The 4Cs dataset is ideal for interdisciplinary “next generation” science research and these data represent a unique step toward the accumulation of historically contextualized Big Data on an underrepresented group known to have experienced differential survival over time. Am. J. Hum. Biol. 28:510–513, 2016. © 2016 The Authors American Journal of Human Biology Published by Wiley Periodicals, Inc. PMID:26749025

  15. Disability in Mexico: a comparative analysis between descriptive models and historical periods using a timeline.

    PubMed

    Sandoval, Hugo; Pérez-Neri, Iván; Martínez-Flores, Francisco; Valle-Cabrera, Martha Griselda Del; Pineda, Carlos

    2017-01-01

    Some interpretations frequently argue that three Disability Models (DM) (Charity, Medical/Rehabilitation, and Social) correspond to historical periods in terms of chronological succession. These views permeate a priori within major official documents on the subject in Mexico. This paper intends to test whether this association is plausible by applying a timeline method. A document search was made with inclusion and exclusion criteria in databases to select representative studies with which to depict milestones in the timelines for each period. The following is demonstrated: 1) models should be considered as categories of analysis and not as historical periods, in that the prevalence of elements of the three models is present to date, and 2) the association between disability models and historical periods results in teleological interpretations of the history of disability in Mexico.

  16. The construction and periodicity analysis of natural disaster database of Alxa area based on Chinese local records

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Mingzhong, Tian; Hengli, Wang

    2010-05-01

    Chinese hand-written local records originated in the first century. Generally, these local records cover the geography, history, customs, education, products, people, historical sites, and writings of an area. Thanks to such endeavors, the record of China's natural history has had almost no "dark ages" over the 5,000-year course of its civilization. A compilation of all meaningful historical data on natural disasters that took place in Alxa, Inner Mongolia (the second-largest desert area in China) is used here for the construction of a 500-year, high-resolution database. The database is divided into subsets according to the type of natural disaster, such as sand-dust storms, droughts, and cold waves. By applying trend, correlation, wavelet, and spectral analyses to these data, we can estimate the periodicity of the different natural disasters, detect and quantify similarities and patterns among their records, and finally take these results in aggregate to identify a strong and coherent cyclicity through the last 500 years that may serve as the driving mechanism of these geological hazards. Based on the periodicity obtained from the above analysis, the paper discusses the prospects for forecasting natural disasters from historical records and suitable measures to reduce disaster losses. Keywords: Chinese local records; Alxa; natural disasters; database; periodicity analysis
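
    The periodicity step can be illustrated with a simple periodogram. The sketch below embeds an arbitrary ~35-year cycle in a synthetic 500-year count series, standing in for whatever cyclicity the Alxa records actually contain.

```python
import numpy as np

# Sketch of the periodicity step: a periodogram of a yearly disaster-count
# series. The 500-year series here is synthetic, with an embedded ~35-year
# cycle; the real analysis would use the compiled Alxa counts.
rng = np.random.default_rng(3)
years = np.arange(1500, 2000)
counts = 5 + 2 * np.sin(2 * np.pi * years / 35.0) + rng.normal(0, 1, years.size)

detrended = counts - counts.mean()
power = np.abs(np.fft.rfft(detrended)) ** 2
freqs = np.fft.rfftfreq(years.size, d=1.0)   # cycles per year

dominant = freqs[np.argmax(power[1:]) + 1]   # skip the zero frequency
print(f"dominant period ~ {1.0 / dominant:.1f} years")
```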

  17. The plant phenological online database (PPODB): an online database for long-term phenological data

    NASA Astrophysics Data System (ADS)

    Dierenbach, Jonas; Badeck, Franz-W.; Schaber, Jörg

    2013-09-01

    We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951, ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de.

  18. Documenting Architectural Heritage in Bahia, Brazil, Using Spherical Photogrammetry

    NASA Astrophysics Data System (ADS)

    De Amorim, A. L.; Fangi, G.; Malinverni, E. S.

    2013-07-01

    Cultural Heritage disappears at a rate higher than we are able not only to restore but even to document: human and natural factors, negligence or, worse, deliberate demolition endanger the collective Architectural Heritage (AH). According to CIPA statements, recording is important and has to follow certain guidelines. Architectural and Urban Heritage data have to be historically related, critically assessed, and analyzed before being organized according to a thematic structure and made available for further uses. This paper presents the experience developed by the Laboratory of Computer Graphics applied to Architecture and Design (LCAD), at the Architecture School of the Federal University of Bahia (FAUFBA), Brazil, in cooperation with the Università Politecnica delle Marche (UNIVPM, DICEA Department), Italy, in documenting architectural heritage. The research has so far been carried out in the historical sites of Bahia, such as the Pelourinho neighborhood, a UNESCO World Heritage Site. Other historical sites, such as the cities of Lençóis and Mucugê in the Chapada Diamantina region, are included in the survey plan. The aim is to build a technological platform based on low-cost digital technologies and open source tools, such as Panoramic Spherical Photogrammetry, Spatial Database, Geographic Information Systems, Three-dimensional Geometric Modeling, and CAD technology, for the collection, validation, and dissemination of AH data.

  19. Strategies for medical data extraction and presentation part 2: creating a customizable context and user-specific patient reference database.

    PubMed

    Reiner, Bruce

    2015-06-01

    One of the greatest challenges facing healthcare professionals is the ability to directly and efficiently access relevant data from the patient's healthcare record at the point of care, tailored both to the context of the task being performed and to the needs and preferences of the individual end-user. In radiology practice, the relative inefficiency of imaging data organization and manual workflow requirements serves as an impediment to historical imaging data review. At the same time, clinical data retrieval is even more problematic due to the quality and quantity of data recorded at the time of order entry, along with the relative lack of information system integration. One approach to address these data deficiencies is to create a multi-disciplinary patient referenceable database consisting of high-priority, actionable data within the cumulative patient healthcare record, in which predefined criteria are used to categorize and classify imaging and clinical data in accordance with anatomy, technology, pathology, and time. The population of this referenceable database can be performed through a combination of manual and automated methods, with an additional step of data verification introduced for quality control. Once created, these referenceable databases can be filtered at the point of care to provide context- and user-specific data tailored to the task being performed and individual end-user requirements.
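
    One plausible shape for such a referenceable record and its point-of-care filter is sketched below, using the categorization axes named in the abstract (anatomy, technology, pathology, and time); the field names and example values are invented.

```python
from dataclasses import dataclass
from datetime import date

# Sketch of a "referenceable database" record and a point-of-care filter,
# using the categorization axes named in the abstract. Field names and
# example values are invented for illustration.
@dataclass
class ReferenceItem:
    anatomy: str     # e.g., "chest"
    technology: str  # e.g., "CT"
    pathology: str   # e.g., "pulmonary nodule"
    acquired: date
    summary: str

def filter_items(items, anatomy=None, technology=None, since=None):
    """Return items matching the current task context."""
    out = []
    for it in items:
        if anatomy and it.anatomy != anatomy:
            continue
        if technology and it.technology != technology:
            continue
        if since and it.acquired < since:
            continue
        out.append(it)
    return out

items = [ReferenceItem("chest", "CT", "pulmonary nodule", date(2014, 3, 2), "6 mm nodule RUL"),
         ReferenceItem("abdomen", "MRI", "hepatic lesion", date(2013, 7, 9), "stable lesion")]
print(filter_items(items, anatomy="chest", since=date(2014, 1, 1)))
```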

  20. An expanded mammal mitogenome dataset from Southeast Asia

    PubMed Central

    Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Mikkel-Holger, S. Sinding; Riddhi, P. Patel; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Burton, K. Lim; Stephen, J. Rossiter; Wilting, Andreas

    2017-01-01

    Abstract Southeast (SE) Asia is one of the most biodiverse regions in the world, holding approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created an urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From these data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 for modern samples and ×55.2 for historical samples. This dataset represents 52 species, of which 30 had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia, as it greatly increases the likelihood that metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in Western and SE Asian museums should serve as additional valuable material to further enrich this reference database. PMID:28873965

  1. An expanded mammal mitogenome dataset from Southeast Asia.

    PubMed

    Mohd Salleh, Faezah; Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Mikkel-Holger, S Sinding; Riddhi, P Patel; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Burton, K Lim; Stephen, J Rossiter; Wilting, Andreas; Gilbert, M Thomas P

    2017-08-01

    Southeast (SE) Asia is one of the most biodiverse regions in the world, holding approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created an urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From these data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 for modern samples and ×55.2 for historical samples. This dataset represents 52 species, of which 30 had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia, as it greatly increases the likelihood that metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in Western and SE Asian museums should serve as additional valuable material to further enrich this reference database. © The Author 2017. Published by Oxford University Press.

  2. Digital mining claim density map for federal lands in Idaho: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Idaho as of March 1997. Mining claim data are earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims and mill and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database, with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (fig. 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  3. Digital mining claim density map for federal lands in Oregon: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Oregon as of March 1997. Mining claim data are earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims and mill and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database, with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (fig. 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.

  4. An improved database of coastal flooding in the United Kingdom from 1915 to 2016

    PubMed Central

    Haigh, Ivan D.; Ozsoy, Ozgun; Wadey, Matthew P.; Nicholls, Robert J.; Gallop, Shari L.; Wahl, Thomas; Brown, Jennifer M.

    2017-01-01

    Coastal flooding caused by extreme sea levels can produce devastating and wide-ranging consequences. The ‘SurgeWatch’ v1.0 database systematically documents and assesses the consequences of historical coastal flood events around the UK. The original database was inevitably biased due to the inconsistent spatial and temporal coverage of sea-level observations utilised. Therefore, we present an improved version integrating a variety of ‘soft’ data such as journal papers, newspapers, weather reports, and social media. SurgeWatch2.0 identifies 329 coastal flooding events from 1915 to 2016, a more than fivefold increase compared to the 59 events in v1.0. Moreover, each flood event is now ranked using a multi-level categorisation based on inundation, transport disruption, costs, and fatalities: from 1 (Nuisance) to 6 (Disaster). For the 53 most severe events ranked Category 3 and above, an accompanying event description based upon the Source-Pathway-Receptor-Consequence framework was produced. Thus, SurgeWatch v2.0 provides the most comprehensive and coherent historical record of UK coastal flooding. It is designed to be a resource for research, planning, management and education. PMID:28763054
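
    A ranking of this kind is straightforward to encode as a cascade of threshold checks. The sketch below is only in the spirit of SurgeWatch's six categories; the thresholds and field names are invented, not the published criteria.

```python
# Sketch of a multi-level event ranking in the spirit of SurgeWatch's six
# categories; the thresholds and field names here are invented, not the
# published criteria.
def categorise(event: dict) -> int:
    """Return a severity category from 1 (Nuisance) to 6 (Disaster)."""
    if event.get("fatalities", 0) >= 10 or event.get("cost_gbp", 0) > 1e9:
        return 6
    if event.get("fatalities", 0) > 0:
        return 5
    if event.get("properties_flooded", 0) > 1000:
        return 4
    if event.get("properties_flooded", 0) > 100 or event.get("transport_disrupted", False):
        return 3
    if event.get("properties_flooded", 0) > 0:
        return 2
    return 1

print(categorise({"properties_flooded": 250, "transport_disrupted": True}))  # 3
```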

  5. A Measure of Total Research Impact Independent of Time and Discipline

    PubMed Central

    Pepe, Alberto; Kurtz, Michael J.

    2012-01-01

    Authorship and citation practices evolve with time and differ by academic discipline. As such, indicators of research productivity based on citation records are naturally subject to historical and disciplinary effects. We observe these effects on a corpus of astronomer career data constructed from a database of refereed publications. We employ a simple mechanism to measure research output using author and reference counts available in bibliographic databases to develop a citation-based indicator of research productivity. The total research impact (tori) quantifies, for an individual, the total amount of scholarly work that others have devoted to his/her work, measured in the volume of research papers. A derived measure, the research impact quotient (riq), is an age-independent measure of an individual's research ability. We demonstrate that these measures are substantially less vulnerable to temporal debasement and cross-disciplinary bias than the most popular current measures. The proposed measures of research impact, tori and riq, have been implemented in the Smithsonian/NASA Astrophysics Data System. PMID:23144782
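
    A minimal sketch of this style of citation normalisation is given below; it assumes, as a reading of the abstract to be checked against the paper itself, that each citation contributes 1/(number of authors of the cited paper × number of references in the citing paper), and that riq divides the square root of tori by career length.

    ```python
    # Toy sketch of tori/riq-style normalisation (assumed definition, see
    # lead-in). Each citation is weighted down by the citing paper's reference
    # count and the cited paper's author count.
    import math

    def tori(cited_papers):
        """cited_papers: list of (n_authors, [n_refs of each citing paper])."""
        total = 0.0
        for n_authors, citing_ref_counts in cited_papers:
            for n_refs in citing_ref_counts:
                total += 1.0 / (n_authors * n_refs)
        return total

    def riq(tori_value, career_years):
        return math.sqrt(tori_value) / career_years

    papers = [(2, [30, 45, 60]), (5, [25, 50])]  # toy career
    t = tori(papers)
    print(round(t, 4), round(riq(t, 10.0), 4))
    ```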

  6. The Cambridge Structural Database in retrospect and prospect.

    PubMed

    Groom, Colin R; Allen, Frank H

    2014-01-13

    The Cambridge Crystallographic Data Centre (CCDC) was established in 1965 to record numerical, chemical and bibliographic data relating to published organic and metal-organic crystal structures. The Cambridge Structural Database (CSD) now stores data for nearly 700,000 structures and is a comprehensive and fully retrospective historical archive of small-molecule crystallography. Nearly 40,000 new structures are added each year. As X-ray crystallography celebrates its centenary as a subject, and the CCDC approaches its own 50th year, this article traces the origins of the CCDC as a publicly funded organization and its onward development into a self-financing charitable institution. Principally, however, we describe the growth of the CSD and its extensive associated software system, and summarize its impact and value as a basis for research in structural chemistry, materials science and the life sciences, including drug discovery and drug development. Finally, the article considers the CCDC's funding model in relation to open access and open data paradigms.

  7. Big Data in Organ Transplantation: Registries and Administrative Claims

    PubMed Central

    Massie, Allan B.; Kucirka, Lauren; Segev, Dorry L.

    2015-01-01

    The field of organ transplantation benefits from large, comprehensive, transplant-specific national datasets available to researchers. In addition to the widely-used OPTN-based registries (the UNOS and SRTR datasets) and USRDS datasets, there are other publicly available national datasets, not specific to transplantation, which have historically been underutilized in the field of transplantation. Of particular interest are the Nationwide Inpatient Sample (NIS) and State Inpatient Databases (SID), produced by the Agency for Healthcare Research and Quality (AHRQ). The United States Renal Data System (USRDS) database provides extensive data relevant to studies of kidney transplantation. Linkage of publicly available datasets to external data sources such as private claims or pharmacy data provides further resources for registry-based research. Although these resources can transcend some limitations of OPTN-based registry data, they come with their own limitations, which must be understood to avoid biased inference. This review discusses different registry-based data sources available in the United States, as well as the proper design and conduct of registry-based research. PMID:25040084

  8. Database of historically documented springs and spring flow measurements in Texas

    USGS Publications Warehouse

    Heitmuller, Franklin T.; Reece, Brian D.

    2003-01-01

    Springs are naturally occurring features that convey excess ground water to the land surface; they represent a transition from ground water to surface water. Water issues through one opening, multiple openings, or numerous seeps in the rock or soil. The database of this report provides information about springs and spring flow in Texas, including spring names, identification numbers, locations, and, if available, water source and use. This database does not include every spring in Texas, but is limited to an aggregation of selected digital and hard-copy data of the U.S. Geological Survey (USGS), the Texas Water Development Board (TWDB), and Capitol Environmental Services.

  9. An expert system shell for inferring vegetation characteristics

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann; Harrison, Patrick R.

    1993-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge-based system that infers vegetation characteristics from reflectance data. VEG is described in detail in several references. The first-generation version of VEG was extended. In the first year of this contract, an interface to a file of unknown cover type data was constructed. An interface that allowed the results of VEG to be written to a file was also implemented. A learning system that learned class descriptions from a database of historical cover type data and then used the learned class descriptions to classify an unknown sample was built. This system had an interface that integrated it into the rest of VEG. The VEG subgoal PROPORTION.GROUND.COVER was completed and a number of additional techniques that inferred the proportion ground cover of a sample were implemented. This work was previously described. The work carried out in the second year of the contract is described. The historical cover type database was removed from VEG and stored as a series of flat files that are external to VEG. An interface to the files was provided. The framework and interface for two new VEG subgoals that estimate the atmospheric effect on reflectance data were built. A new interface that allows the scientist to add techniques to VEG without assistance from the developer was designed and implemented. A prototype Help System that allows the user to get more information about each screen in the VEG interface was also added to VEG.

  10. Restoration of an academic historical gross pathology collection-refreshed impact on current medical teaching?

    PubMed

    Eichhorn, Philip; Andraschke, Udo; Dross, Fritz; Geppert, Carol I; Hartmann, Arndt; Rau, Tilman T

    2018-05-10

    The Declaration of Leiden calls for the conservation of pathological-anatomical collections as cultural heritage. The Institute of Pathology of the Friedrich-Alexander-University Erlangen-Nuremberg owns macroscopic pathological-anatomical specimens reaching back over 150 years. The purpose of this work is to examine the impact, meaning, and perception of such historical preparations in the current medical curriculum; in addition, the experiences from the renovation process can serve as a template for other institutes. All preparations were documented, photographed, and catalogued in an electronic database. During a restoration period, a series of didactically suitable specimens was professionally restored, with the help of interested students enrolled in a dedicated course. In a second step, the specimens were integrated into the regular teaching of macroscopic pathology. An evaluation was carried out on two student cohorts, taught with and without historical specimens, by means of a questionnaire with 23 items and two free-text fields. In total, 1261 specimens were registered, covering diseases of almost the entire human body, with strong representation of the cardiovascular, urinary, gastrointestinal, and central nervous systems. Exceptionally rare and untreated cases with medical relevance were found and stepwise implemented into the curriculum. In the evaluation, students noted positively that the courses became livelier and more interactive, and they appreciated a more comprehensive overview and a better understanding of macroscopic pathology; however, they asked for more self-study time with the specimens. The authenticity of historical specimens contrasts with the trend toward virtual "online" didactic methods. The stereoscopic view of often untreated and therefore unbiased cases fosters a skill-oriented, deeper understanding of diseases. In conclusion, historical specimens are regaining interest and didactic value, especially in an era of declining autopsy rates.

  11. The Admissions Office Goes Scientific.

    ERIC Educational Resources Information Center

    Bryant, Peter; Crockett, Kevin

    1993-01-01

    Data-based planning and management is revolutionizing college student recruitment. Data analysis focuses on historical trends, marketing and recruiting strategies, cost-effectiveness strategy, and markets. Data sources include primary market demographics, geo-demographics, secondary sources, student price response information, and institutional…

  12. Classification of Chemicals Based On Structured Toxicity Information

    EPA Science Inventory

    Thirty years and millions of dollars worth of pesticide registration toxicity studies, historically stored as hardcopy and scanned documents, have been digitized into highly standardized and structured toxicity data within the Toxicity Reference Database (ToxRefDB). Toxicity-bas...

  13. Digital mining claim density map for federal lands in Nevada: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Nevada as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at scales of 1:100,000 or smaller.

  14. Digital mining claim density map for federal lands in Utah: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Utah as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at scales of 1:100,000 or smaller.

  15. Digital mining claim density map for federal lands in California: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in California as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at scales of 1:100,000 or smaller.

  16. Digital mining claim density map for federal lands in New Mexico: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in New Mexico as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at scales of 1:100,000 or smaller.

  17. Digital mining claim density map for federal lands in Arizona: 1996

    USGS Publications Warehouse

    Hyndman, Paul C.; Campbell, Harry W.

    1999-01-01

    This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Arizona as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at scales of 1:100,000 or smaller.

  18. An Intelligent computer-aided tutoring system for diagnosing anomalies of spacecraft in operation

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1993-01-01

    A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over 200 rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. When the user selects the novice mode, the system automatically gives detailed explanations and descriptions of terms and reasoning as the session progresses, in a sense teaching the user. As such, it is an effective tutoring tool. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The system is available on-line and uses the C Language Integrated Production System (CLIPS), an expert shell developed by the NASA Johnson Space Center AI Laboratory in Houston.
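
    A toy backward-chaining driver in the spirit described above is sketched below; the real system is written in CLIPS, and the rules, fact names, and question flow here are invented purely for illustration.

    ```python
    # Toy backward-chaining sketch: to prove a candidate cause, recursively
    # prove its premises, asking the user for leaf facts. Illustrative only.
    RULES = {
        "surface_charging": ["eclipse_exit", "anomaly_at_local_midnight"],
        "single_event_upset": ["high_energy_proton_flux", "memory_bit_flip"],
    }

    def prove(goal, facts, ask):
        if goal in facts:
            return True
        if goal not in RULES:          # leaf fact: prompt the user
            return ask(goal)
        return all(prove(p, facts, ask) for p in RULES[goal])

    answers = {"eclipse_exit": True, "anomaly_at_local_midnight": True}
    ask = lambda q: answers.get(q, False)
    for cause in RULES:
        if prove(cause, set(), ask):
            print("plausible cause:", cause)   # -> surface_charging
    ```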

  19. The AMMA database

    NASA Astrophysics Data System (ADS)

    Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim

    2010-05-01

    The AMMA project includes aircraft, ground-based and ocean measurements, intensive use of satellite data, and diverse modelling studies. The AMMA database therefore aims at storing a large amount and a wide variety of data, and at providing the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data for West Africa from 1850 onwards (operational networks and previous scientific programmes); - satellite products from past and current satellites, (re-)mapped onto a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations, processed in the same way as the satellite products. Before accessing the data, every user has to sign the AMMA data and publication policy. This charter covers only the use of data for scientific purposes and categorically excludes the redistribution of data to third parties and usage for commercial applications. Collaboration between data producers and users, and mention of the AMMA project in any publication, are also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris, and OMP, Toulouse). Users can access the data of both data centres through a single web portal. The website is composed of several modules: - Registration: forms to register and to read and sign the data use charter on a user's first visit; - Data access interface: a user-friendly tool for building a data extraction request by selecting criteria such as location, time, and parameters; a request can cover local, satellite, and model data; - Documentation: a catalogue of all the available data and their metadata. These tools have been developed using standard, free languages and software: a Linux system with an Apache web server and a Tomcat application server; J2EE tools (JSF and Struts frameworks, Hibernate); the relational database management systems PostgreSQL and MySQL; and an OpenLDAP directory. In order to facilitate access to the data by African scientists, the complete system has been mirrored at the AGRHYMET Regional Centre in Niamey and has been operational there since January 2009. Users can now access metadata and request data through either of two equivalent portals: http://database.amma-international.org or http://amma.agrhymet.ne/amma-data.
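
    The sketch below illustrates the kind of criteria-based extraction request the portal builds (bounding box, time window, parameter); the table and column names are invented stand-ins, and an embedded SQLite database replaces the PostgreSQL/MySQL back ends for the sake of a self-contained example.

    ```python
    # Illustrative sketch of a criteria-based extraction request. The schema
    # is invented; the real AMMA back end uses PostgreSQL/MySQL.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE observations
                    (station TEXT, lat REAL, lon REAL, time TEXT,
                     parameter TEXT, value REAL)""")
    conn.execute("INSERT INTO observations VALUES ('Niamey', 13.5, 2.1, "
                 "'2006-08-01T12:00', 'rainfall_mm', 14.2)")

    def extract(conn, bbox, t0, t1, parameter):
        """Select observations inside a lat/lon box and time window."""
        lat0, lat1, lon0, lon1 = bbox
        return conn.execute(
            """SELECT * FROM observations
               WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?
                 AND time BETWEEN ? AND ? AND parameter = ?""",
            (lat0, lat1, lon0, lon1, t0, t1, parameter)).fetchall()

    print(extract(conn, (10, 16, -5, 10), '2006-07-01', '2006-09-01', 'rainfall_mm'))
    ```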

  20. Reliability of Beam Loss Monitors System for the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2004-11-01

    The use of superconducting magnets in high-energy colliders opens challenging failure scenarios and brings new criticalities for the protection of the whole system. For the LHC beam loss protection system, the failure rate and availability requirements have been evaluated using the Safety Integrity Level (SIL) approach, with a downtime cost evaluation used as input. The systems contributing most to the final SIL value are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while for slower losses it is assisted by other systems, such as the quench protection system and the cryogenic system. For the BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, temperature and radiation damage experimental data, and standard databases. All the data have been processed with reliability software (Isograph). The analysis ranges from component data to the system configuration.

  1. Enhancements to the NASA Astrophysics Science Information and Abstract Service

    NASA Astrophysics Data System (ADS)

    Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Murray, S. S.

    1995-05-01

    The NASA Astrophysics Data System Astrophysics Science Information and Abstract Service, the extension of the ADS Abstract Service, continues to expand rapidly in both use and capabilities. Each month the service is used by about 4,000 different people and returns about 1,000,000 pieces of bibliographic information. Among the recent additions to the system are: 1. Whole Text Access. In addition to the ApJ Letters, we now have the whole text of the ApJ on-line; soon we will have AJ and Rev. Mexicana. Discussions with other publishers are in progress. 2. Space Instrumentation Database. We now provide a second abstract service, covering papers related to space instruments. This is larger than the astronomy and astrophysics database in terms of total abstracts. 3. Reference Books and Historical Journals. We have begun putting the SAO Annals and the HCO Annals on-line. We have put the Handbook of Space Astronomy and Astrophysics by M.V. Zombeck (Cambridge U.P.) on-line. 4. Author Abstracts. We can now include original abstracts in addition to those we get from the NASA STI Abstracts Database. We have included abstracts for A&A in collaboration with the CDS in Strasbourg, and are collaborating with the AAS and the ASP on others. We invite publishers and editors of journals and conference proceedings to include their original abstracts in our service; send inquiries via e-mail to ads@cfa.harvard.edu. 5. Author Notes. We now accept notes and comments from authors of articles in our database. These are arbitrary HTML files and may contain pointers to other WWW documents; they are listed along with the abstracts, whole text, and data available in the index listing for every reference. The ASIAS is available at: http://adswww.harvard.edu/

  2. External access to ALICE controls conditions data

    NASA Astrophysics Data System (ADS)

    Jadlovský, J.; Jadlovská, A.; Sarnovský, J.; Jajčišin, Š.; Čopík, M.; Jadlovská, S.; Papcun, P.; Bielek, R.; Čerkala, J.; Kopčík, M.; Chochula, P.; Augustinus, A.

    2014-06-01

    ALICE controls data produced by the commercial SCADA system WINCCOA are stored in an ORACLE database on the private experiment network. The SCADA system allows for basic access and processing of the historical data, but more advanced analysis requires tools like ROOT and therefore needs a separate access method to the archives. In the present scenario, detector experts create simple WINCCOA scripts, which retrieve and store data in a form usable for further studies. This relatively simple procedure generates a lot of administrative overhead: users have to request the data, experts are needed to run the scripts, and the results have to be exported outside the experiment network. The new mechanism profits from a database replica running on the CERN campus network. Access to this database is not restricted, and there is no risk of generating a heavy load affecting the operation of the experiment. The tools presented in this paper allow access to these data. Users can use web-based tools to generate requests consisting of the data identifiers and the period of time of interest. The administrators maintain full control over the data: an authorization and authentication mechanism helps to assign privileges to selected users and to restrict access to certain groups of data. An advanced caching mechanism allows users to profit from the presence of already processed data sets; this feature significantly reduces the time required for debugging, as the retrieval of raw data can last tens of minutes. A highly configurable client allows for information retrieval bypassing the interactive interface; this method is, for example, used by ALICE Offline to extract operational conditions after a run is completed. Last but not least, the software can easily be adapted to any underlying database structure and is therefore not limited to WINCCOA.

  3. The Better Mousetrap...Can Be Built by Engineers.

    ERIC Educational Resources Information Center

    McBride, Matthew

    2003-01-01

    Describes the growth of the INSPEC database developed by the Institution of Electrical Engineers. Highlights include an historical background of its growth from "Science Abstracts"; production methods, including computerization; indexing, including controlled (thesaurus-based), uncontrolled, chemical, and numerical indexing; and the…

  4. International exploration of Mars. A special bibliography

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This bibliography lists 173 reports, articles, and other documents introduced into the NASA Scientific and Technical Information Database on the exploration of Mars. Historical references are cited for background. The bibliography was created for the 1991 session of the International Space University.

  5. Compositional descriptor-based recommender system for the materials discovery

    NASA Astrophysics Data System (ADS)

    Seko, Atsuto; Hayashi, Hiroyuki; Tanaka, Isao

    2018-06-01

    Structures and properties of many inorganic compounds have been collected historically. However, these collections cover only a very small portion of possible inorganic crystals, which implies the existence of numerous currently unknown compounds. A powerful machine-learning strategy is required to discover new inorganic compounds among all chemical combinations. Herein we propose a descriptor-based recommender-system approach to estimate the relevance of chemical compositions where crystals can be formed [i.e., chemically relevant compositions (CRCs)]. In addition to the data-driven compositional similarity used in the literature, the use of compositional descriptors as prior knowledge is helpful for the discovery of new compounds. We validate our recommender systems in two ways. First, one database is used to construct a model while another is used for validation. Second, we estimate the phase stability of compounds at expected CRCs using density functional theory calculations.
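
    As a hedged illustration of descriptor-based relevance scoring, the sketch below ranks candidate compositions by cosine similarity to known CRCs; the toy descriptor vectors are stand-ins for the paper's compositional descriptors.

    ```python
    # Toy descriptor-based relevance score: rank candidates by similarity to
    # known chemically relevant compositions (CRCs). Descriptors are invented.
    import math

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)

    known_crcs = {"LiCoO2": [1.0, 0.5, 2.0], "NaFeO2": [1.1, 0.6, 2.0]}
    candidates = {"LiNiO2": [1.0, 0.55, 2.0], "LiC6": [0.2, 3.0, 0.1]}

    for name, desc in candidates.items():
        score = max(cosine(desc, d) for d in known_crcs.values())
        print(name, round(score, 3))  # higher = more plausible CRC
    ```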

  6. Fire in the Earth System: Bridging data and modeling research

    USGS Publications Warehouse

    Hantson, Srijn; Kloster, Silvia; Coughlan, Michael; Daniau, Anne-Laure; Vanniere, Boris; Bruecher, Tim; Kehrwald, Natalie; Magi, Brian I.

    2016-01-01

    Significant changes in wildfire occurrence, extent, and severity in areas such as western North America and Indonesia in 2015 have made the issue of fire increasingly salient in both the public and scientific spheres. Biomass combustion rapidly transforms land cover, smoke pours into the atmosphere, radiative heat from fires initiates dramatic pyrocumulus clouds, and the repeated ecological and atmospheric effects of fire can even impact regional and global climate. Furthermore, fires have a significant impact on human health, livelihoods, and social and economic systems. Modeling and data-based methods to understand fire have rapidly coevolved over the past decade. Satellite and ground-based data about present-day fire are widely available for applications in research and fire management. Fire modeling has developed in part because of the evolution of vegetation and Earth system modeling efforts, but parameterizations and validation are largely focused on the present day because of the availability of satellite data. Charcoal deposits in sediment cores have emerged as a powerful method to evaluate trends in biomass burning extending back to the Last Glacial Maximum and beyond, and these records provide a context for present-day fire. The Global Charcoal Database version 3 compiles about 700 charcoal records, and more than 1,000 records are expected for the future version 4. Together, these advances offer a pathway to explore how the strengths of fire data and fire modeling could address the weaknesses in the overall understanding of human-climate-fire linkages. A community of researchers studying fire in the Earth system, with individual expertise spanning paleoecology, paleoclimatology, modern ecology, archaeology, climate and Earth system modeling, statistics, geography, biogeochemistry, and atmospheric science, met at an intensive workshop in Massachusetts to explore new research directions and initiate new collaborations. Research themes, which emerged from the workshop participants via pre-workshop surveys, focused on addressing the following questions: What are the climatic, ecological, and human drivers of fire regimes, both past and future? What is the role of humans in shaping historical fire regimes? How does fire ecology affect land cover changes, biodiversity, carbon storage, and human land uses? What are the historical fire trends and their impacts across biomes? Are these impacts local and/or regional? Are the fire trends of the last two decades unprecedented from a historical perspective? The workshop aimed to develop testable hypotheses about fire, climate, vegetation, and human interactions by leveraging the confluence of proxy, observational, and model data related to decadal- to millennial-scale fire activity on our planet. New research directions focused on broad interdisciplinary approaches to highlight how knowledge about past fire activity could provide a more complete understanding of the predictive capacity of fire models and inform fire policy in the face of our changing climate.

  7. Historical sources on climate and extreme events before XX century in Calabria (Italy)

    NASA Astrophysics Data System (ADS)

    Aurora Pasqua, Angela; Petrucci, Olga

    2014-05-01

    Damaging Hydrogeological Events (DHEs) are occurrences of destructive phenomena, such as landslides and floods, triggered by extreme rain events. Because of the huge damage they can cause to people and property, DHEs are often described in a wide range of historical sources. The historical series of DHEs that affected a study region can supply useful information about the climatic trend of the area. Moreover, it can reveal temporal and spatial increases in vulnerability in sectors where urbanization increased over time. It can also highlight vulnerability variations that occurred through the decades in relation to specific defensive measures undertaken (or abandoned) in order to prevent damage caused by either landslides or floods. We present the historical series of catastrophic DHEs that affected Calabria, a Mediterranean region located in southern Italy. The data come from the database named ASICal (the Italian acronym for historically flooded areas in Calabria), which was built at the beginning of 2000 at CNR-IRPI of Cosenza and has been continuously updated since then. Currently, this database includes more than 11,000 records about floods and landslides that have occurred in Calabria since the XVI century. These data come from different information sources such as newspapers, archives of regional and national agencies, scientific and technical reports, and on-site survey reports. ASICal is constantly updated: the updating concerns both the current DHEs that affect the region every year and the results of specific historical research that we regularly perform in order to fill data gaps for older epochs. In this work we present the results of a recent survey carried out in some regional public libraries, focusing on the early-mid XIX century. The types of data sources available for the regional framework are described, and a sketch of the DHE trend during the last three centuries is presented. Moreover, a panoramic view of both proxy data and irregularly measured parameters concerning the climatic trend of the region, obtained from the analyzed historical sources, is also shown.

  8. Documentary evidence of past floods in Europe and their utility in flood frequency estimation

    NASA Astrophysics Data System (ADS)

    Kjeldsen, T. R.; Macdonald, N.; Lang, M.; Mediero, L.; Albuquerque, T.; Bogdanowicz, E.; Brázdil, R.; Castellarin, A.; David, V.; Fleig, A.; Gül, G. O.; Kriauciuniene, J.; Kohnová, S.; Merz, B.; Nicholson, O.; Roald, L. A.; Salinas, J. L.; Sarauskiene, D.; Šraj, M.; Strupczewski, W.; Szolgay, J.; Toumazis, A.; Vanneuville, W.; Veijalainen, N.; Wilson, D.

    2014-09-01

    This review outlines the use of documentary evidence of historical flood events in contemporary flood frequency estimation in European countries. The study shows that despite widespread consensus in the scientific literature on the utility of documentary evidence, the actual migration from academic to practical application has been limited. A detailed review of flood frequency estimation guidelines from different countries showed that the value of historical data is generally recognised, but practical methods for systematic and routine inclusion of this type of data into risk analysis are in most cases not available. Studies of historical events were identified in most countries, and good examples of national databases attempting to collate the available information were identified. The conclusion is that there is considerable potential for improving the reliability of the current flood risk assessments by harvesting the valuable information on past extreme events contained in the historical data sets.

  9. WOVOdat: A New Tool for Managing and Accessing Data of Worldwide Volcanic Unrest

    NASA Astrophysics Data System (ADS)

    Venezky, D. Y.; Malone, S. D.; Newhall, C. G.

    2002-12-01

    WOVOdat (the World Organization of Volcano Observatories database of volcanic unrest) will for the first time bring together data on worldwide volcanic seismicity, ground deformation, fumarolic activity, and other changes within or adjacent to volcanic systems. Although a large body of data and experience has been built up over the past century, we currently have no means of accessing that collective experience for use during crises and for research. WOVOdat will be the central resource of a data management system; other components will include utilities for data input and archiving, structured data retrieval, and data mining; educational modules; and links to institutional databases such as IRIS (global seismicity), UNAVCO (global GPS coordinates and strain vectors), and the Smithsonian's Global Volcanism Program (historical eruptions). Data will be geospatially and time-referenced to provide four-dimensional images of how volcanic systems respond to magma intrusion, regional strain, and other disturbances prior to and during eruption. As part of the design phase, a small WOVOdat team is currently collecting information from observatories about their data types, formats, and local data management. The database schema is being designed so that responses to common yet complex queries are rapid (e.g., where else has similar unrest occurred and what was the outcome?) while also allowing more detailed research analysis of relationships between various parameters (e.g., what do the temporal relations between long-period earthquakes, transient deformation, and spikes in gas emission tell us about the geometry and physical properties of magma and a volcanic edifice?). We are excited by the potential of WOVOdat, and we invite participation in its design and development. Next steps involve formalizing and testing the design and developing utilities for translating data of various formats into common formats. The large job of populating the database will follow, and eventually we will have a great new tool for eruption forecasting and research.

  10. An artificial system for selecting the optimal surgical team.

    PubMed

    Saberi, Nahid; Mahvash, Mohsen; Zenati, Marco

    2015-01-01

    We introduce an intelligent system to optimize a team composition based on the team's historical outcomes, and we apply this system to compose a surgical team. The system relies on a record of the procedures performed in the past. The optimal team composition is the one with the lowest probability of an unfavorable outcome. We use probability theory and the inclusion-exclusion principle to model the probability of the team outcome for a given composition. A probability value is assigned to each person in the database, and the probability of a team composition is calculated from these values. The model allows the probability of all possible team compositions to be determined even when no procedure has been recorded for some compositions. From an analytical perspective, assembling an optimal team is equivalent to minimizing the overlap of team members who have a recurring tendency to be involved in procedures with unfavorable results. A conceptual example shows the accuracy of the proposed system in obtaining the optimal team.
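
    A toy version of this selection procedure is sketched below; it assumes independent individual probabilities, so the inclusion-exclusion sum reduces to 1 - prod(1 - p_i), which is simpler than the paper's model.

    ```python
    # Toy team selection: enumerate compositions and pick the one with the
    # lowest failure probability, computed by inclusion-exclusion under an
    # independence assumption (illustrative, not the paper's full model).
    from itertools import combinations

    def team_failure_prob(probs):
        """P(at least one member is implicated) via inclusion-exclusion."""
        n = len(probs)
        total = 0.0
        for k in range(1, n + 1):
            sign = (-1) ** (k + 1)
            for subset in combinations(probs, k):
                term = 1.0
                for p in subset:
                    term *= p
                total += sign * term
        return total

    staff = {"A": 0.05, "B": 0.10, "C": 0.02, "D": 0.08}
    best = min(combinations(staff, 3),
               key=lambda team: team_failure_prob([staff[m] for m in team]))
    print(best, round(team_failure_prob([staff[m] for m in best]), 4))
    # ('A', 'C', 'D') 0.1435
    ```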

  11. The Ogallala Agro-Climate Tool (Technical Description)

    USDA-ARS?s Scientific Manuscript database

    A Visual Basic agro-climate application capable of estimating irrigation demand and crop water use over the Ogallala Aquifer region is described here. The application’s meteorological database consists of daily precipitation and temperature data from 141 U.S. Historical Climatology Network stations ...

  12. Past and Future Trends in Light Truck Sales.

    DOT National Transportation Integrated Search

    1981-08-01

    This report uses the Wharton EFA Motor Vehicle Demand Model (Mark II) and its associated databases to discuss and analyze past and future trends in the Light Duty Truck market. The dynamic historical growth in this market and its implications for ene...

  13. Alliance Building in the Information and Online Database Industry.

    ERIC Educational Resources Information Center

    Alexander, Johanna Olson

    2001-01-01

    Presents an analysis of information industry alliance formation using environmental scanning methods. Highlights include why libraries and academic institutions should be interested; a literature review; historical context; industry and market structures; commercial and academic models; trends; and implications for information providers,…

  14. Database for the degradation risk assessment of groundwater resources (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Polemio, M.; Dragone, V.; Mitolo, D.

    2003-04-01

    The risk of quality degradation and availability lowering of groundwater resources has been characterised for a wide coastal plain (Basilicata region, Southern Italy), an area covering 40 km along the Ionian Sea and extending 10 km inland. The quality degradation is due to two phenomena: pollution caused by the discharge of waste water coming from urban areas, and salt pollution, which is related to seawater intrusion but not exclusively. The availability lowering is due to overexploitation but also to drought effects. To this purpose, historical data from 1,130 wells have been collected. The wells, homogeneously distributed in the area, were the source of geological, stratigraphical, hydrogeological and geochemical data. In order to manage space-related information via a GIS, a database system has been devised to encompass all the surveyed wells and the body of information available per well. Geo-databases were designed to comprise the four types of data collected: a database including geometrical, geological and hydrogeological data on wells (WDB), a database devoted to chemical and physical data on groundwater (CDB), a database including geotechnical parameters (GDB), and a database concerning piezometric and hydrological (rainfall, air temperature, river discharge) data (HDB). The record pertaining to each well is identified in these databases by the progressive number of the well itself. The databases are designed as follows: a) the WDB contains 1,158 records of 28 and 31 fields, mainly describing the geometry of the well and the stratigraphy; b) the CDB encompasses data on the 157 wells for which chemical and physical analyses of groundwater have been carried out; more than one record is associated with these wells, owing to periodic monitoring and analysis; c) the GDB covers the 61 wells for which geotechnical parameters were obtained from soil samples taken at various depths; d) the HDB is designed to permit the analysis of long time series (from 1918) of piezometric data, monitored in more than 60 wells, together with temperature, rainfall and river discharge data. Based on these geo-databases, geostatistical processing of the data has permitted the characterisation of the degradation risk of the groundwater resources of a wide coastal aquifer.
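
    The well-keyed linkage described above can be pictured with the toy schema below; the column names are illustrative only, and an embedded SQLite database stands in for the actual GIS-coupled databases.

    ```python
    # Toy sketch of four databases joined on a shared well identifier, in the
    # spirit of the WDB/CDB/GDB/HDB design. Columns are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE wdb (well_id INTEGER PRIMARY KEY, depth_m REAL, lithology TEXT);
    CREATE TABLE cdb (well_id INTEGER, sample_date TEXT, chloride_mg_l REAL);
    CREATE TABLE hdb (well_id INTEGER, obs_date TEXT, piezometric_head_m REAL);
    """)
    conn.execute("INSERT INTO wdb VALUES (42, 85.0, 'sand')")
    conn.execute("INSERT INTO cdb VALUES (42, '1999-06-01', 950.0)")
    conn.execute("INSERT INTO hdb VALUES (42, '1999-06-01', 2.3)")

    # Cross-database query per well: rising salinity plus declining head
    # could flag seawater intrusion; here we simply join the records.
    rows = conn.execute("""
      SELECT w.well_id, w.lithology, c.chloride_mg_l, h.piezometric_head_m
      FROM wdb w JOIN cdb c USING (well_id) JOIN hdb h USING (well_id)
    """).fetchall()
    print(rows)
    ```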

  15. Effect of the sequence data deluge on the performance of methods for detecting protein functional residues.

    PubMed

    Garrido-Martín, Diego; Pazos, Florencio

    2018-02-27

    The exponential accumulation of new sequences in public databases is expected to improve the performance of all approaches for predicting protein structural and functional features. Nevertheless, this has never been assessed or quantified for some widely used methodologies, such as those aimed at detecting functional sites and functional subfamilies in protein multiple sequence alignments. Using raw protein sequences as the only input, these approaches can detect fully conserved positions as well as those with a family-dependent conservation pattern. Both types of residues are routinely used as predictors of functional sites and, consequently, understanding how the sequence content of the databases affects them is relevant and timely. In this work we evaluate how the growth and change over time of the sequence databases' content affect five sequence-based approaches for detecting functional sites and subfamilies. We do that by recreating historical versions of the multiple sequence alignments that would have been obtained in the past, based on the database contents at different time points covering a period of 20 years. Applying the methods to these historical alignments allows the temporal variation in their performance to be quantified. Our results show that the number of families to which these methods can be applied increases sharply with time, while their ability to detect potentially functional residues remains almost constant. These results are informative for the methods' developers and final users, and may have implications for the design of new sequencing initiatives.
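
    The core of the evaluation, rebuilding the sequence set that would have been available at each past date, can be sketched as a simple date filter; the records below are invented.

    ```python
    # Sketch of the "historical snapshot" idea: filter sequences by their
    # deposition date, then rerun the detection method on each snapshot.
    from datetime import date

    seqs = [
        {"id": "P1", "deposited": date(1998, 3, 1)},
        {"id": "P2", "deposited": date(2005, 7, 9)},
        {"id": "P3", "deposited": date(2016, 1, 20)},
    ]

    def snapshot(seqs, cutoff):
        """Return the sequence set as it would have existed at `cutoff`."""
        return [s["id"] for s in seqs if s["deposited"] <= cutoff]

    for year in (2000, 2010, 2018):
        print(year, snapshot(seqs, date(year, 1, 1)))
    ```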

  16. The CTBTO Link to the database of the International Seismological Centre (ISC)

    NASA Astrophysics Data System (ADS)

    Bondar, I.; Storchak, D. A.; Dando, B.; Harris, J.; Di Giacomo, D.

    2011-12-01

    The CTBTO Link to the database of the International Seismological Centre (ISC) is a project to provide access to seismological data sets maintained by the ISC using specially designed interactive tools. The Link is open to National Data Centres and to the CTBTO. By means of graphical interfaces and database queries tailored to the needs of the monitoring community, users are given access to a multitude of products. These include the ISC and ISS bulletins, covering the seismicity of the Earth since 1904; nuclear and chemical explosions; the EHB bulletin; the IASPEI Reference Event list (ground-truth database); and the IDC Reviewed Event Bulletin (REB). The searches are divided into three main categories: the Area Based Search (a spatio-temporal search of the ISC Bulletin), the REB Search (a spatio-temporal search for specific events in the REB) and the IMS Station Based Search (a search for historical patterns in the reports of seismic stations close to a particular IMS seismic station). The outputs are HTML-based web pages presenting a simplified version of the ISC Bulletin with the most relevant parameters, with access to the ISC, GT, EHB and REB Bulletins in IMS1.0 format for single or multiple events. The CTBTO Link offers a tool to view REB events in the context of historical seismicity, look at observations reported by non-IMS networks, and investigate station histories and residual patterns for stations registered in the International Seismographic Station Registry.

  17. Map and digital database of sedimentary basins and indications of petroleum in the Central Alaska Province

    USGS Publications Warehouse

    Troutman, Sandra M.; Stanley, Richard G.

    2003-01-01

    This database and accompanying text depict historical and modern reported occurrences of petroleum both in wells and at the surface within the boundaries of the Central Alaska Province. These data were compiled from previously published and unpublished sources and were prepared for use in the 2002 U.S. Geological Survey petroleum assessment of Central Alaska, Yukon Flats region. Indications of petroleum are described as oil or gas shows in wells, oil or gas seeps, or outcrops of oil shale or oil-bearing rock and include confirmed and unconfirmed reports. The scale of the source map limits the spatial resolution (scale) of the database to 1:2,500,000 or smaller.

  18. Modeling the Historical Flood Events in France

    NASA Astrophysics Data System (ADS)

    Ali, Hani; Blaquière, Simon

    2017-04-01

    We present simulation results for different scenarios based on the flood model developed by the AXA Global P&C CAT Modeling team. The model uses a Digital Elevation Model (DEM) with 75 m resolution, a hydrographic system (DB Carthage), daily rainfall data from "Météo France", water levels from "HYDRO Banque", the French Hydrological Database (www.hydro.eaufrance.fr), for more than 1500 stations, a hydrological model from IRSTEA, and an in-house hydraulic tool. In particular, the model re-simulates the most important and costly flood events that occurred during the past decade in France: we present the re-simulated meteorological conditions since 1964 and estimate the insurance loss incurred on the current AXA portfolio of individual risks.

  19. Research on Historic Bim of Built Heritage in Taiwan - a Case Study of Huangxi Academy

    NASA Astrophysics Data System (ADS)

    Lu, Y. C.; Shih, T. Y.; Yen, Y. N.

    2018-05-01

    Digital archiving technology for conserving cultural heritage is an important subject nowadays. The Taiwanese Ministry of Culture continues its efforts to align the concepts and technology of conservation with international conventions. However, the products of these different technologies are not yet integrated, owing to a lack of research and development in this field, and there is currently no effective HBIM schema for Taiwanese cultural heritage. The aim of this research is to establish an HBIM schema for Chinese built heritage in Taiwan. The proposed method starts from the perspective of the components of built heritage buildings and investigates the important properties of these components through major international charters and Taiwanese cultural heritage conservation laws. An object-oriented class diagram and an ontology at the component scale were then defined to clarify the concepts and increase interoperability. A historical database was established for the historical information on components and brought into the BIM concept in order to build a 3D model of heritage objects that can be used for visualization. An integration platform was developed to let users browse and manipulate the database and the 3D model simultaneously. In addition, this research evaluated the feasibility of the method in a case study of the Huangxi Academy in Taiwan. The conclusions show that the class diagram can support the establishment of the database and its application to different Chinese built heritage objects, and that the ontology helps convey knowledge and increase interoperability. In comparison with traditional documentation methods, the querying results of the platform were more accurate and less prone to human error.

  20. The «New Map of Rome» by Giambattista Nolli: a precise representation of the urban space in the 18th century

    NASA Astrophysics Data System (ADS)

    Lelo, Keti; Travaglini, Carlo Maria

    2010-05-01

    The paper reports on the ongoing project "The Historic Atlas of Modern Rome", implemented by CROMA (Centro di ateneo per lo studio di Roma) - University Roma Tre. The project combines research in urban history with geographical information systems, and its main objective is to study the "historic environment" of Rome and its transformations. In 1748, Giovanni Battista Nolli (1692-1756) published his «New Map of Rome» (Nuova Pianta di Roma). This work is the first geometrically correct representation of Rome within the city walls, and the only map derived from a topographical survey whose procedures are known. The map is a precious source of information and a valid cartographic basis for studying the successive phases of the city's development. The presentation illustrates the characteristics of this cartographic source, the results obtained from the georeferencing process, and the construction of a GIS for the city of Rome in the 18th century. The described methodology underlies the first volume of the Atlas, which will shortly be published in print as well as in a digital version, on a CD-ROM with a graphical interface that permits interactive interrogation of the map and databases.

  1. Ictalurids in Iowa’s streams and rivers: Status, distribution, and relationships with biotic integrity

    USGS Publications Warehouse

    Sindt, Anthony R.; Fischer, Jesse R.; Quist, Michael C.; Pierce, Clay

    2011-01-01

    Anthropogenic changes to Iowa’s landscape have greatly altered lotic systems, with consequent effects on the biodiversity of freshwater fauna. Ictalurids are a diverse group of fishes and play an important ecological role in aquatic ecosystems. However, little is known about their distribution and status in lotic systems throughout Iowa. The purpose of this study was to describe the distribution of ictalurids in Iowa and examine their relationship with the ecological integrity of streams and rivers. Historical data (i.e., 1884–2002) compiled for the Iowa Aquatic Gap Analysis Project (IAGAP) were used to detect declines in the distribution of ictalurids in Iowa streams and rivers at stream segment and watershed scales. Eight variables characterizing ictalurid assemblages were used to evaluate relationships with index of biotic integrity (IBI) ratings. Comparisons of recent and historic data from the IAGAP database indicated that 9 of Iowa’s 10 ictalurid species experienced distribution declines at one or more spatial scales. Analysis of variance indicated that ictalurid assemblages differed among samples with different IBI ratings. Specifically, total ictalurid, sensitive ictalurid, and Noturus spp. richness increased as IBI ratings increased. Results indicate that declines in ictalurid species distributions and in biotic integrity are related, and that management strategies aimed at improving habitat and increasing biotic integrity will benefit ictalurid species.

  2. Evaluation of relational and NoSQL database architectures to manage genomic annotations.

    PubMed

    Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard

    2016-12-01

    While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences.
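
    The three measured operations (storage, indexing, query retrieval) can be pictured with the toy harness below; an embedded SQLite database stands in for the MySQL, MongoDB, and PostgreSQL systems actually benchmarked, so the numbers it prints say nothing about the paper's comparison.

    ```python
    # Toy benchmark harness for the store / index / query steps on
    # dbSNP-like annotation rows. SQLite is a stand-in back end only.
    import sqlite3, time, random

    def timed(label, fn):
        t0 = time.perf_counter()
        fn()
        print(f"{label}: {time.perf_counter() - t0:.3f}s")

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE snp (rsid TEXT, chrom TEXT, pos INTEGER, allele TEXT)")
    rows = [(f"rs{i}", "chr1", random.randint(1, 250_000_000), "A")
            for i in range(100_000)]

    timed("store", lambda: conn.executemany("INSERT INTO snp VALUES (?,?,?,?)", rows))
    timed("index", lambda: conn.execute("CREATE INDEX idx_pos ON snp (chrom, pos)"))
    timed("query", lambda: conn.execute(
        "SELECT COUNT(*) FROM snp WHERE chrom='chr1' AND pos BETWEEN ? AND ?",
        (1_000_000, 2_000_000)).fetchone())
    ```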

  3. The National Eutrophication Survey: lake characteristics and historical nutrient concentrations

    NASA Astrophysics Data System (ADS)

    Stachelek, Joseph; Ford, Chanse; Kincaid, Dustin; King, Katelyn; Miller, Heather; Nagelkirk, Ryan

    2018-01-01

    Historical ecological surveys serve as a baseline and provide context for contemporary research, yet many of these records are not preserved in a way that ensures their long-term usability. The National Eutrophication Survey (NES) database is currently only available as scans of the original reports (PDF files) with no embedded character information. This limits its searchability, machine readability, and the ability of current and future scientists to systematically evaluate its contents. The NES data were collected by the US Environmental Protection Agency between 1972 and 1975 as part of an effort to investigate eutrophication in freshwater lakes and reservoirs. Although several studies have manually transcribed small portions of the database in support of specific studies, there have been no systematic attempts to transcribe and preserve the database in its entirety. Here we use a combination of automated optical character recognition and manual quality assurance procedures to make these data available for analysis. The performance of the optical character recognition protocol was found to be linked to variation in the quality (clarity) of the original documents. For each of the four archival scanned reports, our quality assurance protocol found an error rate between 5.9 and 17 %. The goal of our approach was to strike a balance between efficiency and data quality by combining entry of data by hand with digital transcription technologies. The finished database contains information on the physical characteristics, hydrology, and water quality of about 800 lakes in the contiguous US (Stachelek et al., 2017; https://doi.org/10.5063/F1639MVD). Ultimately, this database could be combined with more recent studies to generate meta-analyses of water quality trends and spatial variation across the continental US.
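
    The manual quality assurance step reduces to comparing OCR output against hand-verified values and reporting an error rate per report; a minimal sketch, with invented data, is below.

    ```python
    # Minimal QA sketch: fraction of OCR-transcribed values that disagree
    # with hand-verified values. The data here are invented.
    def error_rate(ocr_values, verified_values):
        assert len(ocr_values) == len(verified_values)
        errors = sum(1 for a, b in zip(ocr_values, verified_values) if a != b)
        return errors / len(verified_values)

    ocr      = ["12.4", "O.31", "7.8", "5.O2", "0.66"]  # note O/0 confusions
    verified = ["12.4", "0.31", "7.8", "5.02", "0.66"]
    print(f"{error_rate(ocr, verified):.1%}")  # 40.0%
    ```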

  4. [Cocaine: historical background, neurobiology of the addiction and relapse and therapeutic perspectives].

    PubMed

    Silva, M I; Citó, M C; Vasconcelos, P F; Vasconcelos, S M; Sousa, F C

    2010-01-01

    Following more than a century of cocaine hydrochloride extraction from Erythroxylon coca, this drug still represents a serious social and public health problem around the world. This paper provides a review of cocaine, focusing on its historical background and on the different neurotransmission systems involved, and addresses therapeutic aspects of drug addiction. An electronic search of the Medline, PubMed, and Lilacs databases was carried out to select classic and recent studies relevant to the issues addressed. Previous studies have shown high vulnerability to relapse to cocaine seeking following prolonged withdrawal periods. Such behavioral consequences have been credited to changes in brain neurotransmitters induced by repeated cocaine use. In recent years, the growing abuse of this drug has mobilized researchers worldwide to seek new therapies that reduce the behavioral and neurochemical changes resulting from addiction. Numerous advances in the treatment of cocaine abuse and dependence have emerged in recent years. However, research aiming at a safe and effective pharmacological treatment for users remains necessary and should be continued.

  5. S.I.I.A for monitoring crop evolution and anomaly detection in Andalusia by remote sensing

    NASA Astrophysics Data System (ADS)

    Rodriguez Perez, Antonio Jose; Louakfaoui, El Mostafa; Munoz Rastrero, Antonio; Rubio Perez, Luis Alberto; de Pablos Epalza, Carmen

    2004-02-01

    A new remote sensing application was developed and incorporated into the Agrarian Integrated Information System (S.I.I.A), a project that integrates the regional farming databases from a geographical point of view, adding new value and uses to the original information. The project is supported by the Studies and Statistical Service of the Regional Government Ministry of Agriculture and Fisheries (CAP). The process integrates NDVI values from daily NOAA-AVHRR and monthly IRS-WIFS images with crop class location maps. Local agrarian and meteorological information is being incorporated into the workflow to produce a synergistic effect. An updated crop-growth status is obtained for each 10-day period, by crop class, sensor type (including data fusion), and administrative boundary. The crop database for the last ten years (1992-2002) has been organized according to these variables and can be accessed through an application that assists users with crop statistical analysis. Users can perform multi-temporal and multi-regional comparative analyses, both within a single year and from a historical perspective. Moreover, real-time crop anomalies can be detected and analyzed. Most of the output products will be made available on the Internet in the near future through an online application.
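    The anomaly detection described above amounts to comparing a current composite against its historical distribution for the same period. The sketch below shows one minimal way to express that idea; it is not the S.I.I.A implementation, and the NDVI values, baseline years, and two-sigma threshold are illustrative assumptions.

```python
# A minimal sketch of the anomaly idea, not the S.I.I.A implementation.
import numpy as np

rng = np.random.default_rng(0)
# Ten years of NDVI composites for one crop class and one 10-day period.
historical = rng.normal(0.62, 0.05, size=10)   # synthetic 1992-2002 baseline
current = 0.48                                 # this year's composite

mean, std = historical.mean(), historical.std(ddof=1)
z = (current - mean) / std
if abs(z) > 2:   # flag standardized anomalies beyond two sigma
    print(f"anomaly: NDVI={current:.2f}, z={z:+.1f} vs historical baseline")
```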

  6. Software for pest-management science: computer models and databases from the United States Department of Agriculture-Agricultural Research Service.

    PubMed

    Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D

    2003-01-01

    We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls, and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software and database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support, except within cooperating R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.

  7. Informational database methodology for urban risk analysis.Case study: the historic centre of Bucharest

    NASA Astrophysics Data System (ADS)

    Armas, I.; Dumitrascu, S.

    2009-04-01

    The urban environment often faces deterioration of the built space and declining quality of environmental factors, in general terms an unsatisfactory quality of life. Given the complexity of the urban environment and the strong human impact, this setting can be considered an ideal place for a wide range of risks to emerge, favoured by external interventions and by the dynamics of internal changes that occur in the urban system, often unexpectedly. In this context, historic centre areas are even more vulnerable because of the age of the buildings and their socio-cultural value. The present study focuses on the development of a rapid assessment system for urban risks, with emphasis on earthquakes. The importance of the study lies in the high vulnerability of urban settlements, which can be considered socio-ecological systems characterized by a maximum risk level. In general, cities are highly susceptible areas because of their compactness and high degree of land occupancy, and the Bucharest municipality is no exception. The street and sewerage networks have disrupted the natural system that resulted from the evolution of the lake-river system in the Upper Pleistocene-Holocene, and the intense construction activity represents a pressure that has not been measured and that calls for an interdisciplinary methodological approach. In particular, Bucharest's specific situation is one of seismic risk compounded by explosive urban growth and the advanced state of degradation of the buildings. In this context, the Lipscani sector of the capital's historic centre is an area of maximum seismic vulnerability, a result of its location in the Dâmbovita River meadow, on the brow of the 80 m terrace, but above all of the degradation of buildings that have accumulated the effects of repeated earthquakes. The historic centre of Bucharest has not only a cultural function but is also a densely populated area, factors that favour a high susceptibility level. In addition, the majority of the buildings, built between 1875 and 1940, fall into the first and second seismic risk categories, their age implying increased vulnerability to natural hazards. The methodology was developed through the contribution of three partner universities in Bucharest: the University of Bucharest, the Academy of Economic Studies, and the Technical University of Constructions. The method was based on the analysis and processing of digital and statistical spatial information derived from 1:500 topographic plans, satellite images, archives, and historical maps used to identify the age of the buildings. An important stage was the field investigation, which yielded the data used in the assessment of the buildings: year of construction, location and vicinity, height, number of floors, state and function of the building, equipment, and construction type. The information collected in the field, together with data from the digitization of the orthophotomaps, was entered into ArcGIS to compile the database. Furthermore, the team from the Cybernetics Faculty developed a dedicated software package in Visual Studio and SQL Server to import the survey sheets into the GIS for statistical processing.
The final product of the study is a program whose main functions include editing, analysis based on selected factors (individual or grouped), and the display of building information as maps or 3D visualizations. The strengths of the resulting information system are its extended range of applicability, short processing time, accessibility, and capacity to handle a large amount of information, making it an adequate instrument for the needs of a susceptible population.
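    To make the kind of database concrete, the sketch below shows one plausible layout for the surveyed building attributes. It is an assumed schema, not the project's actual Visual Studio / SQL Server design, and uses SQLite so it runs self-contained; the field names and the seismic-class query are illustrative.

```python
# A minimal sketch of an assumed schema, not the project's actual design.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
CREATE TABLE buildings (
    id            INTEGER PRIMARY KEY,
    year_built    INTEGER,   -- 1875-1940 for most of the historic centre
    floors        INTEGER,
    state         TEXT,      -- surveyed condition
    function      TEXT,      -- residential, commercial, cultural, ...
    seismic_class INTEGER    -- 1 = highest seismic risk category
)""")
db.execute("INSERT INTO buildings VALUES (1, 1892, 3, 'degraded', "
           "'commercial', 1)")
# Example query: buildings in the two highest seismic risk categories.
rows = db.execute("SELECT id, year_built FROM buildings "
                  "WHERE seismic_class <= 2").fetchall()
print(rows)
```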

  8. Development of an Adaptable Display and Diagnostic System for the Evaluation of Tropical Cyclone Forecasts

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Burek, T.; Halley-Gotway, J.

    2015-12-01

    NCAR's Joint Numerical Testbed Program (JNTP) focuses on the evaluation of experimental forecasts of tropical cyclones (TCs), with the goal of developing new research tools and diagnostic evaluation methods that can be transitioned to operations. Recent activities include the development of new TC forecast verification methods and of an adaptable TC display and diagnostic system. The next-generation display and diagnostic system is being developed to support the evaluation needs of the U.S. National Hurricane Center (NHC) and the broader TC research community. The new hurricane display and diagnostic capabilities allow forecasters and research scientists to examine the performance of operational and experimental models in greater depth. The system is built on modern, flexible, platform-independent technology, including OpenLayers mapping tools. Forecast track and intensity, along with the associated observed track information, are stored in an efficient MySQL database. The system provides an easy-to-use interactive display and diagnostic tools to examine forecast tracks stratified by intensity. Consensus forecasts can be computed and displayed interactively. The system is designed to display information for both real-time and historical TCs, and the display configurations are easily adaptable to end-user preferences. Ongoing enhancements include improved capabilities for stratification and evaluation of historical best tracks, development and implementation of additional methods to stratify and compute consensus hurricane track and intensity forecasts, and improved graphical display tools. The display is also being enhanced to incorporate gridded forecast, satellite, and sea surface temperature fields. The presentation will provide an overview of the display and diagnostic system's development and a demonstration of its current capabilities.
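    A consensus forecast of the kind mentioned above can be as simple as averaging member-model positions at each lead time. The sketch below illustrates that idea only; it is not the NHC's or the JNTP system's method, the model names and coordinates are placeholders, and plain lat/lon averaging is a deliberate simplification at these scales.

```python
# A minimal sketch of a consensus track; names and values are placeholders.
import numpy as np

# (lat, lon) per model at 12, 24, and 36 h lead times.
tracks = {
    "model_A": [(25.0, -75.0), (26.1, -76.4), (27.3, -77.9)],
    "model_B": [(25.1, -75.2), (26.4, -76.8), (27.8, -78.5)],
    "model_C": [(24.9, -74.9), (26.0, -76.1), (27.1, -77.6)],
}
# Average member positions at each lead time.
consensus = np.mean([np.array(t) for t in tracks.values()], axis=0)
for lead, (lat, lon) in zip((12, 24, 36), consensus):
    print(f"t+{lead}h consensus: {lat:.1f}N {abs(lon):.1f}W")
```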

  9. A Global Survey of Deep Underground Facilities; Examples of Geotechnical and Engineering Capabilities, Achievements, Challenges (Mines, Shafts, Tunnels, Boreholes, Sites and Underground Facilities for Nuclear Waste and Physics R&D): A Guide to Interactive Global Map Layers, Table Database, References and Notes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tynan, Mark C.; Russell, Glenn P.; Perry, Frank V.

    These associated tables, references, notes, and report present a synthesis of some notable geotechnical and engineering information used to create four interactive layer maps for selected: 1) deep mines and shafts; 2) existing, considered or planned radioactive waste management deep underground studies or disposal facilities; 3) deep large diameter boreholes, and 4) physics underground laboratories and facilities from around the world. These data are intended to facilitate user access to basic information and references regarding “deep underground” facilities, history, activities, and plans. In general, the interactive maps and database provide each facility’s approximate site location, geology, and engineered features (e.g., access, geometry, depth, diameter, year of operations, groundwater, lithology, host unit name and age, basin; operator, management organization, geographic data, nearby cultural features, other). Although the survey is not comprehensive, it is representative of many of the significant existing and historical underground facilities discussed in the literature addressing radioactive waste management and deep mined geologic disposal safety systems. The global survey is intended to support and to inform: 1) interested parties and decision makers; 2) radioactive waste disposal and siting option evaluations, and 3) safety case development applicable to any mined geologic disposal facility as a demonstration of historical and current engineering and geotechnical capabilities available for use in deep underground facility siting, planning, construction, operations and monitoring.

  10. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.

  11. Quantifying autonomous vehicles national fuel consumption impacts: A data-rich approach

    DOE PAGES

    Chen, Yuche; Gonder, Jeffrey; Young, Stanley; ...

    2017-11-06

    Autonomous vehicles are drawing significant attention from governments, manufacturers and consumers. Experts predict them to be the primary means of transportation by the middle of this century. Recent literature shows that vehicle automation has the potential to alter traffic patterns, vehicle ownership, and land use, which may affect fuel consumption from the transportation sector. In this paper, we developed a data-rich analytical framework to quantify system-wide fuel impacts of automation in the United States by integrating (1) a dynamic vehicle sales, stock, and usage model, (2) an historical transportation network-level vehicle miles traveled (VMT)/vehicle activity database, and (3) estimates of automation's impacts on fuel efficiency and travel demand. The vehicle model considers dynamics in vehicle fleet turnover and fuel efficiency improvements of conventional and advanced vehicle fleet. The network activity database contains VMT, free-flow speeds, and historical speeds of road links that can help us accurately identify fuel-savings opportunities of automation. Based on the model setup and assumptions, we found that the impacts of automation on fuel consumption are quite wide-ranging - with the potential to reduce fuel consumption by 45% in our 'Optimistic' case or increase it by 30% in our 'Pessimistic' case. Second, implementing automation on urban roads could potentially result in larger fuel savings compared with highway automation because of the driving features of urban roads. Lastly, through scenario analysis, we showed that the proposed framework can be used for refined assessments as better data on vehicle-level fuel efficiency and travel demand impacts of automation become available.
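    The scenario arithmetic behind such wide-ranging estimates can be illustrated with multipliers on baseline travel and fuel economy. The sketch below is a toy calculation, not the authors' model; the baseline values and scenario multipliers are invented placeholders, not numbers from the paper.

```python
# A toy version of the scenario arithmetic; all values are placeholders.
baseline_vmt = 3.2e12   # vehicle-miles traveled per year (illustrative)
baseline_mpg = 25.0     # fleet-average fuel economy (illustrative)

scenarios = {
    "optimistic":  {"vmt": 0.90, "mpg": 1.60},  # eco-driving, platooning win
    "pessimistic": {"vmt": 1.40, "mpg": 1.05},  # induced travel dominates
}
base_gal = baseline_vmt / baseline_mpg
for name, s in scenarios.items():
    # Fuel use scales with VMT and with the inverse of fuel economy.
    gal = baseline_vmt * s["vmt"] / (baseline_mpg * s["mpg"])
    print(f"{name}: {100 * (gal / base_gal - 1):+.0f}% fuel vs baseline")
```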

  12. Suicide and fatal drug overdose in child sexual abuse victims: a historical cohort study.

    PubMed

    Cutajar, Margaret C; Mullen, Paul E; Ogloff, James R P; Thomas, Stuart D; Wells, David L; Spataro, Josie

    2010-02-15

    To determine the rate and risk of suicide and accidental fatal drug overdose (ie, overdose deemed not to have been suicide) in individuals who had been medically ascertained as having been sexually abused during childhood. A historical cohort linkage study of suicide and accidental drug-induced death among victims of child sexual abuse (CSA). Forensic medical records of 2759 victims of CSA who were assessed between 1964 and 1995 were obtained from the Victorian Institute of Forensic Medicine and linked with coronial data representing a follow-up period of up to 44 years. Rates of suicide and accidental fatal drug overdose recorded in coronial databases between 1991 and 2008, and rates of psychiatric disorders and substance use recorded in public mental health databases. Twenty-one cases of fatal self-harm were recorded. Relative risks for suicide and accidental fatal overdose among CSA victims, compared with age-limited national data for the general population, were 18.09 (95% CI, 10.96-29.85; population-attributable risk, 0.37%), and 49.22 (95% CI, 36.11-67.09; population-attributable risk, 0.01%) respectively. Relative risks were higher for female victims. Similar to the general population, CSA victims who died as a result of self-harm were predominantly aged in their 30s at time of death. Most had contact with the public mental health system and half were recorded as being diagnosed with an anxiety disorder. Our data highlight that CSA victims are at increased risk of suicide and accidental fatal drug overdose. CSA is a risk factor that mediates suicide and fatal overdose.

  13. What could you do with 400 years of biological history on african americans? Evaluating the potential scientific benefit of systematic studies of dental and skeletal materials on African Americans from the 17th through 20th centuries.

    PubMed

    Jackson, Fatimah; Jackson, Latifa; Cross, Christopher; Clarke, Cameron

    2016-07-01

    How important is it to be able to reconstruct the lives of a highly diverse, historically recent macroethnic group over the course of 400 years? How many insights into human evolutionary biology and disease susceptibilities could be gained, even with this relatively recent window into the past? In this article, we explore the potential ramifications of a newly constructed dataset of Four Centuries of African American Biological Variation (4Cs). This article provides initial lists of digitized variables formatted as SQL tables for the 17th and 18th century samples and for the 19th and 20th century samples. This database is dynamic and new information is added yearly. The database provides novel opportunities for significant insights into the past biological history of this group and three case study applications are detailed for comparative computational systems biology studies of (1) hypertension, (2) the oral microbiome, and (3) mental health disorders. The 4Cs dataset is ideal for interdisciplinary "next generation" science research and these data represent a unique step toward the accumulation of historically contextualized Big Data on an underrepresented group known to have experienced differential survival over time. Am. J. Hum. Biol. 28:510-513, 2016. © 2016 The Authors. American Journal of Human Biology published by Wiley Periodicals, Inc.

  14. Neural fraud detection in credit card operations.

    PubMed

    Dorronsoro, J R; Ginel, F; Sánchez, C; Cruz, C S

    1997-01-01

    This paper presents an online system for fraud detection in credit card operations based on a neural classifier. Since it is installed in a transactional hub for operation distribution, rather than at a card-issuing institution, it acts solely on the information of the operation to be rated and its immediate previous history, not on historical databases of past cardholder activities. Among the main characteristics of credit card traffic are the great imbalance between proper and fraudulent operations and a large degree of mixing between the two. To ensure proper model construction, a nonlinear version of Fisher's discriminant analysis, which adequately separates a good proportion of fraudulent operations from others closer to normal traffic, has been used. The system is fully operational and currently handles more than 12 million operations per year with very satisfactory results.
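    The classification idea can be sketched as follows. The paper embeds a nonlinear Fisher discriminant in a neural classifier; here, as a stand-in, a random Fourier feature map followed by linear discriminant analysis plays the same role on synthetic, deliberately imbalanced data. Everything below (feature dimensions, class balance, gamma) is an assumption for illustration.

```python
# A minimal sketch of the idea, not the paper's neural architecture:
# a kernel approximation plus LDA stands in for the nonlinear Fisher
# discriminant; data are synthetic and deliberately imbalanced.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.kernel_approximation import RBFSampler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X_normal = rng.normal(0.0, 1.0, size=(9900, 8))   # legitimate operations
X_fraud = rng.normal(1.5, 1.2, size=(100, 8))     # rare fraudulent ones
X = np.vstack([X_normal, X_fraud])
y = np.array([0] * 9900 + [1] * 100)

model = make_pipeline(RBFSampler(gamma=0.5, random_state=0),
                      LinearDiscriminantAnalysis())
model.fit(X, y)
scores = model.predict_proba(X)[:, 1]   # fraud score per operation
print(f"mean score: normal={scores[y == 0].mean():.3f}, "
      f"fraud={scores[y == 1].mean():.3f}")
```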

  15. Historical Collections | Alaska State Library

    Science.gov Websites


  16. Point of Entry

    ERIC Educational Resources Information Center

    Manzo, Kathleen Kennedy

    2007-01-01

    As part of a professional development program organized by the Save Ellis Island Foundation, the exhibits, databases, photo archives, and recorded interviews at the island's museum help put the nation's current immigration debate into a broader historical context. Teachers at these sessions learn from scholars and park personnel about early…

  17. Vehicle and positive control values from the in vivo rodent comet assay and biomonitoring studies using human lymphocytes: historical database and influence of technical aspects.

    PubMed

    Pant, Kamala; Springer, S; Bruce, S; Lawlor, T; Hewitt, N; Aardema, M J

    2014-10-01

    There is increased interest in the in vivo comet assay in rodents as a follow-up approach for determining the biological relevance of chemicals that are genotoxic in in vitro assays, partly because, unlike other assays, DNA damage can be assessed in virtually any tissue. Since background levels of DNA damage can vary with the species, tissue, and cell processing method, a robust historical control database covering multiple tissues is essential. We describe extensive vehicle and positive control data for multiple tissues from rats and mice. In addition, we report historical data from control and genotoxin-treated human blood. Technical issues impacting comet results are described, including the method of cell preparation and freezing. Cell preparation by scraping (stomach and other GI tract organs) resulted in higher % tail DNA than mincing (liver, spleen, kidney, etc.) or direct collection (blood or bone marrow). Treatment with the positive control genotoxicant, ethyl methanesulfonate (EMS) in rats and methyl methanesulfonate in mice, resulted in statistically significant increases in % tail DNA. Background DNA damage was not markedly increased when cell suspensions were stored frozen prior to preparing slides, and the outcome of the assay was unchanged (EMS was always positive). In conclusion, historical data from our laboratory for the in vivo comet assay in multiple tissues from rats and mice, as well as in human blood, show very good reproducibility. These data and the recommendations provided are intended to contribute to the design and proper interpretation of results from comet assays. © 2014 Wiley Periodicals, Inc.
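    One common way such a historical control database is put to use is to bracket a new result with control limits. The sketch below shows the arithmetic with invented per-study means; it is not the laboratory's data or its statistical procedure, and the two-SD bound is just one conventional choice.

```python
# Illustrative arithmetic only: invented per-study control means.
import statistics

historical_pct_tail = [2.1, 3.4, 2.8, 4.0, 3.1, 2.5, 3.7]  # per-study means
mean = statistics.mean(historical_pct_tail)
sd = statistics.stdev(historical_pct_tail)
lower, upper = mean - 2 * sd, mean + 2 * sd   # simple 2-SD control bounds

treated_mean = 9.6   # e.g. a positive-control (EMS) group mean % tail DNA
print(f"historical control range (mean±2SD): {lower:.1f}-{upper:.1f}%")
print("outside historical controls" if treated_mean > upper else "within range")
```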

  18. Architectural Heritage Visualization Using Interactive Technologies

    NASA Astrophysics Data System (ADS)

    Albourae, A. T.; Armenakis, C.; Kyan, M.

    2017-08-01

    With increased exposure to tourists, historical monuments are at an ever-growing risk of disappearing. Building Information Modelling (BIM) offers a process for digitally documenting all the features made or incorporated into a building over its life-span, and thus affords unique opportunities for information preservation. BIMs of historical buildings are called Historical Building Information Models (HBIM), which involve documenting a building in detail throughout its history. Geomatics professionals have the potential to play a major role in this area, as they are often the first professionals involved on construction development sites for many Architectural, Engineering, and Construction (AEC) projects. In this work, we discuss how to establish an architectural database of a heritage site and then digitally reconstruct, preserve, and interact with it through an immersive environment that leverages BIM for exploring historic buildings. The reconstructed heritage site under investigation was constructed in the early 15th century; site selection was based on factors such as architectural value, size, and accessibility. The 3D model is extracted from the original collected and integrated data (image-based, range-based, CAD modelling, and land survey methods), after which the elements of the 3D objects are identified by creating a database in the BIM software platform (Autodesk Revit). The use of modern and widely accessible game-engine technology (Unity3D) is explored, allowing the user to fully embed in and interact with the scene using handheld devices. The details of implementing an integrated pipeline between HBIM, GIS, and augmented and virtual reality (AVR) tools, and the findings of the work, are presented.

  19. Historical literature review on waste classification and categorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croff, A.G.; Richmond, A.A.; Williams, J.P.

    1995-03-01

    The staff of the Waste Management Document Library (WMDL), in cooperation with Allen Croff, were requested to provide information support for a historical search concerning waste categorization/classification. This bibliography has been compiled under the sponsorship of Oak Ridge National Laboratory's Chemical Technology Division to support Croff's ongoing committee work with the NRC/NRCP. After examining the search, Croff saw the value of publishing it. Permission was sought from the database providers to allow limited publication (i.e., 20-50 copies) of the search for internal distribution at the Oak Ridge National Laboratory and for Croff's associated committee. Citations from database providers who did not grant legal permission for their material to be published have been omitted from the literature review. Some of the longer citations have been included in abbreviated form to shorten the published document from approximately 1,400 pages. The bibliography contains 372 citations.

  20. Uses of the Word “Macula” in Written English, 1400-Present

    PubMed Central

    Schwartz, Stephen G.; Leffler, Christopher T.

    2014-01-01

    We compiled uses of the word “macula” in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America’s Historical Newspapers, the Gale Cengage Collections, and others. “Macula” has been used: as a non-medical “spot” or “stain”, literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole (“retina tinta di un color giallo”) in 1782, but did not use the word “macula”. “Macula lutea” was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, which appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both “macula” and “macula lutea” following the introduction of the ophthalmoscope in 1850. “Macula” has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. PMID:24913329

  1. The Polish Genetic Database of Victims of Totalitarianisms.

    PubMed

    Ossowski, A; Kuś, M; Kupiec, T; Bykowska, M; Zielińska, G; Jasiński, M E; March, A L

    2016-01-01

    This paper describes the creation of the Polish Genetic Database of Victims of Totalitarianisms and the first research conducted under this project. On September 28th, 2012, the Pomeranian Medical University in Szczecin and the Institute of National Remembrance-Commission for Prosecution of Crimes against the Polish Nation agreed to support the creation of the Polish Genetic Database of Victims of Totalitarianisms (PBGOT, www.pbgot.pl). The purpose was to employ state-of-the-art methods of forensic genetics to identify the remains of unidentified victims of the Communist and Nazi totalitarian regimes. The database was designed to serve as a central repository of genetic information from the victims' DNA and from their nearest living relatives, with the goal of making positive identifications. Along the way, PBGOT encountered several challenges. First, extracting usable DNA samples from the remains of individuals who had been buried for over half a century required forensic geneticists to create special procedures and protocols. Second, obtaining genetic reference material and historical information from the victims' closest relatives was both problematic and urgent: the victims' nearest living relatives are part of a dying generation, and the opportunity to obtain the best genetic and historical information about the victims would soon die with them. For this undertaking, PBGOT assembled a team of historians, archaeologists, forensic anthropologists, and forensic geneticists from several European research institutions. The field work was divided into five broad categories: (1) exhuming victim remains and storing their biological material for later genetic testing; (2) researching archives and historical data for a more complete profile of those killed or missing and of the families that lost them; (3) locating the victims' nearest relatives to obtain genetic reference samples (swabs); (4) entering the genetic data from both victims and family members into a common database; and (5) making a conclusive, final identification of the victim. PBGOT's first project was to identify victims of the Communist regime buried in hidden mass graves in the Powązki Military Cemetery in Warsaw. Throughout 2012 and 2013, PBGOT carried out archaeological exhumations in the cemetery that resulted in the recovery of the skeletal remains of 194 victims from several mass graves. Of the 194 sets of remains, more than 50 victims have been successfully matched and identified through genetic evidence. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. National Levee Database: monitoring, vulnerability assessment and management in Italy

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Camici, Stefania; Maccioni, Pamela; Moramarco, Tommaso

    2015-04-01

    A properly designed and constructed levee system can often be an effective barrier against inundation, protecting urbanized and industrial areas from floodwaters. However, the delineation of flood-prone areas and the related hydraulic hazard mapping accounting for uncertainty (Apel et al., 2008) are usually developed with scarce consideration of the possible occurrence of levee failures along river channels (Mazzoleni et al., 2014). Indeed, flooding is frequently the result of levee failures, which can be triggered by several factors: (1) overtopping; (2) scouring of the foundation; (3) seepage/piping through the levee body or foundation; and (4) sliding of the foundation. Among these failure mechanisms, which are influenced by the levee's geometrical configuration, hydraulic conditions (e.g. river level and seepage), and material properties (e.g. permeability, cohesion, porosity, compaction), piping caused by seepage (ICOLD, http://www.icold-cigb.org) is considered one of the most dominant (Colleselli, 1994; H. R. Wallingford, 2003). The difficulty of estimating the hydraulic parameters needed to properly describe the seepage line within the levee body and foundation means that the routing of a critical flood wave is typically studied by assuming the levee system remains undamaged during the flood event. In this context, implementing and operating a National Levee Database (NLD), effectively structured and continuously updated, becomes fundamental to having a searchable inventory of levee information available as a key resource supporting decisions and actions affecting levee safety. The ItaliaN LEvee Database (INLED) has recently been developed by the Research Institute for Geo-Hydrological Protection (IRPI) for the Civil Protection Department of the Presidency of the Council of Ministers. INLED focuses on collecting comprehensive information about Italian levees and historical breach failures, to be exploited in an operational procedure for the seepage vulnerability assessment of river reaches where the levee system is an important structural measure against flooding. INLED is a dynamic geospatial database, with ongoing efforts to add levee data from the authorities charged with hydraulic risk mitigation. In particular, the database aims to provide the available information on: i) location and condition of levees; ii) morphological and geometrical properties; iii) photographic documentation; iv) historical levee failures; v) assessment of vulnerability to overtopping and seepage, carried out through a procedure based on simple vulnerability indexes (Camici et al., 2014); vi) management, control, and maintenance; vii) flood hazard maps developed assuming the levee system is undamaged/damaged during the flood event. Currently, INLED contains data on levees that are mostly located in the Tiber basin, central Italy. References: Apel H., Merz B. & Thieken A.H. Quantification of uncertainties in flood risk assessments. Int J River Basin Manag 2008, 6(2), 149-162. Camici S., Barbetta S., Moramarco T. Levee body vulnerability to seepage: the case study of the levee failure along the Foenna stream on 1st January 2006 (central Italy). Journal of Flood Risk Management, in press. Colleselli F. Geotechnical problems related to river and channel embankments. Rotterdam, the Netherlands: Springer, 1994. H. R. Wallingford Consultants (HRWC). Risk assessment for flood and coastal defence for strategic planning: high level methodology technical report, London, 2003. Mazzoleni M., Bacchi B., Barontini S., Di Baldassarre G., Pilotti M. & Ranzi R. Flooding hazard mapping in floodplain areas affected by piping breaches in the Po River, Italy. J Hydrol Eng 2014, 19(4), 717-731.

  3. X-1 to X-Wings: Developing a Parametric Cost Model

    NASA Technical Reports Server (NTRS)

    Sterk, Steve; McAtee, Aaron

    2015-01-01

    In today's cost-constrained environment, NASA needs an X-Plane database and parametric cost model that can quickly provide rough order-of-magnitude predictions of cost, from initial concept to first flight, for potential X-Plane aircraft. This paper describes the steps taken in developing such a model and reports the results. The challenges encountered in collecting historical data and recommendations for future database management are discussed, followed by a step-by-step discussion of the development of Cost Estimating Relationships (CERs).
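    CERs of this kind are often fit as power laws in a driver variable such as weight. The sketch below shows that standard log-log fit with invented weights and costs; it is not the NASA model or its data.

```python
# A log-log least-squares fit of a power-law CER, cost = a * weight^b.
# Weights and costs are invented placeholders, not the X-Plane dataset.
import numpy as np

weight = np.array([3000.0, 9000.0, 15000.0, 32000.0])  # empty weight, lb
cost = np.array([45.0, 110.0, 160.0, 300.0])           # concept-to-flight, $M

# Fit log(cost) = b*log(weight) + log(a); polyfit returns slope first.
b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.3f} * weight^{b:.2f}")
print(f"prediction for a 20,000 lb concept: ${a * 20000**b:.0f}M")
```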

  4. Making historic loss data comparable over time and place

    NASA Astrophysics Data System (ADS)

    Eichner, Jan; Steuer, Markus; Löw, Petra

    2017-04-01

    When utilizing historic loss data for present-day risk assessment, it is necessary to make the data comparable over time and place. To achieve this, the assessment of costs from natural hazard events requires consistent and homogeneous methodologies for loss estimation, as well as a robust treatment of loss data to estimate and/or reduce distorting effects due to a temporal bias in the reporting of small-scale loss events. Here we introduce Munich Re's NatCatSERVICE loss database and present a novel methodology for peril-specific normalization of historical losses (to account for the socio-economic growth of assets over time), together with a severity classification metric (called CatClass) that allows impact severity to be compared globally across countries at different stages of economic development.
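    The normalization idea can be stated in one line: scale the historical loss by the growth of exposed assets since the event. The sketch below uses nominal GDP as a generic exposure proxy; the function name and all values are illustrative assumptions, not Munich Re's peril-specific method.

```python
# Illustrative only: a generic exposure-growth normalization with
# invented values, not Munich Re's peril-specific methodology.
def normalize_loss(loss_then, exposure_then, exposure_now):
    """Scale a historical loss by the growth of exposed assets."""
    return loss_then * (exposure_now / exposure_then)

# A $200M loss from decades ago, with nominal GDP as the exposure proxy:
today = normalize_loss(200e6, exposure_then=3e12, exposure_now=27e12)
print(f"~${today / 1e9:.1f}B in present-day terms")
```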

  5. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Inmachuk, Kugruk, Kiwalik, and Koyuk River drainages, Granite Mountain, and the northern Darby Mountains, Bendeleben, Candle, Kotzebue, and Solomon quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 653 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from an area covering portions of the Inmachuk, Kugruk, Kiwalik, and Koyuk river drainages, Granite Mountain, and the northern Darby Mountains, located in the Bendeleben, Candle, Kotzebue, and Solomon quadrangles of eastern Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  6. RITA--Registry of Industrial Toxicology Animal data: the application of historical control data for Leydig cell tumors in rats.

    PubMed

    Nolte, Thomas; Rittinghausen, Susanne; Kellner, Rupert; Karbe, Eberhard; Kittel, Birgit; Rinke, Matthias; Deschl, Ulrich

    2011-11-01

    Historical data for Leydig cell tumors from untreated or vehicle treated rats from carcinogenicity studies collected in the RITA database are presented. Examples are given for analyses of these data for dependency on variables considered to be of possible influence on the spontaneous incidence of Leydig cell tumors. In the 7453 male rats available for analysis, only one case of a Leydig cell carcinoma was identified. The incidence of Leydig cell adenomas differed markedly between strains. High incidences of close to 100% have been found in F344 rats, while the mean incidence was 4.2% in Sprague-Dawley rats and 13.7% in Wistar rats. Incidences in Wistar rats were highly variable, primarily caused by different sources of animals. Mean incidences per breeder varied from 2.8 to 39.9%. Analyses for the dependency on further parameters have been performed in Wistar rats. In breeders G and I, the Leydig cell tumor incidence decreased over the observation period and with increasing mean terminal body weight. The incidence of Leydig cell tumors increased with mean age at necropsy and was higher in studies with dietary admixture compared to gavage studies. These parameters had no effect on Leydig cell tumor incidence in breeders A and B. Animals from almost all breeders had a considerably higher mean age at necropsy when bearing a Leydig cell adenoma than animals without a Leydig cell adenoma. Studies with longitudinal trimming of the testes had a higher incidence than studies with transverse trimming. The observed dependencies and breeder differences are discussed and explanations are given. Consequences for the use of historical control data are outlined. With the retrospective analyses presented here we were able to confirm the published features of Leydig cell adenomas and carcinomas. This indicates that the RITA database is a valuable tool for analyses of tumors for their biological features. Furthermore, it demonstrates that the RITA database is highly beneficial for the definition of reliable historical control data for carcinogenicity studies on a scientifically solid basis. Copyright © 2010 Elsevier GmbH. All rights reserved.

  7. Modifications of natural hazard impacts and hydrological extremes in previous centuries (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Petrucci, Olga; Pasqua, Aurora Angela; Polemio, Maurizio

    2013-04-01

    The present work is based on a large historical database of floods and landslides that have occurred since the seventeenth century in Calabria, a region of southern Italy, comprising more than 11,000 records. The database has been built by collecting data from different information sources, such as newspapers, archives of regional and national agencies, scientific and technical reports, on-site survey reports, and interviews with both people involved and local administrators. It is continuously updated with the results of local historical research and with data from a daily survey of regional newspapers. Similarly, a comprehensive archive of rainfall data for the same period and region has been compiled. Based on these archives, a comparative analysis of the floods that occurred in a regional sector over a long period and the climatic data characterizing the same period has been carried out, focusing on the climate trend and investigating the potential effect of climate variation on the trend in damaging floods. The aim was to assess whether the frequency of floods is changing and, if so, whether these changes can be related to rainfall and/or anthropogenic modifications. To assess anthropogenic modifications, the evolution of the urbanized sectors of the study area over the last centuries was reconstructed by comparing, in a GIS environment, historical maps of different epochs. The annual variability of rainfall was characterised using an annual index, and short-duration, high-intensity rainfalls were characterised using time series of annual maxima of 1, 3, 6, 12, and 24 hours and daily rainfall. The analysis indicates that, despite a rainfall trend favourable to a reduction in flood occurrence, flood damage has not decreased. This appears to be mainly the effect of mismanaged land-use modifications. Moreover, the long historical series analyzed allowed us to identify both the most frequently damaged elements and the most frequently damaged geographical sectors of the study area, with a further in-depth look at cases involving people in urbanized sectors.

  8. Study of Mobile GIS Application on the Field of GPR in the Road Disease Detection

    NASA Astrophysics Data System (ADS)

    Liao, Q.; Yang, F.

    2013-12-01

    Based on the reflection of pulsed electromagnetic waves, ground penetrating radar (GPR) can measure the depth of pavement layers and reveal hidden dangers underground. GPR is now widely used in road engineering, with constantly improving ability to detect and diagnose road diseases. The accumulated road disease data of a region, a city, or an even wider area forms a very informative database, so a more convenient and intuitive way to query the data is needed. As the mobile Internet develops, mobile terminal devices play an increasingly important role in information platforms. Mobile GIS uses the smartphone as its terminal, the mobile Internet as its communications layer, and GPS or cell-tower positioning for location. In this article, a location-based service (LBS) application for road disease information, built on the Android platform in a client/server (C/S) pattern and integrating the Baidu Map API with database technology, is discussed. Testing showed that it can intuitively and easily display and query real-time and historical road disease data, as well as data classifications, on a phone. The positioning capability and high portability of smartphones make on-site investigation of road diseases easier. The system still needs further improvement; in particular, as mobile phone performance improves, analysis functions for the disease data could be added, forming a more broadly applicable service system.
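    The core LBS query the article describes, fetching road-disease records near the phone's position, reduces to a distance filter. The sketch below shows it in plain Python with a haversine distance; the record schema, coordinates, and 2 km radius are assumptions for illustration, not the system's actual design (which runs on Android against Baidu Map services).

```python
# A stand-in for the LBS proximity query; schema and values are assumed.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS-84 points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

records = [  # (id, lat, lon, disease type) - illustrative rows
    (1, 31.230, 121.474, "subsidence"),
    (2, 31.305, 121.500, "void"),
]
here = (31.235, 121.480)  # phone's reported position
nearby = [r for r in records if haversine_km(*here, r[1], r[2]) < 2.0]
print(nearby)
```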

  9. Evaluation of the Historic Triangle Wayfinding Sign System.

    DOT National Transportation Integrated Search

    2009-01-01

    The "Historic Triangle" in Virginia is named for the historic areas comprising and surrounding Williamsburg, Jamestown, and Yorktown, Virginia. A Historic Triangle Wayfinding Sign System was designed to lead travelers from I-64 to historic sites in W...

  10. Changing pattern of natural hazards due to extreme hydro-meteorological conditions (Apulia, southern Italy)

    NASA Astrophysics Data System (ADS)

    Polemio, Maurizio; Lonigro, Teresa

    2013-04-01

    Recent international research has underlined the evidence of climate change throughout the world. Among the consequences of climate change is an increase in the frequency and magnitude of natural disasters, such as droughts, windstorms, heat waves, landslides, floods, and secondary floods (i.e., rapid accumulation or ponding of surface water with very low flow velocity). Damaging Hydrogeological Events (DHEs) can be defined as the occurrence of one or more of the aforementioned phenomena simultaneously, causing damage. They represent a serious problem, especially in DHE-prone areas with growing urbanisation, where the increasing frequency of extreme hydrological events could be related to climate variations and/or urban development. The historical analysis of DHEs can support decision-making and land-use planning, ultimately reducing natural risks. The paper proposes a methodology, based on both historical and time-series approaches, for describing the influence of climatic variability on the number of phenomena observed. The historical approach is aimed at collecting historical data on these phenomena. Historical flood and landslide data are important for understanding the evolution of a study area and for estimating risk scenarios as a basis for civil protection purposes, and they allow the period of investigation to be extended in order to assess the occurrence trend of DHEs. The time-series approach includes the collection and statistical analysis of climatic and rainfall data (monthly rainfall, wet days, rainfall intensity, and temperature data, together with annual maxima of short-duration rainfall from 1 hour to 5 days and daily rainfall), which are also used as a proxy for floods and landslides. These data are useful for characterising climate variations and trends and for roughly assessing the effects of those trends on river discharge and on the triggering of landslides. The time-series approach is completed by tools to analyse all data types simultaneously. The methodology was tested on a selected Italian region (Apulia, southern Italy). The data were collected in two databases: a damaging hydrogeological event database (1186 landslides and floods since 1918) and a climate database (from 1877; short-duration rainfall from 1921). A statistically significant decreasing trend of rainfall intensity and increasing trends of temperature, landslides, and DHEs were observed, along with a generalised decreasing trend of short-duration rainfall. If there is no evident relationship between climate variability and the variability of DHE occurrences, the role of anthropogenic modifications (increasing use or misuse of flood- and landslide-prone areas) could be hypothesized to explain the increasing occurrence of floods and landslides. This study identifies the advantages of a simplifying approach that reduces the intrinsic complexities of the spatial-temporal analysis of climate variability, permitting simultaneous analysis of changes in flood and landslide occurrence.
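    For trend statements like those above, a nonparametric test is the usual tool. The paper does not specify its test; the sketch below shows a Mann-Kendall check, one standard choice, on an invented annual series.

```python
# One standard nonparametric trend check (Mann-Kendall), shown on an
# invented annual series; not necessarily the paper's test.
import itertools
import math

series = [102, 98, 95, 99, 91, 88, 90, 84, 86, 80]  # e.g. rainfall intensity

# S sums the signs of all later-minus-earlier pairs.
s = sum((b > a) - (b < a) for a, b in itertools.combinations(series, 2))
n = len(series)
var_s = n * (n - 1) * (2 * n + 5) / 18           # no-ties variance of S
correction = 1 if s > 0 else -1 if s < 0 else 0  # continuity correction
z = (s - correction) / math.sqrt(var_s)
print(f"Mann-Kendall S={s}, z={z:.2f} (|z|>1.96 -> trend at the 5% level)")
```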

  11. The National Landslide Database of Great Britain: Acquisition, communication and the role of social media

    NASA Astrophysics Data System (ADS)

    Pennington, Catherine; Freeborough, Katy; Dashwood, Claire; Dijkstra, Tom; Lawrie, Kenneth

    2015-11-01

    The British Geological Survey (BGS) is the national geological agency for Great Britain that provides geoscientific information to government, other institutions and the public. The National Landslide Database has been developed by the BGS and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 17,000 records of landslide events to date, each documented as fully as possible for inland, coastal and artificial slopes. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and using citizen science through social media and other online resources. This information is invaluable for directing the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domains map currently under development, as well as regional mapping campaigns, rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures, an understanding of causative factors, their spatial distribution and likely impacts, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) and Hazard Impact Model contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP partnership and data collected for the National Landslide Database are used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.

  12. A New Data Acquisition Portal for the Sacramento River Settlement Contractors

    NASA Astrophysics Data System (ADS)

    Narlesky, P. E., C. A.; Williams, P. E., A. M.

    2017-12-01

    In 1964, the United States Bureau of Reclamation (Reclamation) executed settlement contracts with the Sacramento River Settlement Contractors (SRSC), entities which hold water rights along the Sacramento River with area of origin protection or that are senior to Reclamation's water rights for Shasta Reservoir. Shasta is the cornerstone of the federal Central Valley Project (CVP), one of the nation's largest multi-purpose water conservation programs. In order to optimize CVP operations for multiple beneficial uses including water supply, fisheries, water quality, and waterfowl habitat, the SRSC voluntarily agreed to adaptively manage diversions throughout the year in close coordination with Reclamation. MBK Engineers assists the SRSC throughout this process by collecting, organizing, compiling, and distributing diversion data to Reclamation and others involved in operational decisions related to Shasta Reservoir and the CVP. To improve and expand participation in diversions reporting, we have developed the SRSC Web Portal, which launches a data-entry dashboard for members of the SRSC to facilitate recording and transmittal of both predicted and observed monthly and daily flow diversion data. This cloud-hosted system leverages a combination of Javascript interactive visualization libraries with a database-backed Python web framework to present streamlined data-entry forms and valuable SRSC program summary illustrations. SRSC program totals, which can now be aggregated through queries to the web-app's database backend, are used by Reclamation, SRSC, fish agencies, and others to inform operational decisions. By submitting diversion schedules and tracking actual diversions through the portal, contractors will also be directly contributing to the development of a richer and more consistently-formatted historical record for demand hydrology in the Sacramento River Watershed; this may be useful in future water supply studies. Adoption of this technology will foster an increased appreciation for the historical record of individual and combined Sacramento River diversions relative to the overall system.
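    As a rough illustration of the database-backed Python web pattern described above, the sketch below accepts a diversion report over HTTP and stores it. The route, field names, and SQLite backend are hypothetical choices for the example, not MBK's actual portal code.

```python
# A hypothetical route and schema, not the SRSC Web Portal's code:
# accept a contractor's diversion report as JSON and persist it.
from flask import Flask, jsonify, request
import sqlite3

app = Flask(__name__)

@app.route("/diversions", methods=["POST"])
def record_diversion():
    row = request.get_json()  # {"contractor": ..., "date": ..., "cfs": ...}
    with sqlite3.connect("srsc.db") as db:
        db.execute("CREATE TABLE IF NOT EXISTS diversions "
                   "(contractor TEXT, date TEXT, cfs REAL)")
        db.execute("INSERT INTO diversions VALUES (?, ?, ?)",
                   (row["contractor"], row["date"], row["cfs"]))
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run()  # POST JSON to http://localhost:5000/diversions
```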

  13. Spatiotemporal conceptual platform for querying archaeological information systems

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Sartzetaki, Mary; Sarris, Apostolos

    2015-04-01

    Spatial and temporal distribution of archaeological sites has been shown to associate with several attributes, including marine, water, mineral and food resources, climate conditions, geomorphological features, etc. In this study, archaeological settlement attributes are evaluated under various associations in order to provide a specialized query platform in a geographic information system (GIS). Towards this end, a spatial database is designed to include a series of archaeological findings for a secluded geographic area of Crete in Greece. The key categories of the geodatabase include the archaeological type (palace, burial site, village, etc.), temporal information on the habitation/usage period (pre-Minoan, Minoan, Byzantine, etc.), and the extracted geographical attributes of the sites (distance to sea, altitude, resources, etc.). Most of the related spatial attributes are extracted with readily available GIS tools. Additionally, a series of conceptual data attributes are estimated, including: the temporal relation of an era to a later one in terms of changes in archaeological type; topological relations among the various types and attributes; and spatial proximity relations between types. These complex spatiotemporal relational measures reveal new attributes towards a better understanding of site selection by prehistoric and/or historic cultures, yet their potential combinations can become numerous. Therefore, after quantification, the attributes are classified according to their importance for archaeological site location modeling. Under this classification scheme, the user may select a geographic area of interest and extract only the important attributes for a specific archaeological type. These extracted attributes may then be queried against the entire spatial database to provide a location map of possible new archaeological sites. This novel type of querying is robust, since the user does not have to type a standard SQL query but can graphically select an area of interest instead. In addition, according to the application at hand, novel spatiotemporal attributes and relations can be supported, towards the understanding of historical settlement patterns.

  14. RESPONSE OF GULF COAST ESTUARIES TO NUTRIENT LOAD: DISSOLVED OXYGEN DEPLETION

    EPA Science Inventory

    GED has developed a process-based approach to hypoxia research on Pensacola Bay as a model Gulf of Mexico estuary. We selected Pensacola Bay because, like many Gulf coast estuaries, it is shallow, microtidal, and experiences seasonal hypoxia. We also have an historical database ...

  15. The Role of Research-Oriented Universities in School Change

    ERIC Educational Resources Information Center

    Nur, Mary Morison

    1986-01-01

    The interdisciplinary school-university partnership based at Stanford University is establishing a database for developing educational policy. The following features are discussed: (1) historical perspective; (2) data collection/feedback process and its contribution to the linking of researcher and practitioner on a national basis; (3) lessons…

  16. Access to destinations : arterial data acquisition and network-wide travel time estimation (phase II).

    DOT National Transportation Integrated Search

    2010-03-01

    The objectives of this project were to (a) produce historic estimates of travel times on Twin-Cities arterials for 1995 and 2005, and (b) develop an initial architecture and database that could, in the future, produce timely estimates of arterial...

  17. Development of a publicly available, comprehensive database of fiber and health outcomes: rationale and methods

    USDA-ARS?s Scientific Manuscript database

    Background: Dietary fiber is a broad category of compounds historically defined as partially or completely indigestible plant-based carbohydrates and lignin with, more recently, the additional criteria that fibers incorporated into foods as additives should demonstrate functional human health outcom...

  18. Detection and measurement of total ozone from stellar spectra: Paper 2. Historic data from 1935-1942

    NASA Astrophysics Data System (ADS)

    Griffin, R. E. M.

    2006-06-01

    Atmospheric ozone columns are derived from historic stellar spectra observed between 1935 and 1942 at Mount Wilson Observatory, California. Comparisons with contemporary measurements in the Arosa database show a generally close correspondence, while a similar comparison with more sparse data from Table Mountain reveals a difference of ~15-20%, as has also been found by other researchers of the latter data. The results of the analysis indicate that astronomy's archives hold considerable potential for investigating the natural levels of ozone and its variability during the decades prior to anthropogenic interference.

  19. The Growth Dynamics of Words: How Historical Context Shapes the Competitive Linguistic Environment

    NASA Astrophysics Data System (ADS)

    Tenenbaum, Joel; Petersen, Alexander; Havlin, Shlomo; Stanley, H. Eugene

    2012-02-01

    Using the massive Google n-gram database of over 10^11 word uses in English, Hebrew, and Spanish, we explore the connection between the growth rates of relative word use and the observed growth rates of disparate competing actors in a common environment such as businesses, scientific journals, and universities, supporting the concept that a language's lexicon is a generic arena for competition, evolving according to selection laws. We find aggregate-level anomalies in the collective statistics corresponding to the time of key historical events such as World War II and the Balfour Declaration.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, A.; Seuntjens, J.; Parker, W.

    We describe development of automated, web-based, electronic health record (EHR) auditing software for use within our paperless radiation oncology clinic. By facilitating access to multiple databases within the clinic, each patient's EHR is audited prior to treatment, regularly during treatment, and post treatment. Anomalies such as missing documentation, non-compliant workflow and treatment parameters that differ significantly from the norm may be monitored, flagged and brought to the attention of clinicians. By determining historical trends using existing patient data and by comparing new patient data with the historical, we expect our software to provide a measurable improvement in the quality of radiotherapy at our centre.
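
    As a sketch of the flagging logic described (a generic z-score screen, assuming per-parameter historical distributions; not the authors' actual software):

        # Hedged sketch: flag treatment parameters that deviate from historical
        # norms; parameter names and the z-score threshold are assumptions.
        import statistics

        def flag_anomalies(new_params, historical, z_cut=3.0):
            """historical maps parameter name -> list of past values."""
            flags = []
            for name, value in new_params.items():
                past = historical.get(name, [])
                if len(past) < 2:
                    flags.append((name, "insufficient history"))
                    continue
                mu, sd = statistics.mean(past), statistics.stdev(past)
                if sd and abs(value - mu) / sd > z_cut:
                    flags.append((name, f"{value} vs norm {mu:.2f} +/- {sd:.2f}"))
            return flags

        # Example: a prescription dose far outside the historical distribution.
        print(flag_anomalies({"dose_gy": 95.0}, {"dose_gy": [60, 66, 70, 62, 64]}))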

  1. Toxicity tests aiming to protect Brazilian aquatic systems: current status and implications for management.

    PubMed

    Martins, Samantha Eslava; Bianchini, Adalto

    2011-07-01

    The current status of toxicological tests performed with Brazilian native species was evaluated through a survey of the scientific data available in the literature. The information gathered was processed and an electronic toxicology database (http://www.inct-ta.furg.br/bd_toxicologico.php) was generated. This database provides valuable information for researchers to select aquatic species that are sensitive or tolerant to a large variety of aquatic pollutants. Furthermore, the toxicology database allows researchers to select species representative of an ecosystem of interest. Analysis of the toxicology database showed that ecotoxicological assays have improved significantly in Brazil over the last decade, in spite of the still relatively low number of tests performed and the restricted number of native species tested. This is because most of the research is carried out in a few laboratories concentrated in certain regions of Brazil, especially the Southern and Southeast regions. Considering the extremely rich biodiversity and the large variety of aquatic ecosystems in Brazil, this finding points to the urgent need for ecotoxicological studies on other groups of aquatic animals, such as insects, foraminifera, cnidarians, worms, and amphibians, among others. This would help to derive more realistic water quality criteria (WQC) values, which would better protect the different aquatic ecosystems in Brazil. Finally, the toxicology database generated presents solid, science-based information, which can encourage and drive the Environmental Regulatory Agencies in Brazil to derive WQC based on native species. In this context, the present paper discusses the historical evolution of ecotoxicological studies in Brazil and how they have contributed to the improvement of Brazilian federal and regional environmental regulations.

  2. Pacific walrus coastal haulout database, 1852-2016— Background report

    USGS Publications Warehouse

    Fischbach, Anthony S.; Kochnev, Anatoly A.; Garlich-Miller, Joel L.; Jay, Chadwick V.

    2016-01-01

    Walruses are large benthic predators that rest out of water between foraging bouts. Coastal “haulouts” (places where walruses rest) are formed by adult males in summer and sometimes by females and young when sea ice is absent, and are often used repeatedly across seasons and years. Understanding the geography and historical use of haulouts provides a context for conservation efforts. We summarize information on Pacific walrus haulouts from available reports (n = 151), interviews with coastal residents and aviators, and personal observations of the authors. We provide this in the form of a georeferenced database that can be queried and displayed with standard geographic information system and database management software. The database contains 150 records of Pacific walrus haulouts, with a summary of basic characteristics on maximum haulout aggregation size, age-sex composition, season of use, and decade of most recent use. Citations to reports are provided in the appendix and as a bibliographic database. Haulouts were distributed across the coasts of the Pacific walrus range; however, the largest (maximum >10,000 walruses) of the haulouts reported in the recent 4 decades (n = 19) were concentrated on the Russian shores in regions near the Bering Strait and northward into the western Chukchi Sea (n = 17). Haulouts of adult female and young walruses primarily occurred in the Bering Strait region and areas northward, with others occurring in the central Bering Sea, Gulf of Anadyr, and Saint Lawrence Island regions. The Gulf of Anadyr was the only region to contain female and young walrus haulouts, which formed after the northward spring migration and prior to autumn ice formation.

  3. Durand Neighbourhood Heritage Inventory: Toward a Digital Citywide Survey Approach to Heritage Planning in Hamilton

    NASA Astrophysics Data System (ADS)

    Angel, V.; Garvey, A.; Sydor, M.

    2017-08-01

    In the face of changing economies and patterns of development, the definition of heritage is diversifying, and the role of inventories in local heritage planning is coming to the fore. The Durand neighbourhood is a layered and complex area located in inner-city Hamilton, Ontario, Canada, and the second subject area in a set of pilot inventory studies to develop a new city-wide inventory strategy for the City of Hamilton. This paper presents an innovative digital workflow developed to undertake the Durand Built Heritage Inventory project. An online database was developed to be at the centre of all processes, including digital documentation, record management, analysis and variable outputs. Digital tools were employed for survey work in the field and analytical work in the office, resulting in a GIS-based dataset that can be integrated into Hamilton's larger municipal planning system. Together with digital mapping and digitized historical resources, the Durand database has been leveraged to produce both digital and static outputs to shape recommendations for the protection of Hamilton's heritage resources.

  4. Historical records of the geomagnetic field

    NASA Astrophysics Data System (ADS)

    Arneitz, Patrick; Heilig, Balázs; Vadasz, Gergely; Valach, Fridrich; Dolinský, Peter; Hejda, Pavel; Fabian, Karl; Hammerl, Christa; Leonhardt, Roman

    2014-05-01

    Records of historical direct measurements of the geomagnetic field are invaluable sources to reconstruct temporal variations of the Earth's magnetic field. They provide information about the field evolution back to the late Middle Ages. We have investigated such records with a focus on Austria and some neighbouring countries. A variety of new sources and source types are examined. These include 19th century land survey and observatory records of the Imperial and Royal "Centralanstalt f. Meteorologie und Erdmagnetismus", which are not included in the existing compilations. Daily measurements at the Imperial and Royal Observatory in Prague have been digitized. The Imperial and Royal Navy carried out observations in the Adriatic Sea during several surveys. Declination values have been collected from famous mining areas in the former Austro-Hungarian Empire. In this connection, a time series for Banska Stiavnica has been compiled. In the meteorological yearbooks of the monastery Kremsmünster regular declination measurements for the first half of the 19th century were registered. Marsigli's observations during military mapping works in 1696 are also included in our collection. Moreover, compass roses on historical maps or declination values marked on compasses, sundials or globes also provide information about ancient field declination. An evaluation of church orientations in Lower Austria and Northern Germany did not support the hypothesis that church naves had been aligned along the East-West direction by means of magnetic compasses. Therefore, this potential source of information must be excluded from our collection. The gathered records are integrated into a database together with corresponding metadata, such as the measurement instruments and methods used. This information allows an assessment of quality and reliability of the historical observations. The combination of compilations of historical measurements with high quality archeo- and paleomagnetic data in a single database enables a reliable joint evaluation of all types of magnetic field records from different origins. This collection forms the basis for a combined inverse modelling of the geomagnetic field evolution.

  5. Aggregating Hydrometeorological Data from International Monitoring Networks Across Earth's Largest Lake System to Quantify Uncertainty in Historical Water Budget Records, Improve Regional Water Budget Projections, and Differentiate Drivers Behind a Recent Record-Setting Surge in Water Levels

    NASA Astrophysics Data System (ADS)

    Gronewold, A.; Bruxer, J.; Smith, J.; Hunter, T.; Fortin, V.; Clites, A. H.; Durnford, D.; Qian, S.; Seglenieks, F.

    2015-12-01

    Resolving and projecting the water budget of the North American Great Lakes basin (Earth's largest lake system) requires aggregation of data from a complex array of in situ monitoring and remote sensing products that cross an international border (leading to potential sources of bias and other inconsistencies), and are relatively sparse over the surfaces of the lakes themselves. Data scarcity over the surfaces of the lakes is a particularly significant problem because, unlike Earth's other large freshwater basins, the Great Lakes basin water budget is (on annual scales) composed of relatively equal contributions from runoff, over-lake precipitation, and over-lake evaporation. Consequently, understanding drivers behind changes in regional water storage and water levels requires a data management framework that can reconcile uncertainties associated with data scarcity and bias, and propagate those uncertainties into regional water budget projections and historical records. Here, we assess the development of a historical hydrometeorological database for the entire Great Lakes basin with records dating back to the late 1800s, and describe improvements that are specifically intended to differentiate hydrological, climatological, and anthropogenic drivers behind recent extreme changes in Great Lakes water levels. Our assessment includes a detailed analysis of the extent to which extreme cold winters in central North America in 2013-2014 (caused by the anomalous meridional upper air flow, commonly referred to in the public media as the "polar vortex" phenomenon) altered the thermal and hydrologic regimes of the Great Lakes and led to a record-setting surge in water levels between January 2014 and December 2015.
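
    The core bookkeeping here is that the annual net supply is roughly runoff plus over-lake precipitation minus over-lake evaporation, each component carrying its own uncertainty. A toy Monte Carlo sketch of propagating that uncertainty into the budget follows; the numbers are placeholders, not Great Lakes estimates.

        # Hedged sketch: propagate component uncertainty into a net water
        # budget by sampling; means and spreads below are illustrative only.
        import random

        def net_supply(n=10000):
            samples = []
            for _ in range(n):
                runoff = random.gauss(900, 90)   # mm/yr, assumed
                precip = random.gauss(850, 80)   # mm/yr, assumed
                evap = random.gauss(800, 100)    # mm/yr, assumed
                samples.append(runoff + precip - evap)
            mean = sum(samples) / n
            sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
            return mean, sd

        print("net supply: %.0f +/- %.0f mm/yr" % net_supply())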

  6. Reliability of Beam Loss Monitor Systems for the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2005-06-01

    The increase of beam energy and beam intensity, together with the use of superconducting magnets, opens new failure scenarios and brings new criticalities for the whole accelerator protection system. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach, with a downtime cost evaluation used as input. The most critical systems contributing to the final SIL value are the dump system, the interlock system, the beam loss monitors system, and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses at 7 TeV and is assisted by the Fast Beam Current Decay Monitors at 450 GeV. At medium and longer loss timescales it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, temperature and radiation damage experimental data, and standard databases. All the data have been processed by reliability software (Isograph). The analysis spans from component data to the system configuration.

  7. HISTORY OF TROPOSPHERIC OZONE FOR THE SAN BERNARDINO MOUNTAINS OF SOUTHERN CALIFORNIA, 1963-1999

    EPA Science Inventory

    A historical database of hourly O3 concentrations for Crestline, California in 1963-1999 has been developed based on all available representative oxidant/ozone monitoring data taken since 1963. All data were obtained from the California Air Resources Board and the U.S. Departmen...

  8. "Stressed and Sexy": Lexical Borrowing in Cape Town Xhosa

    ERIC Educational Resources Information Center

    Dowling, Tessa

    2011-01-01

    Codeswitching by African language speakers in South Africa (whether speaking English or the first language) has been extensively commented on and researched. Many studies analyse the historical, political and sociolinguistic factors behind this growing phenomenon, but there appears to be little urgency about establishing a database of new…

  9. YAQUINA BAY NUTRIENT CRITERIA CASE STUDY: APPROACHES TO ESTUARINE NUTRIENT CRITERIA IN THE PACIFIC NORTHWEST

    EPA Science Inventory

    The presentation presents an introduction to the Yaquina Bay Nutrient Case Study which provides approaches for development of estuarine nutrient criteria in the Pacific Northwest. As part of this effort, a database of historic and recent data has been assembled consisting of phy...

  10. Profiling Chemicals Based on Chronic Toxicity Results from the U.S. EPA ToxRef Database

    EPA Science Inventory

    Thirty years of pesticide registration toxicity data have been historically stored as hardcopy and scanned documents by the U.S. Environmental Protection Agency (EPA). A significant portion of these data have now been processed into standardized and structured toxicity data with...

  11. YAQUINA ESTUARY NUTRIENT CRITERIA CASE STUDY: GUIDANCE FOR DEVELOPING NUTRIENT CRITERIA IN THE PACIFIC NORTHWEST

    EPA Science Inventory

    The presentation provides an introduction to the Yaquina Estuary Nutrient Case Study which includes considerations for development of estuarine nutrient criteria in the Pacific Northwest. As part of this effort, a database of historic and recent data has been assembled consistin...

  12. Cracking the Egg: The South Carolina Digital Library's New Perspective

    ERIC Educational Resources Information Center

    Vinson, Christopher G.; Boyd, Kate Foster

    2008-01-01

    This article explores the historical foundations of the South Carolina Digital Library, a collaborative statewide program that ties together academic special collections and archives, public libraries, state government archives, and other cultural resource institutions in an effort to provide the state with a comprehensive database of online…

  13. Counseling and Spirituality: A Historical Review

    ERIC Educational Resources Information Center

    Powers, Robin

    2005-01-01

    Evolution of the relationship between counseling and spirituality since 1840 is examined in terms of the number of publications that have appeared over time that include these terms. The author retrieved the data using the American Psychological Association's PsycINFO database. A similar search was done adding the term training. The rise of…

  14. Concepts and Technologies for a Comprehensive Information System for Historical Research and Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Henze, F.; Magdalinski, N.; Schwarzbach, F.; Schulze, A.; Gerth, Ph.; Schäfer, F.

    2013-07-01

    Information systems play an important role in historical research as well as in heritage documentation. As part of a joint research project of the German Archaeological Institute, the Brandenburg University of Technology Cottbus, and the Dresden University of Applied Sciences, a web-based documentation system is currently being developed which can easily be adapted to the needs of different projects with individual scientific concepts, methods and questions. Based on open-source and standardized technologies, it will focus on open and well-documented interfaces to ease the dissemination and re-use of its content via web services and to communicate with desktop applications for further evaluation and analysis. The core of the system is a generic data model that represents a wide range of topics and methods of archaeological work. By providing an agreed initial set of themes and attributes, cross-project analysis of research data becomes possible. The development of enhanced search and retrieval functionality will simplify the processing and handling of large heterogeneous data sets. To achieve a high degree of interoperability with existing external data, systems and applications, standardized interfaces will be integrated. Analysis of spatial data will be possible through the integration of web-based GIS functions; as an extension to this, customized functions for the storage, processing and provision of 3D geodata are being developed. In this contribution, system requirements and concepts are presented and discussed, with a particular focus on the generic data model and the derived database schema. The research work on enhanced search and retrieval capabilities is illustrated by prototypical developments, as well as concepts and first implementations for an integrated 2D/3D Web GIS.

  15. Lake Pontchartrain Basin: bottom sediments and related environmental resources

    USGS Publications Warehouse

    Manheim, Frank T.; Hayes, Laura

    2002-01-01

    Lake Pontchartrain is the largest estuary in southern Louisiana. It is an important recreational, commercial, and environmental resource for New Orleans and southwestern Louisiana. This publication is part of a 5-year cooperative program led by the USGS on the geological framework and sedimentary processes of the Lake Pontchartrain Basin. The presentation is divided into two main parts: Scientific Research and Assessments, and Multimedia Tools and Regional Resources. The scientific sections include historical information on the area; shipboard, field, and remote sensing studies; and a comprehensive sediment database with geological and chemical discussions of the region. The multimedia and resources sections include Geographic Information System (GIS) tools and data, a video demonstrating vibracore sampling techniques in Lake Pontchartrain, and abstracts from four Basics of the Basin symposia.

  16. 2011 Tohoku, Japan tsunami data available from the National Oceanic and Atmospheric Administration/National Geophysical Data Center

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Mccullough, H. L.; Mungov, G.; Harris, E.

    2012-12-01

    The U.S. National Oceanic and Atmospheric Administration (NOAA) has primary responsibility for providing tsunami warnings to the Nation, and a leadership role in tsunami observations and research. A key component of this effort is easy access to authoritative data on past tsunamis, a responsibility of the National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics. Archive responsibilities include the global historical tsunami database, coastal tide-gauge data from US/NOAA operated stations, the Deep-ocean Assessment and Reporting of Tsunami (DART®) data, damage photos, as well as other related hazards data. Taken together, this integrated archive supports tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Understanding the severity and timing of tsunami effects is important for tsunami hazard mitigation and warning. The global historical tsunami database includes the date, time, and location of the source event, magnitude of the source, event validity, maximum wave height, the total number of fatalities and dollar damage. The database contains additional information on run-ups (locations where tsunami waves were observed by eyewitnesses, field reconnaissance surveys, tide gauges, or deep ocean sensors). The run-up table includes arrival times, distance from the source, measurement type, maximum wave height, and the number of fatalities and damage for the specific run-up location. Tide gauge data are required for modeling the interaction of tsunami waves with the coast and for verifying propagation and inundation models. NGDC is the long-term archive for all NOAA coastal tide gauge data and is currently archiving 15-second to 1-minute water level data from the NOAA Center for Operational Oceanographic Products and Services (CO-OPS) and the NOAA Tsunami Warning Centers. DART® buoys, which are essential components of tsunami warning systems, are now deployed in all oceans, giving coastal communities faster and more accurate tsunami warnings. NOAA's National Data Buoy Center disseminates real-time DART® data and NGDC processes and archives post-event 15-second high-resolution bottom pressure time series data. An event-specific archive of DART® observations recorded during recent significant tsunamis, including the March 2011 Tohoku, Japan event, is now available through new tsunami event pages integrated with the NGDC global historical tsunami database. These pages are developed to deliver comprehensive summaries of each tsunami event, including socio-economic impacts, tsunami travel time maps, raw observations, de-tided residuals, spectra of the tsunami signal compared to the energy of the background noise, and wavelets. These data are invaluable to tsunami researchers and educators as they are essential to providing a more thorough understanding of tsunamis and their propagation in the open ocean and subsequent inundation of coastal communities. NGDC has collected 289 tide gauge observations, 34 Deep-ocean Assessment and Reporting of Tsunami (DART®) and bottom pressure recorder (BPR) station observations, and over 5,000 eyewitness reports and post-tsunami field survey measurements for the 2011 Tohoku event.

  17. JET ICRH plant statistics from 2008-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wooldridge, E.; Monakhov, I.; Blackman, T.

    2014-02-12

    JET ICRH plant faults from 2008-2012 have been catalogued and a new assessment of the reliability of the plant by sub-system is given. Data from pulses where ICRH was used, excluding the ITER-Like Antenna (ILA) and its generators, have been collated. This is compared to fault data in order to investigate any correlation between faults and operations. The number of faults is shown to have decreased between 2011-2012 in comparison to 2008-2009, as the time between faults is shown to have increased. Future electronic fault logging requirements to enable easier analysis are discussed. Due to the changing configuration of the ICRH plant (the introduction of ELM-tolerant systems, a generator upgrade, changes to the settings of the VSWR protection, et cetera), a method to expand the fault database to include more historical data [1] in a consistent way is discussed.

  18. Human error mitigation initiative (HEMI) : summary report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Susan M.; Ramos, M. Victoria; Wenner, Caren A.

    2004-11-01

    Despite continuing efforts to apply existing hazard analysis methods and comply with requirements, human errors persist across the nuclear weapons complex. Due to a number of factors, current retroactive and proactive methods to understand and minimize human error are highly subjective, inconsistent in numerous dimensions, and difficult to characterize as thorough. An alternative, proposed method begins by leveraging historical data to understand what the systemic issues are and where resources need to be brought to bear proactively to minimize the risk of future occurrences. An illustrative analysis was performed using existing incident databases specific to Pantex weapons operations, indicating systemic issues associated with operating procedures that undergo notably less development rigor relative to other task elements such as tooling and process flow. Future recommended steps to improve the objectivity, consistency, and thoroughness of hazard analysis and mitigation were delineated.

  19. Education as an intergenerational process of human learning, teaching, and development.

    PubMed

    Cole, Michael

    2010-11-01

    In this article I argue that the future of psychological research on educational processes would benefit from an interdisciplinary approach that enables psychologists to locate their objects of study within the cultural, social, and historical contexts of their research. To make this argument, I begin by examining anthropological accounts of the characteristics of education in small, face-to-face, preindustrial societies. I then turn to a sample of contemporary psychoeducational research that seeks to implement major, qualitative changes in modern educational practices by transforming them to have the properties of education in those self-same face-to-face societies. Next I examine the challenges faced by these modern approaches and briefly describe a multi-institutional, multidisciplinary system of education that responds to these challenges while offering a model for educating psychology students in a multigenerational system of activities with potential widespread benefits. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  20. Shaping nursing profession regulation through history - a systematic review.

    PubMed

    Stievano, A; Caruso, R; Pittella, F; Shaffer, F A; Rocco, G; Fairman, J

    2018-03-23

    The aim of this systematic review was to provide a critical synthesis of the factors that historically shaped the advancements of nursing regulators worldwide. An in-depth examination of the different factors that moulded regulatory changes over time is pivotal to comprehend current issues in nursing. In the light of global health scenarios, the researchers explored the factors that historically influenced the socio-contextual circumstances upon which governments made regulatory changes. A systematic search was performed on the following databases: PubMed, CINAHL, Scopus, OpenGrey and ScienceDirect. The review included papers from January 2000 to October 2016 published in English. The authors used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and an inductive thematic approach for synthesis. Two main themes were identified: factors underpinning current challenges and historical and contextual triggers of regulation. The first theme was composed of three aspects: education, migration and internationalization, and policy and regulation; the second theme consisted of four attributes: demographics, economics, history of registration and wars, and historical changes in nursing practice. Factors that shaped nursing regulation were linked to changing demographics and economics, education, history of nursing registration, shifting patterns of migration and internationalization, nursing practice, policy and regulation, and significant societal turns often prompted by wars. A deeper understanding of the developments of the nursing regulatory institutions provides the foundation for portable standards that can be applied across an array of jurisdictions to guarantee better public safety. Understanding factors that socially, legislatively and politically have influenced the development of regulatory bodies over time helps to mould local, national and international policies that have a stronger impact on health worldwide. To achieve this, there must be effective cooperation among systems of nursing regulation globally. © 2018 International Council of Nurses.

  1. Oceans 2.0: Interactive tools for the Visualization of Multi-dimensional Ocean Sensor Data

    NASA Astrophysics Data System (ADS)

    Biffard, B.; Valenzuela, M.; Conley, P.; MacArthur, M.; Tredger, S.; Guillemot, E.; Pirenne, B.

    2016-12-01

    Ocean Networks Canada (ONC) operates ocean observatories on all three of Canada's coasts. The instruments produce 280 gigabytes of data per day with 1/2 petabyte archived so far. In 2015, 13 terabytes were downloaded by over 500 users from across the world. ONC's data management system is referred to as "Oceans 2.0" owing to its interactive, participative features. A key element of Oceans 2.0 is real time data acquisition and processing: custom device drivers implement the input-output protocol of each instrument. Automatic parsing and calibration takes place on the fly, followed by event detection and quality control. All raw data are stored in a file archive, while the processed data are copied to fast databases. Interactive access to processed data is provided through data download and visualization/quick look features that are adapted to diverse data types (scalar, acoustic, video, multi-dimensional, etc). Data may be post or re-processed to add features, analysis or correct errors, update calibrations, etc. A robust storage structure has been developed consisting of an extensive file system and a no-SQL database (Cassandra). Cassandra is a node-based open source distributed database management system. It is scalable and offers improved performance for big data. A key feature is data summarization. The system has also been integrated with web services and an ERDDAP OPeNDAP server, capable of serving scalar and multidimensional data from Cassandra for fixed or mobile devices. A complex data viewer has been developed making use of the big data capability to interactively display live or historic echo sounder and acoustic Doppler current profiler data, where users can scroll, apply processing filters and zoom through gigabytes of data with simple interactions. This new technology brings scientists one step closer to a comprehensive, web-based data analysis environment in which visual assessment, filtering, event detection and annotation can be integrated.
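
    For flavor, here is a sketch of how scalar readings might be laid out in Cassandra for fast time-range scans, in the spirit of the storage structure described; the keyspace, table, and partitioning are assumptions, not ONC's actual schema.

        # Hedged sketch using the DataStax Python driver (pip install
        # cassandra-driver); partitioning by (device, day) bounds each read.
        import datetime as dt
        from cassandra.cluster import Cluster

        session = Cluster(["127.0.0.1"]).connect()
        session.execute("""
            CREATE KEYSPACE IF NOT EXISTS oceans WITH replication =
            {'class': 'SimpleStrategy', 'replication_factor': 1}""")
        session.execute("""
            CREATE TABLE IF NOT EXISTS oceans.scalar_data (
                device_id text, day date, ts timestamp, value double,
                PRIMARY KEY ((device_id, day), ts))""")

        # A time-range query touches a single partition (one device-day).
        rows = session.execute(
            "SELECT ts, value FROM oceans.scalar_data "
            "WHERE device_id=%s AND day=%s AND ts>=%s AND ts<%s",
            ("ctd_042", dt.date(2016, 6, 1),
             dt.datetime(2016, 6, 1, 0), dt.datetime(2016, 6, 1, 6)))
        for row in rows:
            print(row.ts, row.value)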

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baptista, António M.

    This work focuses on the numerical modeling of Columbia River estuarine circulation and associated modeling-supported analyses conducted as an integral part of a multi-disciplinary and multi-institutional effort led by NOAA's Northwest Fisheries Science Center. The overall effort is aimed at: (1) retrospective analyses to reconstruct historic bathymetric features and assess effects of climate and river flow on the extent and distribution of shallow water, wetland and tidal-floodplain habitats; (2) computer simulations using a 3-dimensional numerical model to evaluate the sensitivity of salmon rearing opportunities to various historical modifications affecting the estuary (including channel changes, flow regulation, and diking of tidal wetlands and floodplains); (3) observational studies of present and historic food web sources supporting selected life histories of juvenile salmon as determined by stable isotope, microchemistry, and parasitology techniques; and (4) experimental studies in Grays River in collaboration with the Columbia River Estuary Study Taskforce (CREST) and the Columbia Land Trust (CLT) to assess effects of multiple tidal wetland restoration projects on various life histories of juvenile salmon and to compare responses to observed habitat-use patterns in the mainstem estuary. From the above observations, experiments, and additional modeling simulations, the effort will also (5) examine effects of alternative flow-management and habitat-restoration scenarios on habitat opportunity and the estuary's productive capacity for juvenile salmon. The underlying modeling system is part of the SATURN coastal-margin observatory [1]. SATURN relies on 3D numerical models [2, 3] to systematically simulate and understand baroclinic circulation in the Columbia River estuary-plume-shelf system [4-7] (Fig. 1). Multi-year simulation databases of circulation are produced as an integral part of SATURN, and have multiple applications in understanding estuary/plume variability, the role of the estuary and plume on salmon survival, and functional changes in the estuary-plume system in response to climate and human activities.

  3. Common Cause Failure Modeling

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.

    2016-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example: system environments; manufacturing; transportation; storage; maintenance; and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) team of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF, due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF and that launch vehicles are a highly redundant system by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus a common cause beta factor effect on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the results of the different CCFs on the reliability of a one-out-of-two system.
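
    The response surface mentioned above can be reproduced schematically with the textbook beta-factor model, in which a fraction beta of the total failure rate is common cause. What follows is a generic formulation with illustrative parameter ranges, not NASA's actual PRA model.

        # Hedged sketch: failure probability of a one-out-of-two redundant
        # system under the beta-factor CCF model, tabulated over a grid.
        import math

        def p_fail_1oo2(lam, beta, t):
            # The independent part must fail in both channels; the
            # common-cause part takes out both channels at once.
            p_ind = 1 - math.exp(-(1 - beta) * lam * t)
            p_ccf = 1 - math.exp(-beta * lam * t)
            return 1 - (1 - p_ccf) * (1 - p_ind ** 2)

        for lam in (1e-5, 1e-4, 1e-3):          # failures per hour (assumed)
            for beta in (0.0, 0.05, 0.1, 0.2):  # CCF fraction (assumed)
                print(lam, beta, p_fail_1oo2(lam, beta, t=100.0))

    The tabulation makes the headline point visible: once beta is nonzero, the common-cause term dominates the squared independent term, so added redundancy yields diminishing returns.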

  4. Common Cause Failure Modeling

    NASA Technical Reports Server (NTRS)

    Hark, Frank; Britton, Paul; Ring, Rob; Novack, Steven D.

    2015-01-01

    Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are a set of dependent failures that can be caused by, for example: system environments; manufacturing; transportation; storage; maintenance; and assembly. Since there are many factors that contribute to CCFs, the effects can be reduced, but they are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and CCF (dependent) failures, and data are limited, especially for launch vehicles. The Probabilistic Risk Assessment (PRA) team of NASA's Safety and Mission Assurance Directorate at Marshall Space Flight Center (MSFC) is using generic data from the Nuclear Regulatory Commission's database of common cause failures at nuclear power plants to estimate CCF, due to the lack of a more appropriate data source. There remains uncertainty in the actual magnitude of the common cause risk estimates for different systems at this stage of the design. Given the limited data about launch vehicle CCF and that launch vehicles are a highly redundant system by design, it is important to make design decisions that account for a range of values for independent failures and CCFs. When investigating the design of the one-out-of-two component redundant system for launch vehicles, a response surface was constructed to represent the impact of the independent failure rate versus a common cause beta factor effect on a system's failure probability. This presentation will define a CCF and review estimation calculations. It gives a summary of reduction methodologies and a review of examples of historical CCFs. Finally, it presents the response surface and discusses the results of the different CCFs on the reliability of a one-out-of-two system.

  5. Historical loss thinking and symptoms of depression are influenced by ethnic experience in American Indian college students.

    PubMed

    Tucker, Raymond P; Wingate, LaRicka R; O'Keefe, Victoria M

    2016-07-01

    Recent research has indicated that historical loss may play an important role in the experience of depression symptoms in American Indian/Alaska Native people. Increased frequency of historical loss thinking has been related to symptoms of depression and other pervasive psychological outcomes (i.e., substance abuse) in American Indian and Canadian First Nations communities. The current study investigated how aspects of ethnic minority experience relate to the incidence of historical loss thinking and symptoms of depression in American Indian adults. Data are presented from 123 self-identified American Indian college students (ages 18-25, 67.50% female) who participated in the study in return for course credit and/or entrance into a raffle for gift cards. Participants completed the Adolescent Historical Loss Scale (AHLS), Scale of Ethnic Experiences (SEE), and the Center for Epidemiologic Studies-Depression Scale (CES-D). Indirect effects of ethnic experience on symptoms of depression through historical loss thinking were calculated with nonparametric bootstrapping procedures. Results indicated that a strong ethnic identification, desire to predominantly socialize with other American Indians, and perceptions of discrimination were associated with increased historical loss thinking. Feelings of comfort and assimilation with the mainstream American culture were negatively related to historical loss thinking. Only perception of discrimination was directly related to symptoms of depression; however, ethnic identification and the preference to predominantly socialize with other American Indians were both indirectly related to elevated depressive symptoms through increased historical loss thinking. The clinical implications for these results are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Tapping into the Hexagon spy imagery database: A new automated pipeline for geomorphic change detection

    NASA Astrophysics Data System (ADS)

    Maurer, Joshua; Rupper, Summer

    2015-10-01

    Declassified historical imagery from the Hexagon spy satellite database has near-global coverage, yet remains a largely untapped resource for geomorphic change studies. Unavailable satellite ephemeris data make DEM (digital elevation model) extraction difficult in terms of time and accuracy. A new fully-automated pipeline for DEM extraction and image orthorectification is presented which yields accurate results and greatly increases efficiency over traditional photogrammetric methods, making the Hexagon image database much more appealing and accessible. A 1980 Hexagon DEM is extracted and geomorphic change computed for the Thistle Creek Landslide region in the Wasatch Range of North America to demonstrate an application of the new method. Surface elevation changes resulting from the landslide show an average elevation decrease of 14.4 ± 4.3 m in the source area, an increase of 17.6 ± 4.7 m in the deposition area, and a decrease of 30.2 ± 5.1 m resulting from a new roadcut. Two additional applications of the method include volume estimates of material excavated during the Mount St. Helens volcanic eruption and the volume of net ice loss over a 34-year period for glaciers in the Bhutanese Himalayas. These results show the value of Hexagon imagery in detecting and quantifying historical geomorphic change, especially in regions where other data sources are limited.
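
    The arithmetic behind elevation-change results like those above is DEM differencing with an uncertainty estimate. A minimal sketch follows, assuming two co-registered grids and independent per-cell errors combined in quadrature (a simplification, not the authors' full error model).

        # Hedged sketch: net volume change and a simple uncertainty bound
        # from differencing two co-registered DEMs.
        import numpy as np

        def volume_change(dem_old, dem_new, cell_m, sigma_old, sigma_new):
            dz = dem_new - dem_old                            # elevation change, m
            sigma_dz = float(np.hypot(sigma_old, sigma_new))  # per-cell error, m
            vol = float(dz.sum()) * cell_m ** 2               # net change, m^3
            # Independent per-cell errors sum in quadrature over the grid.
            vol_sigma = sigma_dz * cell_m ** 2 * np.sqrt(dz.size)
            return vol, vol_sigma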

  7. Mass and Reliability System (MaRS)

    NASA Technical Reports Server (NTRS)

    Barnes, Sarah

    2016-01-01

    The Safety and Mission Assurance (S&MA) Directorate is responsible for mitigating risk, providing system safety, and lowering risk for space programs from ground to space. S&MA is divided into four divisions: the Space Exploration Division (NC), the International Space Station Division (NE), the Safety & Test Operations Division (NS), and the Quality and Flight Equipment Division (NT). The interns, myself and Arun Aruljothi, will be working with the Risk & Reliability Analysis Branch under the NC Division. The mission of this division is to identify, characterize, diminish, and communicate risk by implementing an efficient and effective assurance model. The team utilizes Reliability and Maintainability (R&M) analysis and Probabilistic Risk Assessment (PRA) to ensure decisions concerning risks are informed, vehicles are safe and reliable, and program/project requirements are realistic and realized. This project pertains to the Orion mission, so it is geared toward long-duration Human Space Flight Programs. For space missions, payload is a critical concept; balancing what hardware can be replaced at the component level versus by Orbital Replacement Units (ORUs) or subassemblies is key. For this effort a database was created that combines mass and reliability data, called the Mass and Reliability System, or MaRS. U.S. International Space Station (ISS) components are used as reference parts in the MaRS database. Using ISS components as a platform is beneficial because of their historical context and the environmental similarities to a space flight mission. MaRS draws on a combination of systems: the International Space Station PART system for failure data, the Vehicle Master Database (VMDB) for ORUs and components, the Maintenance & Analysis Data Set (MADS) for operation hours and other pertinent data, and the Hardware History Retrieval System (HHRS) for unit weights. MaRS is populated using a Visual Basic application. Once populated, the spreadsheet comprises information on ISS components including operation hours, random/nonrandom failures, software/hardware failures, quantity, orbital replaceable units (ORUs), date of placement, unit weight, and frequency of part. The motivation for creating such a database is the development of a mass/reliability parametric model to estimate the mass required for replacement parts. Once complete, engineers working on future space flight missions will have access to mean-time-to-failure data on parts along with their mass; this will be used to make proper decisions for long-duration space flight missions.
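
    A toy version of the mass/reliability trade that MaRS is meant to support is computing expected spares mass from failure rates, quantities, and unit weights. The field names echo the description above but are assumptions, not the actual MaRS schema.

        # Hedged sketch: expected replacement mass over a mission, assuming
        # constant failure rates (mission_hours / MTBF expected failures).
        def spares_mass(parts, mission_hours):
            total = 0.0
            for p in parts:
                expected_failures = mission_hours / p["mtbf_hours"] * p["quantity"]
                total += expected_failures * p["unit_mass_kg"]
            return total

        parts = [  # illustrative entries, not ISS data
            {"name": "pump ORU", "mtbf_hours": 50000, "quantity": 2,
             "unit_mass_kg": 35.0},
            {"name": "avionics card", "mtbf_hours": 120000, "quantity": 6,
             "unit_mass_kg": 1.2},
        ]
        print("%.1f kg of spares expected" % spares_mass(parts, 3 * 8766))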

  8. Research Update: The materials genome initiative: Data sharing and the impact of collaborative ab initio databases

    DOE PAGES

    Jain, Anubhav; Persson, Kristin A.; Ceder, Gerbrand

    2016-03-24

    Materials innovations enable new technological capabilities and drive major societal advancements but have historically required long and costly development cycles. The Materials Genome Initiative (MGI) aims to greatly reduce this time and cost. Here, we focus on data reuse in the MGI and, in particular, discuss the impact on the research community of three different computational databases based on density functional theory methods. Finally, we discuss and provide recommendations on technical aspects of data reuse, outline remaining fundamental challenges, and present an outlook on the future of MGI's vision of data sharing.

  9. Data-Based Performance Assessments for the DOE Hydropower Advancement Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    March, Patrick; Wolff, Dr. Paul; Smith, Brennan T

    2012-01-01

    The U.S. Department of Energy's Hydropower Advancement Project (HAP) was initiated to characterize and trend hydropower asset conditions across the U.S.A.'s existing hydropower fleet and to identify and evaluate upgrading opportunities. Although HAP includes both detailed performance assessments and condition assessments of existing hydropower plants, this paper focuses on the performance assessments. Plant performance assessments provide a set of statistics and indices that characterize the historical extent to which each plant has converted the potential energy at a site into electrical energy for the power system. The performance metrics enable benchmarking and trending of performance across many projects in a variety of contexts (e.g., river systems, power systems, and water availability). During FY2011 and FY2012, assessments will be performed on ten plants, with an additional fifty plants scheduled for FY2013. This paper focuses on the performance assessments completed to date, details the performance assessment process, and describes results from the performance assessments.
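
    One plant-level index of the kind described is the ratio of observed generation to the hydraulic potential of the water passed through the plant. A generic sketch follows; HAP's actual indices may differ.

        # Hedged sketch: hydraulic potential P = rho * g * Q * H, compared
        # with observed generation over the same period.
        RHO, G = 1000.0, 9.81  # water density kg/m^3, gravity m/s^2

        def conversion_efficiency(gen_mwh, mean_flow_m3s, mean_head_m, hours):
            potential_mw = RHO * G * mean_flow_m3s * mean_head_m / 1e6
            return gen_mwh / (potential_mw * hours)

        # Example: 300 GWh generated in a year at 100 m^3/s mean flow, 50 m head.
        print("%.2f" % conversion_efficiency(300000, 100, 50, 8760))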

  10. Resources monitoring and automatic management system for multi-VO distributed computing system

    NASA Astrophysics Data System (ADS)

    Chen, J.; Pelevanyuk, I.; Sun, Y.; Zhemchugov, A.; Yan, T.; Zhao, X. H.; Zhang, X. M.

    2017-10-01

    Multi-VO support based on DIRAC has been set up to provide workload and data management for several high-energy experiments at IHEP. To monitor and manage the heterogeneous resources that belong to different Virtual Organizations in a uniform way, a resources monitoring and automatic management system based on the Resource Status System (RSS) of DIRAC is presented in this paper. The system is composed of three parts: information collection, status decision and automatic control, and information display. Information collection includes active and passive ways of gathering status from different sources, storing the results in databases. Status decision and automatic control evaluate resource status and take control actions on resources automatically through pre-defined policies and actions. The monitoring information is displayed on a web portal, from which both real-time and historical information can be obtained. All the implementations are based on the DIRAC framework. The information and control, including sites, policies, and web portals for different VOs, can be well defined and distinguished within the DIRAC user and group management infrastructure.
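
    A schematic of the status-decision step, evaluating pre-defined policies against collected metrics and emitting a control action; the policy names and thresholds are hypothetical, not DIRAC's RSS implementation.

        # Hedged sketch: map collected resource metrics to a status and an
        # automatic control action via simple pre-defined policies.
        def decide_status(metrics):
            # metrics: e.g. {"pilot_ok": True, "job_failure_rate": 0.62}
            if not metrics.get("pilot_ok", True):
                return "Banned", "ban site: pilot submission failing"
            if metrics.get("job_failure_rate", 0.0) > 0.5:
                return "Degraded", "reduce load: high job failure rate"
            return "Active", "no action"

        status, action = decide_status({"pilot_ok": True,
                                        "job_failure_rate": 0.62})
        print(status, "-", action)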

  11. Geolocation of man-made reservoirs across terrains of varying complexity using GIS

    NASA Astrophysics Data System (ADS)

    Mixon, David M.; Kinner, David A.; Stallard, Robert F.; Syvitski, James P. M.

    2008-10-01

    The Reservoir Sedimentation Survey Information System (RESIS) is one of the world's most comprehensive databases of reservoir sedimentation rates, comprising nearly 6000 surveys for 1819 reservoirs across the continental United States. Sediment surveys in the database date from 1904 to 1999, though more than 95% of surveys were entered prior to 1980, making RESIS largely a historical database. The use of this database for large-scale studies has been limited by the lack of precise coordinates for the reservoirs. Many of the reservoirs are relatively small structures and do not appear on current USGS topographic maps. Others have been renamed or have only approximate (i.e. township and range) coordinates. This paper presents a method scripted in ESRI's ARC Macro Language (AML) to locate the reservoirs on digital elevation models using information available in RESIS. The script also delineates the contributing watersheds and compiles several hydrologically important parameters for each reservoir. Evaluation of the method indicates that, for watersheds larger than 5 km², the correct outlet is identified over 80% of the time. The importance of identifying the watershed outlet correctly depends on the application. Our intent is to collect spatial data for watersheds across the continental United States and describe the land use, soils, and topography for each reservoir's watershed. Because of local landscape similarity in these properties, we show that choosing the incorrect watershed does not necessarily mean that the watershed characteristics will be misrepresented. We present a measure termed terrain complexity and examine its relationship to geolocation success rate and its influence on the similarity of nearby watersheds.
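
    The original method is scripted in AML; the kernel of the geolocation step can be re-expressed in Python as a search of the flow-accumulation grid near the reported coordinates for the cell whose upstream area best matches the reservoir's documented drainage area (an illustrative re-expression, not the authors' script).

        # Hedged sketch: snap a reservoir's reported location to the nearby
        # flow-accumulation cell that best matches its drainage area.
        import numpy as np

        def locate_outlet(flowacc, row, col, drainage_cells, radius=20):
            r0, c0 = max(0, row - radius), max(0, col - radius)
            window = flowacc[r0:row + radius + 1,
                             c0:col + radius + 1].astype(float)
            err = np.abs(window - drainage_cells)
            r, c = np.unravel_index(np.argmin(err), err.shape)
            return r0 + r, c0 + c  # grid indices of the chosen outlet cell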

  12. SurgeWatch: a user-friendly database of coastal flooding in the United Kingdom from 1915-2014

    NASA Astrophysics Data System (ADS)

    Wadey, Matthew; Haigh, Ivan; Nicholls, Robert J.; Ozsoy, Ozgun; Gallop, Shari; Brown, Jennifer; Horsburgh, Kevin; Bradshaw, Elizabeth

    2016-04-01

    Coastal flooding caused by extreme sea levels can be devastating, with long-lasting and diverse consequences. Historically, the UK has suffered major flooding events, and at present 2.5 million properties and £150 billion of assets are potentially exposed to coastal flooding. However, no formal system is in place to catalogue which storms and high sea level events progress to coastal flooding. Furthermore, information on the extent of flooding and associated damages is not systematically documented nationwide. Here we present a database and online tool called 'SurgeWatch', which provides a systematic UK-wide record of high sea level and coastal flood events over the last 100 years (1915-2014). Using records from the National Tide Gauge Network, together with a dataset of exceedance probabilities and meteorological fields, SurgeWatch captures information on 96 storms during this period, the highest sea levels they produced, and the occurrence and severity of coastal flooding. The data are presented to be easily accessible and understandable to a range of users, including scientists, coastal engineers, managers, planners, and concerned citizens. We also highlight some significant events in the database, such as the North Sea storm surge of 31 January-1 February 1953 (Northwest Europe's most severe coastal floods in living memory) and the 5-6 December 2013 "Xaver" storm and floods.

  13. Exporting obesity: US farm and trade policy and the transformation of the Mexican consumer food environment.

    PubMed

    Clark, Sarah E; Hawkes, Corinna; Murphy, Sophia M E; Hansen-Kuhn, Karen A; Wallinga, David

    2012-01-01

    Obesity has reached epidemic proportions, in the United States as well as among its trade partners such as Mexico. It has been established that an "obesogenic" (obesity-causing) food environment is one influence on obesity prevalence. To isolate the particular role of NAFTA, the North American Free Trade Agreement, in changing Mexico's food environment, we plotted the flow of several key products between the United States and Mexico over the 14-year NAFTA period (1994-2008) and situated them in a broader historical context. Key sources of USDA data include the Foreign Agricultural Service's Global Agricultural Trade System, its official repository for current and historical data on imports, exports and re-exports, and its Production, Supply, and Distribution online database. US export data were queried for agricultural products linked to shifting diet patterns including: corn, soybeans, sugar and sweeteners, consumer-oriented products, and livestock products. The Bureau of Economic Analysis' Balance of Payments and Direct Investment Position Data in their web-based International Economic Accounts system also helped determine changes in US direct investment abroad from 1982 to 2009. Directly and indirectly, the United States has exported increasing amounts of corn, soybeans, sugar, snack foods, and meat products into Mexico over the last two decades. Facilitated by NAFTA, these exports are one important way in which US agriculture and trade policy influences Mexico's food system. Because of significant US agribusiness investment in Mexico across the full spectrum of the latter's food supply chain, from production and processing to distribution and retail, the Mexican food system increasingly looks like the industrialized food system of the United States.

  14. Academic Oral History: Life Review in Spite of Itself.

    ERIC Educational Resources Information Center

    Ryant, Carl

    The process and content of the life review should not be separated from the creation of an oral history. Several projects, undertaken at the University of Louisville Oral History Center, support the therapeutic aspects of reminiscence. The dichotomy between oral history, as an historical database, and life review, as a therapeutic exercise, breaks…

  15. The Lawyers in the 16th-18th Century's Germany: A Historical Database.

    ERIC Educational Resources Information Center

    Ranieri, Filippo

    1990-01-01

    Investigates the sociological backgrounds of German lawyers of the Holy Roman Empire through an analysis of the dissertations and disputations written during the seventeenth and eighteenth centuries. Focuses on their university education, family circumstances, and careers. Creates an information data bank to carry out the project. Predicts further…

  16. The 21st Century Writing Program: Collaboration for the Common Good

    ERIC Educational Resources Information Center

    Moberg, Eric

    2010-01-01

    The purpose of this report is to review the literature on theoretical frameworks, best practices, and conceptual models for the 21st century collegiate writing program. Methods include electronic database searches for recent and historical peer-reviewed scholarly literature on collegiate writing programs. The author analyzed over 65 sources from…

  17. Historic Bim: a New Repository for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Banfi, F.; Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2017-05-01

    Recent developments in Building Information Modelling (BIM) technologies are facilitating the management of historic complex structures using new applications. This paper proposes a generative method combining the morphological and typological aspects of historic buildings (H-BIM) with a set of monitoring information. This combination of 3D digital survey, parametric modelling and monitoring datasets allows for the development of a system for archiving and visualizing structural health monitoring (SHM) data. The availability of a BIM database allows one to integrate different kinds of data stored in different ways (e.g. reports, tables, graphs, etc.) with a representation directly connected to the 3D model of the structure with appropriate levels of detail (LoD). Data can be interactively accessed by selecting specific objects of the BIM, i.e. connecting the 3D position of the installed sensors with additional digital documentation. Such innovative BIM objects, which form a new BIM family for SHM, can then be reused in other projects, facilitating the archiving and exploitation of the data acquired and processed. The application of advanced modeling techniques reduces the time and costs of the generation process and supports cooperation between different disciplines using a central workspace. However, it also reveals new challenges for parametric software and exchange formats. The case study presented is the medieval bridge Azzone Visconti in Lecco (Italy), in which multi-temporal vertical movements during load testing were integrated into H-BIM.
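
    The core idea here, attaching monitoring records to individual BIM elements, can be sketched as a simple data structure. The class and field names below (BimObject, Sensor, SensorReading) are hypothetical illustrations, not the authors' implementation:

      # Minimal sketch of attaching SHM monitoring data to BIM elements.
      # All names (BimObject, Sensor, SensorReading) are hypothetical.
      from dataclasses import dataclass, field
      from datetime import datetime

      @dataclass
      class SensorReading:
          timestamp: datetime
          value_mm: float                  # e.g. vertical movement in millimetres

      @dataclass
      class Sensor:
          sensor_id: str
          position_xyz: tuple              # 3D position of the sensor on the structure
          readings: list = field(default_factory=list)

      @dataclass
      class BimObject:
          object_id: str                   # e.g. "arch_span_03"
          level_of_detail: int             # LoD of the parametric element
          sensors: list = field(default_factory=list)

      def readings_for_object(obj):
          """Collect all monitoring data attached to one selected BIM element."""
          return {s.sensor_id: s.readings for s in obj.sensors}

      # Example: one bridge arch with a displacement sensor and one reading.
      arch = BimObject("arch_span_03", level_of_detail=300)
      lvdt = Sensor("LVDT_01", (12.4, 0.8, 5.1))
      lvdt.readings.append(SensorReading(datetime(2016, 5, 12, 10, 30), -0.42))
      arch.sensors.append(lvdt)
      print(readings_for_object(arch))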

  18. Multiscale Interactive Communication: Inside and Outside Thun Castle

    NASA Astrophysics Data System (ADS)

    Massari, G. A.; Luce, F.; Pellegatta, C.

    2011-09-01

    For professionals, the applications of informatics to architecture have become a powerful tool for managing analytical phases and project activities; for the general public, they offer new ways of communication that can directly relate present, past and future facts. Museums in historic buildings, their installations, and the recent experiences of eco-museums located throughout the territory provide a privileged field of experimentation for technical and digital representation. On the one hand, the safeguarding and functional adaptation of buildings use 3D computer graphics models that are true spatially related databases: in them, the results of archival, artistic-historical, diagnostic, and technological-structural studies, together with the assumptions and feasibility of interventions, are ordered, viewed and interpreted. On the other hand, the disclosure of things and knowledge linked to collective memory relies on interactive maps and hypertext systems that provide access to authentic virtual museums. At the architectural scale this produces a sort of multimedia extension of the exhibition hall, while at the landscape scale the result is a so-far-unpublished instrument of cultural development: works that are separated in direct perception find, in a zenith view of the map, a synthetic relation tied both to spatial parameters and to temporal interpretations.

  19. Estimating Changes in Runoff and Carbon Exports From Northern Canadian Catchments Under a 2X CO2 Atmosphere

    NASA Astrophysics Data System (ADS)

    Clair, T. A.; Ehrman, J. M.

    2006-12-01

    The effect of doubled atmospheric CO2 on temperature and precipitation will change annual runoff and dissolved organic carbon (DOC) export patterns in northern Canada. Because of the physical size and the range of climatic changes of northern Canada, we found it necessary to model potential changes in river water and carbon exports in the region using a neural network approach. We developed one model for hydrology and one for DOC, using as inputs monthly General Circulation Model temperature and precipitation predictions, historical hydrology and dissolved organic carbon values, as well as catchment size and slope. Mining Environment Canada's historical hydrology and water chemistry databases allowed us to identify 20 sites suitable for our analysis. The site results were summarized within the Canadian Terrestrial Ecozone classification system. Our results show spring melts occurring one month sooner in all northern ecozones except the Hudson Bay Plains zone, with changes in melt intensity occurring in most regions. The DOC model predicts that exports from catchments will increase by between 10 and 20%, depending on the ecozone. Generally, we predict that major changes in both hydrology and carbon cycling should be expected in northern Canadian ecosystems on a warmer planet.
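
    As a rough illustration of the modelling approach described (a feed-forward network mapping climate and catchment descriptors to exports), the following sketch trains a small scikit-learn regressor on synthetic data; the architecture, variables and numbers are invented for illustration and are not the authors' configuration:

      # Sketch of a neural-network export model: inputs are monthly GCM
      # temperature and precipitation plus catchment size and slope; the
      # target stands in for monthly DOC export. Data are synthetic.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      n = 500
      temp   = rng.normal(-5, 10, n)        # monthly mean temperature, deg C
      precip = rng.gamma(2.0, 30.0, n)      # monthly precipitation, mm
      area   = rng.uniform(50, 5000, n)     # catchment size, km^2
      slope  = rng.uniform(0.001, 0.1, n)   # mean catchment slope
      X = np.column_stack([temp, precip, area, slope])

      # Synthetic "DOC export" target with plausible dependencies.
      y = 0.02 * precip + 0.5 * np.maximum(temp, 0) + 0.001 * area + rng.normal(0, 1, n)

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X, y)

      # Predict exports under a warmer, wetter scenario (+4 deg C, +10% precip).
      X2 = X.copy()
      X2[:, 0] += 4.0
      X2[:, 1] *= 1.10
      print("mean change in modelled DOC export: %+.2f" % (model.predict(X2) - model.predict(X)).mean())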

  20. Geospatial Multi-Agency Coordination (GeoMAC) wildland fire perimeters, 2008

    USGS Publications Warehouse

    Walters, Sandra P.; Schneider, Norma J.; Guthrie, John D.

    2011-01-01

    The Geospatial Multi-Agency Coordination (GeoMAC) has been collecting and storing data on wildland fire perimeters since August 2000. The dataset presented via this U.S. Geological Survey Data Series product contains the GeoMAC wildland fire perimeter data for the calendar year 2008, which are based upon input from incident intelligence sources, Global Positioning System (GPS) data, and infrared (IR) imagery. Wildland fire perimeter data are obtained from the incidents, evaluated for completeness and accuracy, and processed to reflect consistent field names and attributes. After a quality check, the perimeters are loaded to GeoMAC databases, which support the GeoMAC Web application for access by wildland fire managers and the public. The wildland fire perimeters are viewed through the Web application. The data are subsequently archived according to year and state and are made available for downloading through the Internet in shapefile and Keyhole Markup Language (KML) format. These wildland fire perimeter data are also retained for historical, planning, and research purposes. The datasets that pertain to this report can be found on the Rocky Mountain Geographic Science Center HTTP site at http://rmgsc.cr.usgs.gov/outgoing/GeoMAC/historic_fire_data/. The links are also provided on the sidebar.
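
    For readers who download the archived perimeters, a minimal sketch of inspecting them with geopandas follows; the local filename is hypothetical, and attribute names should be checked against the actual shapefile:

      # Sketch of working with a downloaded GeoMAC perimeter shapefile.
      # The filename is hypothetical; check field names against the data.
      import geopandas as gpd

      perimeters = gpd.read_file("2008_perimeters.shp")    # hypothetical local file
      print(perimeters.crs)                                # coordinate reference system
      print(perimeters.head())

      # Example: burned areas in km^2, after reprojecting to an equal-area CRS.
      equal_area = perimeters.to_crs(epsg=5070)            # CONUS Albers
      equal_area["area_km2"] = equal_area.geometry.area / 1e6
      print(equal_area[["area_km2"]].describe())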

  1. ICOADS: A Foundational Database with a new Release

    NASA Astrophysics Data System (ADS)

    Angel, W.; Freeman, E.; Woodruff, S. D.; Worley, S. J.; Brohan, P.; Dumenil-Gates, L.; Kent, E. C.; Smith, S. R.

    2016-02-01

    The International Comprehensive Ocean-Atmosphere Data Set (ICOADS) offers surface marine data spanning the past three centuries and is the world's largest collection of marine surface in situ observations, with approximately 300 million unique records from 1662 to the present in a common International Maritime Meteorological Archive (IMMA) format. Simple gridded monthly summary products (including netCDF) are computed for 2° latitude x 2° longitude boxes back to 1800 and for 1° x 1° boxes since 1960. ICOADS observations made available in the IMMA format are taken primarily from ships (merchant, ocean research, fishing, navy, etc.) and moored and drifting buoys. Each report contains individual observations of meteorological and oceanographic variables, such as sea surface and air temperatures, winds, pressure, humidity, wet bulb, dew point, ocean waves and cloudiness. A monthly summary for an area box includes ten statistics (e.g., mean, median, standard deviation) for 22 observed and computed variables (e.g., sea surface and air temperature, wind, pressure, humidity, cloudiness). ICOADS is the most complete and heterogeneous collection of surface marine data in existence. A major new historical update, Release 3.0 (R3.0), now in production (with availability anticipated in mid-2016), will contain a variety of important updates. These updates will include unique IDs (UIDs), new IMMA attachments, the ICOADS Value-Added Database (IVAD), and numerous new or improved historical and contemporary data sources. UIDs are assigned to each individual marine report, which will greatly facilitate interaction between users and data developers and afford record traceability. A new Near-Surface Oceanographic (Nocn) attachment has been developed to include oceanographic profile elements, such as sea surface salinity, sea surface temperatures, and their associated measurement depths. Additionally, IVAD provides a feedback mechanism through which data adjustments can be stored within each IMMA report. R3.0 includes near-surface ocean profile measurements from sources such as the World Ocean Database (WOD), the Shipboard Automated Meteorological and Oceanographic System (SAMOS), and many others. An in-depth look at the improvements and the data inputs planned for R3.0 will be discussed.
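
    The 2° x 2° monthly box summaries described above can be emulated in a few lines of pandas; the sketch below uses synthetic ship reports with invented column names rather than actual IMMA fields:

      # Sketch of 2-degree monthly box summaries computed over synthetic
      # ship reports (column names are illustrative, not IMMA fields).
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      n = 10_000
      reports = pd.DataFrame({
          "time": pd.to_datetime("2015-01-01") + pd.to_timedelta(rng.integers(0, 365, n), "D"),
          "lat":  rng.uniform(-60, 60, n),
          "lon":  rng.uniform(-180, 180, n),
          "sst":  rng.normal(15, 8, n),      # sea surface temperature, deg C
      })

      # Assign each report to a 2x2-degree box and a calendar month.
      reports["lat_box"] = (reports["lat"] // 2) * 2
      reports["lon_box"] = (reports["lon"] // 2) * 2
      reports["month"]   = reports["time"].dt.to_period("M")

      summary = (reports
                 .groupby(["month", "lat_box", "lon_box"])["sst"]
                 .agg(["count", "mean", "median", "std"]))
      print(summary.head())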

  2. Lowell National Historical Park alternative transportation system historic trolley planning study

    DOT National Transportation Integrated Search

    2002-12-01

    This report assesses opportunities for expanding Lowell National Historical Park's historic trolley line by implementing a light rail system reminiscent of late 19th/early 20th Century trolley lines. This is in line with the Park Service's Transp...

  3. Atlantic Hurricane Activity: 1851-1900

    NASA Astrophysics Data System (ADS)

    Landsea, C. W.

    2001-12-01

    This presentation reports on the second year's work of a three-year project to re-analyze the North Atlantic hurricane database (HURDAT). The original database of six-hourly positions and intensities was put together in the 1960s in support of the Apollo space program to help provide statistical track forecast guidance. In the intervening years, this database - which is now freely and easily accessible on the Internet from the National Hurricane Center's (NHC's) Web page - has been utilized for a wide variety of uses: climatic change studies, seasonal forecasting, risk assessment for county emergency managers, analysis of potential losses for insurance and business interests, intensity forecasting techniques, and verification of official and various model predictions of track and intensity. Unfortunately, HURDAT was not designed with all of these uses in mind when it was first put together, and not all of them may be appropriate given its original motivation. One problem with HURDAT is that there are numerous systematic as well as some random errors in the database which need correction. Additionally, analysis techniques have changed over the years at NHC as our understanding of tropical cyclones has developed, leading to biases in the historical database that have not been addressed. Another difficulty in applying the hurricane database to studies concerned with landfalling events is the lack of exact location, time, and intensity at hurricane landfall. Finally, recent efforts to uncover undocumented historical hurricanes in the late 1800s and early 1900s, led by Jose Fernandez-Partagas, have greatly increased our knowledge of these past events, which are not yet incorporated into the HURDAT database. Because of all of these issues, a re-analysis of the Atlantic hurricane database is being attempted that will be completed in three years. As part of the re-analysis, three files will be made available: (1) the revised Atlantic HURDAT (with six-hourly intensities and positions); (2) a HURDAT meta-file, a text file with detailed information about each suggested change proposed in the revised HURDAT; and (3) a "center fix" file, composed of actual observations of tropical cyclone positions and intensity estimates from the following platforms: aircraft, satellite, radar, and synoptic. All changes made to HURDAT will be approved by an NHC committee, as this database is officially maintained by the NHC. At the conference, results will be shown, including a revised climatology of U.S. hurricane strikes back to 1851. http://www.aoml.noaa.gov/hrd/hurdat/index.html
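
    A simplified reader for six-hourly best-track records gives a feel for how HURDAT-style data are used. The two sample lines below follow the HURDAT2-style comma-separated layout; treat this parser as an illustration under that assumption, not an official reader:

      # Simplified parser for HURDAT2-style best-track text.
      SAMPLE = """\
      AL031899, UNNAMED, 2,
      18990803, 0000,  , HU, 21.0N,  56.2W,  85, -999,
      18990803, 0600,  , HU, 21.4N,  57.1W,  90, -999,
      """

      def parse_hurdat2(text):
          storms, current = {}, None
          for line in text.strip().splitlines():
              parts = [p.strip() for p in line.split(",")]
              if parts[0].startswith(("AL", "EP")):       # storm header line
                  current = parts[0]
                  storms[current] = []
              else:                                       # six-hourly fix
                  storms[current].append({
                      "date": parts[0], "time": parts[1], "status": parts[3],
                      "lat": parts[4], "lon": parts[5],
                      "max_wind_kt": int(parts[6]),
                  })
          return storms

      for storm_id, fixes in parse_hurdat2(SAMPLE).items():
          peak = max(f["max_wind_kt"] for f in fixes)
          print(storm_id, "peak intensity:", peak, "kt")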

  4. Addressing fundamental architectural challenges of an activity-based intelligence and advanced analytics (ABIAA) system

    NASA Astrophysics Data System (ADS)

    Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.

    2015-06-01

    The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.

  5. GeoInt: the first macroseismic intensity database for the Republic of Georgia

    NASA Astrophysics Data System (ADS)

    Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.

    2018-05-01

    Our work is intended to present the new macroseismic intensity database for the Republic of Georgia—hereby named GeoInt—which includes earthquakes from the historical (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and 3944 related intensity data points (IDPs) for 1509 different localities, reported on the Medvedev-Sponheuer-Karnik (MSK) scale. Regarding the earthquakes, the surface-wave magnitude Ms is in the 3.3-7 range and the depth is in the 2-36 km range. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in a N-S direction and from 37.324° E to 48.500° E in an E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised each individual IDP and have reevaluated and homogenized intensity values to the MSK scale. Of the whole set of 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 belong to the instrumental era (post-1900). Of the 3596 instrumental-era IDPs, 105 are brand new (3%), whereas the intensity values for 804 IDPs have been reevaluated (22%); for 2687 IDPs (75%), intensities have been confirmed from previous interpretations. We introduce this database as a key input for further improvements in seismic hazard modeling and seismic risk calculation for this region, based on macroseismic intensity; we report all 111 earthquakes with available macroseismic information. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.

  6. GeoInt: the first macroseismic intensity database for the Republic of Georgia

    NASA Astrophysics Data System (ADS)

    Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.

    2018-01-01

    Our work is intended to present the new macroseismic intensity database for the Republic of Georgia—hereby named GeoInt—which includes earthquakes from the historical (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and 3944 related intensity data points (IDPs) for 1509 different localities, reported on the Medvedev-Sponheuer-Karnik (MSK) scale. Regarding the earthquakes, the surface-wave magnitude Ms is in the 3.3-7 range and the depth is in the 2-36 km range. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in a N-S direction and from 37.324° E to 48.500° E in an E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised each individual IDP and have reevaluated and homogenized intensity values to the MSK scale. Of the whole set of 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 belong to the instrumental era (post-1900). Of the 3596 instrumental-era IDPs, 105 are brand new (3%), whereas the intensity values for 804 IDPs have been reevaluated (22%); for 2687 IDPs (75%), intensities have been confirmed from previous interpretations. We introduce this database as a key input for further improvements in seismic hazard modeling and seismic risk calculation for this region, based on macroseismic intensity; we report all 111 earthquakes with available macroseismic information. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.

  7. Patent Documents as a Resource for Studies and Education in Geophysics - An Approach.

    NASA Astrophysics Data System (ADS)

    Wollny, K. G.

    2016-12-01

    Patents are a highly neglected source of information in geophysics, although they supply a wealth of technical and historically relevant data and might be an important asset for researchers and students. The technical drawings and descriptions in patent documents provide insight into the personal work of a researcher or a scientific group and give detailed technical background information, show interdisciplinary solutions for similar problems, help to learn about inventions too advanced for their time but perhaps useful now, and to explore the historical background and timelines of inventions and their inventors. It will be shown how to get access to patent documents and how to use them for research and education purposes. Exemplary inventions by well-known geoscientists or scientists in related fields will be presented to illustrate the usefulness of patent documents. The data pool used is the International Patent Classification (IPC) class G01V that the United Nations' World Intellectual Property Organisation (WIPO) has set up mainly for inventions with key aspects in geophysics. This class contains approximately 235,000 patent documents (July 2016) for methods, apparatuses or scientific instruments developed during scientific projects or by geophysical companies. The patent documents can be accessed via patent databases. The most important patent databases are free to use, their search functionality is self-explanatory, and the amount of information that can be extracted is enormous. For example, more than 90 million multilingual patent documents are currently available online (July 2016) in the DEPATIS database of the German Patent and Trade Mark Office or ESPACENET of the European Patent Office. To summarize, patent documents are a highly useful tool for educational and research purposes to strengthen students' and scientists' knowledge in a practically oriented geophysical field and to widen the horizon to adjacent technical areas. Last but not least, they also provide insight into historical aspects of geophysics and the persons working in that area.

  8. Challenges in Defining Tsunami Wave Height

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.

    2017-12-01

    The NOAA National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.
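
    The two competing definitions are easy to state in code. The sketch below compares the maximum peak with the maximum amplitude (taken here, for simplicity, as half the overall crest-to-trough range) on a synthetic de-tided water-level series; it is illustrative, not NCEI's processing code:

      # Comparing the two wave-height definitions on a synthetic,
      # de-tided water-level series (metres).
      import numpy as np

      t = np.linspace(0, 6 * 3600, 2000)                            # six hours, in seconds
      eta = 0.8 * np.sin(2 * np.pi * t / 1800) * np.exp(-t / 7200)  # decaying tsunami signal

      max_peak = eta.max()                               # maximum crest height above baseline
      max_amplitude = (eta.max() - eta.min()) / 2.0      # half peak-to-trough

      print(f"maximum peak:      {max_peak:.2f} m")
      print(f"maximum amplitude: {max_amplitude:.2f} m")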

  9. Challenges in Defining Tsunami Wave Heights

    NASA Astrophysics Data System (ADS)

    Dunbar, Paula; Mungov, George; Sweeney, Aaron; Stroker, Kelly; Arcos, Nicolas

    2017-08-01

    The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile, generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 coastal tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height for each tide gauge and deep-ocean buoy, NCEI will consider adding an additional field for the maximum peak measurement.

  10. Online learning algorithm for time series forecasting suitable for low cost wireless sensor networks nodes.

    PubMed

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-04-21

    Time series forecasting is an important predictive methodology which can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits an improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home and thus better energy efficiency. To this end, the paper describes how to implement an Artificial Neural Network (ANN) algorithm in a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. The model is trained with each new data point that arrives at the system, without saving enormous quantities of data to build the usual historical database, i.e., without previous knowledge. To validate the approach, a simulation study against a Bayesian baseline model was carried out, comparing performance and accuracy on a database from a real application. The core of the paper is a new algorithm, based on BP, which is described in detail; the challenge was to implement a computationally demanding algorithm on a simple architecture with very few hardware resources.
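
    A minimal sketch of the per-sample on-line training idea, updating a tiny network with each new observation and keeping no historical database, might look as follows; the network size, learning rate and simulated temperature stream are all invented for illustration, not the paper's algorithm:

      # Per-sample on-line backpropagation: the network is updated with
      # each new observation; no history is stored. Illustrative sizes.
      import numpy as np

      rng = np.random.default_rng(0)
      n_in, n_hid, lr = 4, 6, 0.01
      W1 = rng.normal(0, 0.5, (n_hid, n_in)); b1 = np.zeros(n_hid)
      W2 = rng.normal(0, 0.5, n_hid);         b2 = 0.0

      def online_update(x, y):
          """One forward/backward pass for a single (x, y) sample."""
          global W1, b1, W2, b2
          h = np.tanh(W1 @ x + b1)            # hidden activations
          y_hat = W2 @ h + b2                 # linear output
          err = y_hat - y
          # Gradients of 0.5 * err**2 for this single sample.
          gW2 = err * h
          gb2 = err
          gh  = err * W2 * (1 - h**2)
          W2 -= lr * gW2; b2 -= lr * gb2
          W1 -= lr * np.outer(gh, x); b1 -= lr * gh
          return err**2

      # Simulated stream: predict the next temperature from the last four.
      temps = 20 + np.cumsum(rng.normal(0, 0.1, 5000))
      for i in range(4, len(temps)):
          online_update(temps[i-4:i] - 20.0, temps[i] - 20.0)
      print("final squared error:", online_update(temps[-5:-1] - 20.0, temps[-1] - 20.0))

    The memory footprint is constant (just the weights), which is what makes this style of learning feasible on a microcontroller-class device.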

  11. Online Learning Algorithm for Time Series Forecasting Suitable for Low Cost Wireless Sensor Networks Nodes

    PubMed Central

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-01-01

    Time series forecasting is an important predictive methodology which can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits an improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home and thus better energy efficiency. To this end, the paper describes how to implement an Artificial Neural Network (ANN) algorithm in a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. The model is trained with each new data point that arrives at the system, without saving enormous quantities of data to build the usual historical database, i.e., without previous knowledge. To validate the approach, a simulation study against a Bayesian baseline model was carried out, comparing performance and accuracy on a database from a real application. The core of the paper is a new algorithm, based on BP, which is described in detail; the challenge was to implement a computationally demanding algorithm on a simple architecture with very few hardware resources. PMID:25905698

  12. Nutrient, suspended-sediment, and total suspended-solids data for surface water in the Great Salt Lake basins study unit, Utah, Idaho, and Wyoming, 1980-95

    USGS Publications Warehouse

    Hadley, Heidi K.

    2000-01-01

    Selected nitrogen and phosphorus (nutrient), suspended-sediment and total suspended-solids surface-water data were compiled from January 1980 through December 1995 within the Great Salt Lake Basins National Water-Quality Assessment study unit, which extends from southeastern Idaho to west-central Utah and from Great Salt Lake to the Wasatch and western Uinta Mountains. The data were retrieved from the U.S. Geological Survey National Water Information System and the State of Utah, Department of Environmental Quality, Division of Water Quality database. The Division of Water Quality database includes data that are submitted to the U.S. Environmental Protection Agency STOrage and RETrieval system. Water-quality data included in this report were selected for surface-water sites (rivers, streams, and canals) that had three or more nutrient, suspended-sediment, or total suspended-solids analyses. Also, 33 percent or more of the measurements at a site had to include discharge, and, for non-U.S. Geological Survey sites, there had to be 2 or more years of data. Ancillary data for parameters such as water temperature, pH, specific conductance, streamflow (discharge), dissolved oxygen, biochemical oxygen demand, alkalinity, and turbidity also were compiled, as available. The compiled nutrient database contains 13,511 samples from 191 selected sites. The compiled suspended-sediment and total suspended-solids database contains 11,642 samples from 142 selected sites. For the nutrient database, the median (50th percentile) sample period for individual sites is 6 years, and the 75th percentile is 14 years. The median number of samples per site is 52 and the 75th percentile is 110 samples. For the suspended-sediment and total suspended-solids database, the median sample period for individual sites is 9 years, and the 75th percentile is 14 years. The median number of samples per site is 76 and the 75th percentile is 120 samples. The compiled historical data are being used in the basinwide sampling strategy to characterize the broad-scale geographic and seasonal water-quality conditions in relation to major contaminant sources and background conditions. Data for this report are stored on a compact disc.
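
    The site-selection criteria described above translate naturally into a filtering step. The sketch below applies them with pandas to a hypothetical sample table; the column names are invented, and the two-years-of-data rule is interpreted here as a span of at least two years:

      # Sketch of the site-selection criteria: >=3 analyses per site,
      # >=33% of measurements with discharge, and, for non-USGS sites,
      # at least two years of data. Columns are invented for illustration.
      import pandas as pd

      samples = pd.DataFrame({
          "site_id":       ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
          "agency":        ["USGS"]*3 + ["STATE"]*2 + ["STATE"]*4,
          "year":          [1985, 1986, 1990, 1992, 1992, 1988, 1989, 1991, 1994],
          "has_discharge": [True, True, False, True, False, True, True, True, False],
      })

      def keep_site(g):
          if len(g) < 3:                                   # 3 or more analyses
              return False
          if g["has_discharge"].mean() < 0.33:             # >=33% with discharge
              return False
          if g["agency"].iat[0] != "USGS":                 # non-USGS: >=2 years of data
              return g["year"].max() - g["year"].min() >= 2
          return True

      selected = samples.groupby("site_id").filter(keep_site)
      print(sorted(selected["site_id"].unique()))          # -> ['A', 'C']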

  13. Analysis and selection of magnitude relations for the Working Group on Utah Earthquake Probabilities

    USGS Publications Warehouse

    Duross, Christopher; Olig, Susan; Schwartz, David

    2015-01-01

    Prior to calculating time-independent and time-dependent earthquake probabilities for faults in the Wasatch Front region, the Working Group on Utah Earthquake Probabilities (WGUEP) updated a seismic-source model for the region (Wong and others, 2014) and evaluated 19 historical regressions on earthquake magnitude (M). These regressions relate M to fault parameters for historical surface-faulting earthquakes, including linear fault length (e.g., surface-rupture length [SRL] or segment length), average displacement, maximum displacement, rupture area, seismic moment (Mo), and slip rate. These regressions show that significant epistemic uncertainties complicate the determination of characteristic magnitude for fault sources in the Basin and Range Province (BRP). For example, we found that M estimates (as a function of SRL) span about 0.3-0.4 units owing to differences in the fault parameter used; the age, quality, and size of historical earthquake databases; and the fault type and region considered.
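
    As one commonly cited example of such a regression, the Wells and Coppersmith (1994) relations estimate M from SRL. The sketch below contrasts their all-fault-type coefficients with their normal-fault coefficients to show how the choice of regression alone shifts M; the fuller 0.3-0.4-unit spread reported above also reflects differing fault parameters, databases, and regions:

      # Magnitude from surface-rupture length, Wells and Coppersmith (1994):
      # all fault types: M = 5.08 + 1.16*log10(SRL);
      # normal faults:   M = 4.86 + 1.32*log10(SRL).
      import math

      def m_from_srl(srl_km, a, b):
          return a + b * math.log10(srl_km)

      for srl in (20, 40, 60):   # surface-rupture lengths in km
          m_all    = m_from_srl(srl, 5.08, 1.16)
          m_normal = m_from_srl(srl, 4.86, 1.32)
          print(f"SRL {srl:3d} km: M(all) = {m_all:.2f}, M(normal) = {m_normal:.2f}, "
                f"difference = {abs(m_all - m_normal):.2f}")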

  14. Simulation of Water Levels and Salinity in the Rivers and Tidal Marshes in the Vicinity of the Savannah National Wildlife Refuge, Coastal South Carolina and Georgia

    USGS Publications Warehouse

    Conrads, Paul; Roehl, Edwin A.; Daamen, Ruby C.; Kitchens, Wiley M.

    2006-01-01

    The Savannah Harbor is one of the busiest ports on the East Coast of the United States and is located downstream from the Savannah National Wildlife Refuge, which is one of the Nation's largest freshwater tidal marshes. The Georgia Ports Authority and the U.S. Army Corps of Engineers funded hydrodynamic and ecological studies to evaluate the potential effects of a proposed deepening of Savannah Harbor as part of the Environmental Impact Statement. These studies included a three-dimensional (3D) model of the Savannah River estuary system, which was developed to simulate changes in water levels and salinity in the system in response to geometry changes as a result of the deepening of Savannah Harbor, and a marsh-succession model that predicts plant distribution in the tidal marshes in response to changes in the water-level and salinity conditions in the marsh. Beginning in May 2001, the U.S. Geological Survey entered into cooperative agreements with the Georgia Ports Authority to develop empirical models to simulate the water level and salinity of the rivers and tidal marshes in the vicinity of the Savannah National Wildlife Refuge and to link the 3D hydrodynamic river-estuary model and the marsh-succession model. For the development of these models, many different databases were created that describe the complexity and behaviors of the estuary. The U.S. Geological Survey has maintained a network of continuous streamflow, water-level, and specific-conductance (field measurement to compute salinity) river gages in the study area since the 1980s and a network of water-level and salinity marsh gages in the study area since 1999. The Georgia Ports Authority collected water-level and salinity data during summer 1997 and 1999 and collected continuous water-level and salinity data in the marsh and connecting tidal creeks from 1999 to 2002. Most of the databases comprise time series that differ by variable type, periods of record, measurement frequency, location, and reliability. Understanding freshwater inflows, tidal water levels, and specific conductance in the rivers and marshes is critical to enhancing the predictive capabilities of a successful marsh succession model. Data-mining techniques, including artificial neural network (ANN) models, were applied to address various needs of the ecology study and to integrate the riverine predictions from the 3D model to the marsh-succession model. ANN models were developed to simulate riverine water levels and specific conductance in the vicinity of the tidal marshes for the full range of historical conditions using data from the river gaging networks. ANN models were also developed to simulate the marsh water levels and pore-water salinities using data from the marsh gaging networks. Using the marsh ANN models, the continuous marsh network was hindcasted to be concurrent with the long-term riverine network. The hindcasted data allow ecologists to compute hydrologic parameters, such as hydroperiods and exposure frequency, to help analyze historical vegetation data. To integrate the 3D hydrodynamic model, the marsh-succession model, and various time-series databases, a decision support system (DSS) was developed to support the various needs of regulatory and scientific stakeholders. The DSS required the development of a spreadsheet application that integrates the database, 3D hydrodynamic model output, and ANN riverine and marsh models into a single package that is easy to use and can be readily disseminated.
The DSS allows users to evaluate water-level and salinity response for different hydrologic conditions. Savannah River streamflows can be controlled by the user as constant flow, a percentage of historical flows, a percentile daily flow hydrograph, or as a user-specified hydrograph. The DSS can also use output from the 3D model at stream gages near the Savannah National Wildlife Refuge to simulate the effects in the tidal marshes. The DSS is distributed with a two-dimensional (

  15. a 3d GIS Method Applied to Cataloging and Restoring: the Case of Aurelian Walls at Rome

    NASA Astrophysics Data System (ADS)

    Canciani, M.; Ceniccola, V.; Messi, M.; Saccone, M.; Zampilli, M.

    2013-07-01

    The project involves architecture, archaeology, restoration, graphic documentation and computer imaging. The objective is the development of a method for documenting an architectural feature, based on a three-dimensional model obtained through laser-scanning technologies and linked to a database developed in a GIS environment. The case study concerns a short section of Rome's Aurelian walls, including the Porta Latina. The city walls are Rome's largest single architectural monument, subject to continuous deterioration, modification and maintenance since their original construction beginning in 271 AD. The documentation system provides a flexible, precise and easily applied instrument for recording the full appearance, materials, stratification palimpsest and conservation status, in order to identify restoration criteria and intervention priorities, and to monitor and control the use and conservation of the walls over time. The project began with an analysis and documentation campaign integrating direct, traditional recording methods with indirect recording by topographic instruments and 3D laser scanning. These recording systems permitted development of a geographic information system based on three-dimensional modelling of separate, individual elements, linked to a database and related to the various stratigraphic horizons, the construction techniques, the component materials and their state of degradation. The investigations of the extant wall fabric were further compared to historic documentation, from both graphic and descriptive sources. The resulting model constitutes the core of the GIS system for this specific monument. The methodology is notable for its low cost, precision, practicality and thoroughness, and can be applied to the entire Aurelian wall and to other monuments.

  16. PDTCM: a systems pharmacology platform of traditional Chinese medicine for psoriasis.

    PubMed

    Wang, Dongmei; Gu, Jiangyong; Zhu, Wei; Luo, Fang; Chen, Lirong; Xu, Xiaojie; Lu, Chuanjian

    2017-12-01

    Psoriasis is a refractory skin disorder that usually requires lifetime control. Traditional Chinese medicine (TCM) is effective and safe for this disease. However, the cellular and molecular mechanisms of TCM remedies for psoriasis are still not fully understood. TCM contains numerous natural products, which have historically been invaluable as a resource of therapeutic agents. Yet there is no integrated information about active compounds of TCM for psoriasis. We used systems pharmacology methods to develop the Psoriasis Database of Traditional Chinese Medicine (PDTCM). The database covers a range of psoriasis-related information (formulas, TCM, compounds, target proteins, diseases and biomarkers), and an online platform was constructed around it. Results: PDTCM comprises 38 empirical therapeutic formulas, 34,373 compounds from 1,424 medicinal plants, 44 psoriasis-related proteins and 76 biomarkers from 111 related diseases. On this platform, users can screen active compounds for a psoriasis-related target and explore molecular mechanisms of TCM. Users can also download the retrieved structures and data with a defined value set. In addition, the platform helps users gain a better understanding of Chinese prescriptions in disease treatment. With its systems pharmacology-based data, PDTCM should become a valuable resource for TCM in psoriasis-related research. Key messages: The PDTCM platform comprises a great deal of data on TCM and psoriasis. On this platform, users can retrieve needed information with systems pharmacology methods, such as active-compound screening, target prediction and molecular-mechanism exploration. It is a tool for systematic psoriasis-related research on natural drugs.

  17. Use of national clinical databases for informing and for evaluating health care policies.

    PubMed

    Black, Nick; Tan, Stefanie

    2013-02-01

    Policy-makers and analysts could make use of national clinical databases either to inform or to evaluate meso-level (organisation and delivery of health care) and macro-level (national) policies. Reviewing the use of 15 of the best established databases in England, we identify and describe four published examples of each use. These show that policy-makers can either make use of the data itself or of research based on the database. For evaluating policies, the major advantages are the huge sample sizes available, the generalisability of the data, its immediate availability and historic information. The principal methodological challenges involve the need for risk adjustment and time-series analysis. Given their usefulness in the policy arena, there are several reasons why national clinical databases have not been used more, some due to a lack of 'push' by their custodians and some to the lack of 'pull' by policy-makers. Greater exploitation of these valuable resources would be facilitated by policy-makers' and custodians' increased awareness, minimisation of legal restrictions on data use, improvements in the quality of databases and a library of examples of applications to policy.

  18. Design and implementation of a twin-family database for behavior genetics and genomics studies.

    PubMed

    Boomsma, Dorret I; Willemsen, Gonneke; Vink, Jacqueline M; Bartels, Meike; Groot, Paul; Hottenga, Jouke Jan; van Beijsterveldt, C E M Toos; Stroet, Therese; van Dijk, Rob; Wertheim, Rien; Visser, Marco; van der Kleij, Frank

    2008-06-01

    In this article we describe the design and implementation of a database for extended twin families. The database does not focus on probands or on index twins, as this approach becomes problematic when larger multigenerational families are included, when more than one set of multiples is present within a family, or when families turn out to be part of a larger pedigree. Instead, we present an alternative approach that uses a highly flexible notion of persons and relations. The relations among the subjects in the database have a one-to-many structure, are user-definable and extendible and support arbitrarily complicated pedigrees. Some additional characteristics of the database are highlighted, such as the storage of historical data, predefined expressions for advanced queries, output facilities for individuals and relations among individuals and an easy-to-use multi-step wizard for contacting participants. This solution presents a flexible approach to accommodate pedigrees of arbitrary size, multiple biological and nonbiological relationships among participants and dynamic changes in these relations that occur over time, which can be implemented for any type of multigenerational family study.
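
    The person-and-relation design described above can be sketched as a small relational schema; the tables and columns below are hypothetical, intended only to show user-definable, one-to-many typed relations with validity dates for historical data:

      # Minimal sketch of a person/relation schema: no index twin,
      # typed relations, validity dates for historical changes.
      # Table and column names are hypothetical.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE person (
          person_id INTEGER PRIMARY KEY,
          name      TEXT NOT NULL
      );
      CREATE TABLE relation (
          relation_id INTEGER PRIMARY KEY,
          from_person INTEGER NOT NULL REFERENCES person(person_id),
          to_person   INTEGER NOT NULL REFERENCES person(person_id),
          rel_type    TEXT NOT NULL,   -- user-definable: 'mother_of', 'twin_of', ...
          valid_from  TEXT NOT NULL,   -- historical data: when the relation held
          valid_to    TEXT             -- NULL while still current
      );
      """)
      con.executemany("INSERT INTO person VALUES (?, ?)",
                      [(1, "mother"), (2, "twin A"), (3, "twin B")])
      con.executemany("INSERT INTO relation VALUES (?, ?, ?, ?, ?, ?)",
                      [(1, 1, 2, "mother_of", "1990-04-01", None),
                       (2, 1, 3, "mother_of", "1990-04-01", None),
                       (3, 2, 3, "twin_of",   "1990-04-01", None)])

      # Query: everyone currently related to twin A, in either direction.
      rows = con.execute("""
      SELECT p.name, r.rel_type FROM relation r
      JOIN person p ON p.person_id = CASE WHEN r.from_person = 2
                                          THEN r.to_person ELSE r.from_person END
      WHERE 2 IN (r.from_person, r.to_person) AND r.valid_to IS NULL
      """).fetchall()
      print(rows)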

  19. Integrated Historical Tsunami Event and Deposit Database

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; McCullough, H. L.

    2010-12-01

    The National Geophysical Data Center (NGDC) provides integrated access to historical tsunami event, deposit, and proxy data. The NGDC tsunami archive initially listed tsunami sources and locations with observed tsunami effects. Tsunami frequency and intensity are important for understanding tsunami hazards. Unfortunately, tsunami recurrence intervals often exceed the historic record. As a result, NGDC expanded the archive to include the Global Tsunami Deposits Database (GTD_DB). Tsunami deposits are the physical evidence left behind when a tsunami impacts a shoreline or affects submarine sediments. Proxies include co-seismic subsidence, turbidite deposits, changes in biota following an influx of marine water in a freshwater environment, etc. By adding past tsunami data inferred from the geologic record, the GTD_DB extends the record of tsunamis backward in time. Although the best methods for identifying tsunami deposits and proxies in the geologic record remain under discussion, developing an overall picture of where tsunamis have affected coasts, calculating recurrence intervals, and approximating runup height and inundation distance provides a better estimate of a region’s true tsunami hazard. Tsunami deposit and proxy descriptions in the GTD_DB were compiled from published data found in journal articles, conference proceedings, theses, books, conference abstracts, posters, web sites, etc. The database now includes over 1,200 descriptions compiled from over 1,100 citations. Each record in the GTD_DB is linked to its bibliographic citation where more information on the deposit can be found. The GTD_DB includes data for over 50 variables such as: event description (e.g., 2010 Chile Tsunami), geologic time period, year, deposit location name, latitude, longitude, country, associated body of water, setting during the event (e.g., beach, lake, river, deep sea), upper and lower contacts, underlying and overlying material, etc. If known, the tsunami source mechanism (e.g., earthquake, landslide, volcanic eruption, asteroid impact) is also specified. Observations (grain size, sedimentary structure, bed thickness, number of layers, etc.) are stored along with the conclusions drawn from the evidence by the author (wave height, flow depth, flow velocity, number of waves, etc.). Geologic time periods in the GTD_DB range from Precambrian to Quaternary, but the majority (70%) are from the Quaternary period. This period includes events such as: the 2004 Indian Ocean tsunami, the Cascadia subduction zone earthquakes and tsunamis, the 1755 Lisbon tsunami, the A.D. 79 Vesuvius tsunami, the 3500 BP Santorini caldera collapse and tsunami, and the 7000 BP Storegga landslide-generated tsunami. Prior to the Quaternary period, the majority of the paleotsunamis are due to impact events such as: the Tertiary Chesapeake Bay Bolide, Cretaceous-Tertiary (K/T) Boundary, Cretaceous Manson, and Devonian Alamo. The tsunami deposits are integrated with the historical tsunami event database where applicable. For example, users can search for articles describing deposits related to the 1755 Lisbon tsunami and view those records, as well as link to the related historic event record. The data and information may be viewed using tools designed to extract and display data (selection forms, Web Map Services, and Web Feature Services).

  20. The visual-landscape analysis during the integration of high-rise buildings within the historic urban environment

    NASA Astrophysics Data System (ADS)

    Akristiniy, Vera A.; Dikova, Elena A.

    2018-03-01

    The article is devoted to one type of urban planning study: visual-landscape analysis during the integration of high-rise buildings within the historic urban environment, for the purposes of pre-design and design studies aimed at preserving the historical urban environment and realising the reconstruction potential of the area. The article defines and systematizes the stages and methods of conducting visual-landscape analysis, taking into account the influence of high-rise buildings on objects of cultural heritage and valuable historical buildings of the city. Practical application of visual-landscape analysis makes it possible to assess how a hypothetical placement of high-rise buildings would affect the perception of the historically developed environment, and to determine optimal building parameters. The contents of the main stages of visual-landscape analysis are described, together with their key aspects concerning the construction of predicted zones of visibility of significant, historically valuable urban development objects and of hypothetically planned high-rise buildings. The resulting data support the successive development of the planning and typological structure of the city territory and the preservation of the compositional influence of valuable fragments of the historical environment within the structure of the urban landscape. On this basis, an information database is formed to determine the permissible urban development parameters of high-rise buildings that preserve the compositional integrity of the urban area.

  1. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping, and historical records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of an event. Visual data are increasing rapidly as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historical databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  2. Documentation and Cultural Heritage Inventories - Case of the Historic City of Ahmadabad

    NASA Astrophysics Data System (ADS)

    Shah, K.

    2015-08-01

    Located in the western Indian state of Gujarat, the historic city of Ahmadabad is renowned for the unparalleled richness of its monumental architecture, traditional house form, community-based settlement patterns, city structure, crafts and mercantile culture. This paper describes the process followed for the documentation and development of comprehensive Heritage Inventories for the historic city, with the aim of illustrating the Outstanding Universal Values of its Architectural and Urban Heritage. The exercise, undertaken between 2011 and 2014 as part of the preparation of the world heritage nomination dossier, included thorough archival research, field surveys, mapping and preparation of inventories using a combination of traditional data procurement and presentation tools as well as the creation of an advanced digital database using GIS. The major challenges encountered were: the need to adapt the documentation methodology and survey formats to field conditions, a changing and ever-widening scope of work, corresponding changes in the time frame, and the management of large quantities of data generated during the process, along with difficulties in correlating existing databases procured from the local authority in varying formats. While the end result satisfied the primary aim, the full potential of the Heritage Inventory as a protection and management tool will only be realised after its acceptance as the statutory list and its integration within the larger urban development plan to guide conservation, development and management strategy for the city. The rather detailed description of the evolution of the documentation process and the complexities involved is presented to explain the relevance of the methods used in Ahmadabad and to guide similar future efforts in the field.

  3. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data.

    PubMed

    Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.

  4. StatsDB: platform-agnostic storage and understanding of next generation sequencing run metrics

    PubMed Central

    Ramirez-Gonzalez, Ricardo H.; Leggett, Richard M.; Waite, Darren; Thanki, Anil; Drou, Nizar; Caccamo, Mario; Davey, Robert

    2014-01-01

    Modern sequencing platforms generate enormous quantities of data in ever-decreasing amounts of time. Additionally, techniques such as multiplex sequencing allow one run to contain hundreds of different samples. With such data comes a significant challenge to understand its quality and to understand how the quality and yield are changing across instruments and over time. As well as the desire to understand historical data, sequencing centres often have a duty to provide clear summaries of individual run performance to collaborators or customers. We present StatsDB, an open-source software package for storage and analysis of next generation sequencing run metrics. The system has been designed for incorporation into a primary analysis pipeline, either at the programmatic level or via integration into existing user interfaces. Statistics are stored in an SQL database and APIs provide the ability to store and access the data while abstracting the underlying database design. This abstraction allows simpler, wider querying across multiple fields than is possible by the manual steps and calculation required to dissect individual reports, e.g. "provide metrics about nucleotide bias in libraries using adaptor barcode X, across all runs on sequencer A, within the last month". The software is supplied with modules for storage of statistics from FastQC, a commonly used tool for analysis of sequence reads, but the open nature of the database schema means it can be easily adapted to other tools. Currently at The Genome Analysis Centre (TGAC), reports are accessed through our LIMS system or through a standalone GUI tool, but the API and supplied examples make it easy to develop custom reports and to interface with other packages. PMID:24627795
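
    The kind of cross-run query quoted above maps naturally onto SQL. The sketch below uses a deliberately simplified schema with invented table and column names (the real StatsDB schema differs):

      # Sketch of a cross-run metrics query against a simplified,
      # invented schema; not the actual StatsDB schema or API.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.executescript("""
      CREATE TABLE run     (run_id INTEGER PRIMARY KEY, instrument TEXT, run_date TEXT);
      CREATE TABLE library (lib_id INTEGER PRIMARY KEY, run_id INTEGER, barcode TEXT);
      CREATE TABLE metric  (lib_id INTEGER, name TEXT, position INTEGER, value REAL);
      """)
      con.executemany("INSERT INTO run VALUES (?,?,?)",
                      [(1, "sequencer_A", "2014-02-20"), (2, "sequencer_B", "2014-02-21")])
      con.executemany("INSERT INTO library VALUES (?,?,?)",
                      [(10, 1, "ACGT"), (11, 2, "ACGT")])
      con.executemany("INSERT INTO metric VALUES (?,?,?,?)",
                      [(10, "gc_content", 1, 0.52), (10, "gc_content", 2, 0.49),
                       (11, "gc_content", 1, 0.55)])

      # "Nucleotide bias for barcode ACGT, on sequencer_A, since 2014-02-01."
      rows = con.execute("""
      SELECT m.position, AVG(m.value)
      FROM metric m
      JOIN library l ON l.lib_id = m.lib_id
      JOIN run r     ON r.run_id = l.run_id
      WHERE m.name = 'gc_content' AND l.barcode = 'ACGT'
        AND r.instrument = 'sequencer_A' AND r.run_date >= '2014-02-01'
      GROUP BY m.position
      """).fetchall()
      print(rows)   # -> [(1, 0.52), (2, 0.49)]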

  5. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data

    PubMed Central

    Vaccarino, Anthony L.; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R.; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G.; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F. Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M.; Stuss, Donald T.; Theriault, Elizabeth; Evans, Kenneth R.

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute’s “Brain-CODE” is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care. PMID:29875648

  6. A framework for capturing clinical data sets from computerized sources.

    PubMed

    McDonald, C J; Overhage, J M; Dexter, P; Takesue, B Y; Dwyer, D M

    1997-10-15

    The pressure to improve health care and provide better care at a lower cost has generated the need for efficient capture of clinical data. Many data sets are now being defined to analyze health care. Historically, review and research organizations have simply determined what data they wanted to collect, developed forms, and then gathered the information through chart review without regard to what is already available institutionally in computerized databases. Today, much electronic patient information is available in operational data systems (for example, laboratory systems, pharmacy systems, and surgical scheduling systems) and is accessible by agencies and organizations through standards for messages, codes, and encrypted electronic mail. Such agencies and organizations should define the elements of their data sets in terms of standardized operational data, and data producers should fully adopt these code and message standards. The Health Plan Employer Data and Information Set and the Council of State and Territorial Epidemiologists in collaboration with the Centers for Disease Control and Prevention and the Association of State and Territorial Public Health Laboratory Directors provide examples of how this can be done.
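    As a minimal sketch of defining data-set elements in terms of standardized operational data, the following Python fragment resolves one review element from a feed of coded lab messages. The code values are hypothetical placeholders, not real LOINC codes, and the message shape is invented for illustration.

      # Sketch: a review organization defines its data-set elements by
      # standardized codes, then fills them from operational lab messages.
      # Code values below are hypothetical placeholders, not real LOINC codes.
      DATA_SET_DEFINITION = {
          "serum_glucose": {"code_system": "LOINC", "codes": {"XXXX-1"}},
          "hba1c":         {"code_system": "LOINC", "codes": {"XXXX-2"}},
      }

      lab_feed = [  # messages as they might arrive from a lab system
          {"patient": "p01", "code_system": "LOINC", "code": "XXXX-1", "value": 5.4},
          {"patient": "p01", "code_system": "LOINC", "code": "XXXX-9", "value": 1.0},
      ]

      def extract(element, feed):
          """Pull values for one data-set element from operational messages."""
          spec = DATA_SET_DEFINITION[element]
          return [m for m in feed
                  if m["code_system"] == spec["code_system"]
                  and m["code"] in spec["codes"]]

      print(extract("serum_glucose", lab_feed))   # -> one matching message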

  7. Value added data archiving

    NASA Technical Reports Server (NTRS)

    Berard, Peter R.

    1993-01-01

    Researchers in the Molecular Sciences Research Center (MSRC) of Pacific Northwest Laboratory (PNL) currently generate massive amounts of scientific data. The amount of data that will need to be managed by the turn of the century is expected to increase significantly. Automated tools that support the management, maintenance, and sharing of this data are minimal. Researchers typically manage their own data by physically moving datasets to and from long term storage devices and recording a dataset's historical information in a laboratory notebook. Even though it is not the most efficient use of resources, researchers have tolerated the process. The solution to this problem will evolve over the next three years in three phases. PNL plans to add sophistication to existing multilevel file system (MLFS) software by integrating it with an object database management system (ODBMS). The first phase in the evolution is currently underway. A prototype system of limited scale is being used to gather information that will feed into the next two phases. This paper describes the prototype system, identifies the successes and problems/complications experienced to date, and outlines PNL's long term goals and objectives in providing a permanent solution.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Lopez, A.; Sengupta, M.

    Typical Meteorological Year (TMY) data sets provide industry standard resource information for building designers and are commonly used by the solar industry to estimate photovoltaic and concentrating solar power system performance. Historically, TMY data sets were only available for certain station locations, but current TMY data sets are available on the same grid as the National Solar Radiation Database data and are referred to as the gridded TMY. In this report, a comparison of TMY, typical direct normal irradiance year (TDY), and typical global horizontal irradiance year (TGY) data sets was performed to better understand the impact of ancillary weather variables upon them. These analyses identified geographical areas of high and low temporal and spatial variability, thereby providing insight into the representativeness of a particular TMY data set for use in renewable energy as well as other applications.

  9. Initial Results of Aperture Area Comparisons for Exo-Atmospheric Total Solar Irradiance Measurements

    NASA Technical Reports Server (NTRS)

    Johnson, B. Carol; Litorja, Maritoni; Fowler, Joel B.; Butler, James J.

    2009-01-01

    In the measurement of exo-atmospheric total solar irradiance (TSI), instrument aperture area is a critical component in converting solar radiant flux to irradiance. In a May 2000 calibration workshop for the Total Irradiance Monitor (TIM) on the Earth Observing System (EOS) Solar Radiation and Climate Experiment (SORCE), the solar irradiance measurement community recommended that NASA and NIST coordinate an aperture area measurement comparison to quantify and validate aperture area uncertainties and their overall effect on TSI uncertainties. From May 2003 to February 2006, apertures from 4 institutions with links to the historical TSI database were measured by NIST and the results were compared to the aperture area determined by each institution. The initial results of these comparisons are presented and preliminary assessments of the participants' uncertainties are discussed.

  10. A flood geodatabase and its climatological applications: the case of Catalonia for the last century

    NASA Astrophysics Data System (ADS)

    Barnolas, M.; Llasat, M. C.

    2007-04-01

    Floods are the natural hazards that produce the highest number of casualties and material damage in the Western Mediterranean. An improvement in flood risk assessment and study of a possible increase in flooding occurrence are therefore needed. To carry out these tasks it is important to have at our disposal extensive knowledge on historical floods and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to carry out an analysis thereof using queries, overlays and calculations. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows the localities which are more likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding the existence of an increase in flooding occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.
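    A minimal sketch of the kind of temporal analysis described, in Python with invented event years: counting catalogued floods per decade to look for an increase in occurrence.

      from collections import Counter

      # Count catalogued flood events per decade; the years are invented.
      flood_years = [1907, 1923, 1940, 1962, 1971, 1977, 1982, 1987, 1988,
                     1994, 1996, 1999, 2000, 2002, 2005]

      per_decade = Counter((y // 10) * 10 for y in flood_years)
      for decade in sorted(per_decade):
          print(f"{decade}s: {'#' * per_decade[decade]} ({per_decade[decade]})")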

  11. The Game of Life: College Sports and Educational Values.

    ERIC Educational Resources Information Center

    Shulman, James L.; Bowen, William G.

    Drawing on historical research, data on alumni giving, information on budgetary spending on college athletics, and a database of 90,000 students from 30 selective colleges and universities in the 1950s, 1970s, and 1990s, this book demonstrates how athletics influences the class composition and campus ethos of selective schools. The chapters are:…

  12. ERIC Annual Report, 2002: Summarizing the Recent Accomplishments of the Educational Resources Information Center.

    ERIC Educational Resources Information Center

    Smarte, Lynn; Starcher, Heather

    This ERIC Annual Report presents both accomplishments and historical perspectives, as 2001 marks 35 years of ERIC service in delivering educational research and information to the public. This annual report describes the developments in the database of educational literature, the growing variety of ERIC Web-based products and user services, and…

  13. Incorporating a New Bioinformatics Component into Genetics at a Historically Black College: Outcomes and Lessons

    ERIC Educational Resources Information Center

    Holtzclaw, J. David; Eisen, Arri; Whitney, Erika M.; Penumetcha, Meera; Hoey, J. Joseph; Kimbro, K. Sean

    2006-01-01

    Many students at minority-serving institutions are underexposed to Internet resources such as the human genome project, PubMed, NCBI databases, and other Web-based technologies because of a lack of financial resources. To change this, we designed and implemented a new bioinformatics component to supplement the undergraduate Genetics course at…

  14. The Past in the Future: Problems and Potentials of Historical Reception Studies.

    ERIC Educational Resources Information Center

    Jensen, Klaus Bruhn

    1993-01-01

    Gives examples of how qualitative methodologies have been employed to study media reception in the present. Identifies some forms of evidence that can creatively fill the gaps in knowledge about media reception in the past. Argues that the field must develop databases documenting media reception, which may broaden the scope of audience research in…

  15. Uterine transplantation: Review in human research.

    PubMed

    Favre-Inhofer, A; Rafii, A; Carbonnel, M; Revaux, A; Ayoubi, J M

    2018-06-01

    Uterine transplantation is a treatment for absolute uterine infertility. In this review, we present the historical, medical, technical, psychological and ethical perspectives in human uterine transplantation research. We reviewed the PubMed database following PRISMA guidelines and added data presented by several research teams during the first international congress on uterine transplantation. Copyright © 2018. Published by Elsevier Masson SAS.

  16. Marriage and Family Counseling. Searchlight Plus: Relevant Resources in High Interest Areas. 57+.

    ERIC Educational Resources Information Center

    Okun, Barbara F.

    This information analysis paper is based on a computer search of the ERIC database from November 1966 through March 1984, and on pertinent outside resources related to marriage and family counseling. A brief historical perspective of the field of marriage and family counseling is provided, and the differences and overlaps between family,…

  17. Forest inventory, catastrophic events and historic geospatial assessments in the south

    Treesearch

    Dennis M. Jacobs

    2007-01-01

    Catastrophic events are a regular occurrence of disturbance to forestland in the Southern United States. Each major event affects the integrity of the forest inventory database developed and maintained by the Forest Inventory & Analysis Research Work Unit of the U.S. Department of Agriculture, Forest Service. Some of these major disturbances through the years have...

  18. Simulation of the effects of rainfall and groundwater use on historical lake water levels, groundwater levels, and spring flows in central Florida

    USGS Publications Warehouse

    O'Reilly, Andrew M.; Roehl, Edwin A.; Conrads, Paul; Daamen, Ruby C.; Petkewich, Matthew D.

    2014-01-01

    The urbanization of central Florida has progressed substantially in recent decades, and the total population in Lake, Orange, Osceola, Polk, and Seminole Counties more than quadrupled from 1960 to 2010. The Floridan aquifer system is the primary source of water for potable, industrial, and agricultural purposes in central Florida. Despite increases in groundwater withdrawals to meet the demand of population growth, recharge derived by infiltration of rainfall in the well-drained karst terrain of central Florida is the largest component of the long-term water balance of the Floridan aquifer system. To complement existing physics-based groundwater flow models, artificial neural networks and other data-mining techniques were used to simulate historical lake water level, groundwater level, and spring flow at sites throughout the area. Historical data were examined using descriptive statistics, cluster analysis, and other exploratory analysis techniques to assess their suitability for more intensive data-mining analysis. Linear trend analyses of meteorological data collected by the National Oceanic and Atmospheric Administration at 21 sites indicate 67 percent of sites exhibited upward trends in air temperature over at least a 45-year period of record, whereas 76 percent exhibited downward trends in rainfall over at least a 95-year period of record. Likewise, linear trend analyses of hydrologic response data, which have varied periods of record ranging in length from 10 to 79 years, indicate that water levels in lakes (307 sites) were about evenly split between upward and downward trends, whereas water levels in 69 percent of wells (out of 455 sites) and flows in 68 percent of springs (out of 19 sites) exhibited downward trends. Total groundwater use in the study area increased from about 250 million gallons per day (Mgal/d) in 1958 to about 590 Mgal/d in 1980 and remained relatively stable from 1981 to 2008, with a minimum of 559 Mgal/d in 1994 and a maximum of 773 Mgal/d in 2000. The change in groundwater-use trend in the early 1980s and the following period of relatively slight trend is attributable to the concomitant effects of increasing public-supply withdrawals and decreasing use of water by the phosphate industry and agriculture. On the basis of available historical data and exploratory analyses, empirical lake water-level, groundwater-level, and spring-flow models were developed for 22 lakes, 23 wells, and 6 springs. Input time series consisting of various frequencies and frequency-band components of daily rainfall (1942 to 2008) and monthly total groundwater use (1957 to 2008) resulted in hybrid signal-decomposition artificial neural network models. The final models explained much of the variability in observed hydrologic data, with 43 of the 51 sites having coefficients of determination exceeding 0.6, and the models matched the magnitude of the observed data reasonably well, such that models for 32 of the 51 sites had root-mean-square errors less than 10 percent of the measured range of the data. The Central Florida Artificial Neural Network Decision Support System was developed to integrate historical databases and the 102 site-specific artificial neural network models, model controls, and model output into a spreadsheet application with a graphical user interface that allows the user to simulate scenarios of interest. 
Overall, the data-mining analyses indicate that the Floridan aquifer system in central Florida is a highly conductive, dynamic, open system that is strongly influenced by external forcing. The most important external forcing appears to be rainfall, which explains much of the multiyear cyclic variability and long-term downward trends observed in lake water levels, groundwater levels, and spring flows. For most sites, groundwater use explains less of the observed variability in water levels and flows than rainfall. Relative groundwater-use impacts are greater during droughts, however, and long-term trends in water levels and flows were identified that are consistent with historical groundwater-use patterns. The sensitivity of the hydrologic system to rainfall is expected, owing to the well-drained karst terrain and relatively thin confinement of the Floridan aquifer system in much of central Florida. These characteristics facilitate the relatively rapid transmission of infiltrating water from rainfall to the water table and contribute to downward leakage of water to the Floridan aquifer system. The areally distributed nature of rainfall, as opposed to the site-specific nature of groundwater use, and the generally high transmissivity and low storativity properties of the semiconfined Floridan aquifer system contribute to the prevalence of water-level and flow patterns that mimic rainfall patterns. In general, the data-mining analyses demonstrate that the hydrologic system in central Florida is affected by groundwater use differently during wet periods, when little or no system storage is available (high water levels), compared to dry periods, when there is excess system storage (low water levels). Thus, by driving the overall behavior of the system, rainfall indirectly influences the degree to which groundwater use will effect persistent trends in water levels and flows, with groundwater-use impacts more prevalent during periods of low water levels and spring flows caused by low rainfall and less prevalent during periods of high water levels and spring flows caused by high rainfall. Differences in the magnitudes of rainfall and groundwater use during wet and dry periods also are important determinants of hydrologic response. An important implication of the data-mining analyses is that rainfall variability at subannual to multidecadal timescales must be considered in combination with groundwater use to provide robust system-response predictions that enhance sustainable resource management in an open karst aquifer system. The data-driven approach was limited, however, by the confounding effects of correlation between rainfall and groundwater use, the quality and completeness of the historical databases, and the spatial variations in groundwater use. The data-mining analyses indicate that available historical data when used alone do not contain sufficient information to definitively quantify the related individual effects of rainfall and groundwater use on hydrologic response. The knowledge gained from data-driven modeling and the results from physics-based modeling, when compared and used in combination, can yield a more comprehensive assessment and a more robust understanding of the hydrologic system than either of the approaches used separately.
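    The report's two acceptance criteria, the coefficient of determination and root-mean-square error expressed as a percentage of the measured range, are simple to compute. The following Python sketch shows both on invented water-level data; the functions and values are illustrative and are not taken from the study.

      import numpy as np

      def r_squared(observed, simulated):
          """Coefficient of determination: fraction of observed variance explained."""
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          ss_res = np.sum((observed - simulated) ** 2)
          ss_tot = np.sum((observed - observed.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      def rmse_percent_of_range(observed, simulated):
          """Root-mean-square error as a percentage of the observed range."""
          observed, simulated = np.asarray(observed), np.asarray(simulated)
          rmse = np.sqrt(np.mean((observed - simulated) ** 2))
          return 100.0 * rmse / (observed.max() - observed.min())

      # Invented daily lake levels (ft) and a model's simulation of them.
      obs = np.array([41.2, 41.0, 40.7, 40.9, 41.5, 42.1, 41.8])
      sim = np.array([41.1, 41.1, 40.8, 41.0, 41.3, 41.9, 41.9])
      print(r_squared(obs, sim), rmse_percent_of_range(obs, sim))
      # Thresholds used in the report: R^2 > 0.6, RMSE < 10% of range.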

  19. Landslide databases to compare regional repair and mitigation strategies of transportation infrastructure

    NASA Astrophysics Data System (ADS)

    Wohlers, Annika; Damm, Bodo

    2017-04-01

    Regional data for the Central German Uplands are extracted from the German landslide database in order to understand the complex interactions between landslide risks and public risk awareness with respect to transportation infrastructure. Most information within the database is gathered by means of archive studies from inventories of emergency agencies, state, press and web archives, company and department records, as well as scientific and (geo)technical literature. The information includes land use practices and repair and mitigation measures, with resultant costs, for the German road network as well as railroad and waterway networks. It therefore contains valuable information on historical and current landslide impacts and elements at risk, and provides an overview of spatiotemporal changes in social exposure and vulnerability to landslide hazards over the last 120 years. On a regional scale, the recorded infrastructure damage and the consequent repair or mitigation measures were categorized and classified according to relevant landslide types, processes and types of infrastructure. In a further step, the data of recent landslides are compared with historical and modern repair and mitigation measures and are correlated with socioeconomic concepts. As a result, it is possible to identify some complex interactions between landslide hazard, risk perception, and damage impact, including time lags and intensity thresholds. The data reveal distinct concepts of repairing or mitigating landslides on different types of transportation infrastructure, which are linked not only to higher construction efforts (e.g. embankments on railroads and channels) but also to changing levels of economic losses and risk perception. In addition, a shift over time from low-cost prevention measures such as the removal of loose rock and vegetation, rock blasting, and catch barriers towards expensive mitigation measures such as catch fences, soil anchoring and rock nailing can be noticed. This temporal shift is associated with a higher public hazard awareness towards landslides, which is at some sites linked to an apparent increase in landslide frequency and magnitude.

  20. Evolving health information technology and the timely availability of visit diagnoses from ambulatory visits: a natural experiment in an integrated delivery system.

    PubMed

    Bardach, Naomi S; Huang, Jie; Brand, Richard; Hsu, John

    2009-07-17

    Health information technology (HIT) may improve health care quality and outcomes, in part by making information available in a timelier manner. However, there are few studies documenting the changes in timely availability of data with the use of a sophisticated electronic medical record (EMR), nor a description of how the timely availability of data might differ with different types of EMRs. We hypothesized that timely availability of data would improve with use of increasingly sophisticated forms of HIT. We used an historical observation design (2004-2006) using electronic data from office visits in an integrated delivery system with three types of HIT: Basic, Intermediate, and Advanced. We calculated the monthly percentage of visits using the various types of HIT for entry of visit diagnoses into the delivery system's electronic database, and the time between the visit and the availability of the visit diagnoses in the database. In January 2004, when only Basic HIT was available, 10% of office visits had diagnoses entered on the same day as the visit and 90% within a week; 85% of office visits used paper forms for recording visit diagnoses, 16% used Basic at that time. By December 2006, 95% of all office visits had diagnoses available on the same day as the visit, when 98% of office visits used some form of HIT for entry of visit diagnoses (Advanced HIT for 67% of visits). Use of HIT systems is associated with dramatic increases in the timely availability of diagnostic information, though the effects may vary by sophistication of HIT system. Timely clinical data are critical for real-time population surveillance, and valuable for routine clinical care.
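    A minimal sketch of the timeliness metric described above, in Python with invented visit records: the share of visits whose diagnoses were available in the database on the day of the visit.

      from datetime import date

      # Each record pairs the visit date with the date the visit diagnosis
      # became available in the database. Records are invented.
      visits = [
          (date(2006, 12, 1), date(2006, 12, 1)),
          (date(2006, 12, 4), date(2006, 12, 4)),
          (date(2006, 12, 7), date(2006, 12, 9)),   # paper form, keyed in later
      ]

      same_day = sum(1 for visit, available in visits if available == visit)
      print(f"same-day availability: {100 * same_day / len(visits):.0f}%")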

  1. Combining history of medicine and library instruction: an innovative approach to teaching database searching to medical students.

    PubMed

    Timm, Donna F; Jones, Dee; Woodson, Deidra; Cyrus, John W

    2012-01-01

    Library faculty members at the Health Sciences Library at the LSU Health Shreveport campus offer a database searching class for third-year medical students during their surgery rotation. For a number of years, students completed "ten-minute clinical challenges," but the instructors decided to replace the clinical challenges with innovative exercises using The Edwin Smith Surgical Papyrus to emphasize concepts learned. The Surgical Papyrus is an online resource that is part of the National Library of Medicine's "Turning the Pages" digital initiative. In addition, vintage surgical instruments and historic books are displayed in the classroom to enhance the learning experience.

  2. Prolonged survival of patients with non-small-cell lung cancer with leptomeningeal carcinomatosis in the modern treatment era.

    PubMed

    Riess, Jonathan W; Nagpal, Seema; Iv, Michael; Zeineh, Michael; Gubens, Matthew A; Ramchandran, Kavitha; Neal, Joel W; Wakelee, Heather A

    2014-05-01

    Leptomeningeal carcinomatosis (LM) is a severe complication of non-small-cell lung cancer (NSCLC) historically associated with poor prognosis. New chemotherapeutic and targeted treatments could potentially affect the natural history of LM. Patients with a pathologic diagnosis of NSCLC with LM treated at Stanford between 2003 and 2011 were identified via institutional databases and medical records. LM was defined by cerebrospinal fluid (CSF) that was positive for malignant cells or by LM enhancement on magnetic resonance imaging with gadolinium contrast. Retrospective, landmark analyses were performed to estimate survival. Statistical analyses were performed using SAS Enterprise Guide, version 4.3. LM was identified in 30 patients. All cases were adenocarcinoma; 60% of patients had a known or suspected driver mutation. The mean age was 58 years. Of the 30 patients, 67% were women; 70% were nonsmokers; 27% initially presented with LM; 84% received systemic treatment at or after development of LM; and 53% of these patients received modern systemic therapy for their LM, defined as a regimen containing pemetrexed, bevacizumab, or a tyrosine kinase inhibitor. Mean overall survival after LM diagnosis was 6 months (95% CI, 3-12). Patients who received modern systemic therapy for LM had decreased hazard of death (hazard ratio [HR], 0.24; P = .007). In this retrospective, single-institution analysis, median survival with LM was higher compared with historical experience. Patients who received modern systemic therapy for their LM had particularly good outcomes. These data provide evidence for improving survival outcomes in the modern treatment era for this difficult-to-treat complication. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Comparison of Lamiaceae medicinal uses in eastern Morocco and eastern Andalusia and in Ibn al-Baytar's Compendium of Simple Medicaments (13th century CE).

    PubMed

    El-Gharbaoui, Asmae; Benítez, Guillermo; González-Tejero, M Reyes; Molero-Mesa, Joaquín; Merzouki, Abderrahmane

    2017-04-18

    Transmission of traditional knowledge over time and across culturally and historically related territories is an important topic in ethnopharmacology. Here, we contribute to this knowledge by analysing data on medicinal uses in two neighbouring areas of the Western Mediterranean in relation to a historical text that has been scarcely mentioned in historical studies despite its interest. This paper discusses the sharing of popular knowledge on the medicinal uses of plants between eastern Morocco and eastern Andalusia (Spain), focusing on one of the most useful plant families in the Mediterranean area: Lamiaceae. Moreover, we used the classical work of Ibn al-Baytar (13th century CE) The Compendium of Simple Medicaments and Foods as a basis to contrast the possible link of this information, analysing the influence of this historical text on the current popular tradition of medicinal plant use in both territories. For data collection, we performed ethnobotanical field research in the eastern part of Morocco, recording current medicinal uses for the Lamiaceae. In addition, we systematically reviewed the ethnobotanical literature from eastern Andalusia, developing a database. We investigated the possible historical link of the shared uses and included in this database the information from Ibn al-Baytar's Compendium. To compare the similarity and diversity of the data, we used Jaccard's similarity index. Our field work provided ethnobotanical information for 14 Lamiaceae species with 95 medicinal uses, serving to treat 13 different pathological groups. Of the total uses recorded in Morocco, 30.5% were shared by eastern Andalusia and found in Ibn al-Baytar's work. There was a higher similarity when comparing current uses of the geographically close territories of eastern Morocco and eastern Andalusia (64%) than for eastern Morocco and this historical text (43%). On the other hand, the overlap between current uses in eastern Andalusia and those recorded in the Compendium is lower, at 28%. The coincidence of the current ethnobotanical knowledge in the two territories is high for the Lamiaceae. The shared historical background, recent exchanges, information flow, and the influence of historical herbal texts have probably contributed to this coincidence. In this sense, there is a high plant-use overlap between Ibn al-Baytar's text and both territories: nearly half of the uses currently shared by eastern Morocco and eastern Andalusia were included in the Compendium and are related to this period of Islamic medicine, indicating a high level of preservation in the knowledge of plant usage. The study of 14 species of Lamiaceae suggests that this classical codex, which includes a high number of medicinal plants and uses, constitutes a valuable bibliographical source for comparing ancient and modern applications of plants. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  4. A web-based approach for electrocardiogram monitoring in the home.

    PubMed

    Magrabi, F; Lovell, N H; Celler, B G

    1999-05-01

    A Web-based electrocardiogram (ECG) monitoring service in which a longitudinal clinical record is used for management of patients is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components including the Web server, Web page, the specialised client side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data is sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report, along with reminders and suggestions for action, is sent to the doctor and patient by email.
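    A minimal sketch of the comparison step such a server-side agent might perform, in Python with invented heart-rate summaries and a simple three-sigma rule; the paper does not specify the agent's actual logic.

      from statistics import mean, stdev

      # Flag a newly uploaded measurement that departs from the patient's
      # historical baseline. Threshold and values are invented.
      historical_hr = [72, 75, 70, 74, 73, 76, 71]   # beats/min, past uploads
      new_hr = 93

      baseline, spread = mean(historical_hr), stdev(historical_hr)
      if abs(new_hr - baseline) > 3 * spread:
          report = (f"Heart rate {new_hr} bpm deviates from baseline "
                    f"{baseline:.1f} +/- {spread:.1f}; flagging for review.")
          print(report)   # a real agent would email this summary to the doctor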

  5. Representation and alignment of sung queries for music information retrieval

    NASA Astrophysics Data System (ADS)

    Adams, Norman H.; Wakefield, Gregory H.

    2005-09-01

    The pursuit of robust and rapid query-by-humming systems, which search melodic databases using sung queries, is a common theme in music information retrieval. The retrieval aspect of this database problem has received considerable attention, whereas the front-end processing of sung queries and the data structure to represent melodies has been based on musical intuition and historical momentum. The present work explores three time series representations for sung queries: a sequence of notes, a "smooth" pitch contour, and a sequence of pitch histograms. The performance of the three representations is compared using a collection of naturally sung queries. It is found that the most robust performance is achieved by the representation with highest dimension, the smooth pitch contour, but that this representation presents a formidable computational burden. For all three representations, it is necessary to align the query and target in order to achieve robust performance. The computational cost of the alignment is quadratic, hence it is necessary to keep the dimension small for rapid retrieval. Accordingly, iterative deepening is employed to achieve both robust performance and rapid retrieval. Finally, the conventional iterative framework is expanded to adapt the alignment constraints based on previous iterations, further expediting retrieval without degrading performance.
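    Alignment of a sung contour to a database melody is commonly done with dynamic time warping, whose cost is quadratic in the sequence lengths as noted above. The following sketch is a generic DTW in Python, not necessarily the paper's exact alignment scheme; the pitch values are invented.

      # Minimal dynamic time warping (DTW): align a sung-query pitch contour
      # to a database melody. Cost is O(n*m), which is why low-dimensional
      # representations and iterative deepening matter for retrieval speed.
      def dtw(query, target):
          n, m = len(query), len(target)
          INF = float("inf")
          cost = [[INF] * (m + 1) for _ in range(n + 1)]
          cost[0][0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  d = abs(query[i - 1] - target[j - 1])
                  cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                       cost[i][j - 1],      # deletion
                                       cost[i - 1][j - 1])  # match
          return cost[n][m]

      query  = [60.0, 60.5, 62.1, 64.0, 63.8]   # sung contour, MIDI pitch
      target = [60.0, 62.0, 64.0, 64.0]          # database melody
      print(dtw(query, target))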

  6. Update on Geothermal Direct-Use Installations in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckers, Koenraad J; Young, Katherine R; Snyder, Diana M.

    Direct-use of geothermal energy currently has limited penetration in the United States, with an estimated installed capacity of about 500 MWth, supplying on the order of 0.01% of the total annual U.S. heat demand (about 30 EJ). We see higher penetration levels in other countries such as Iceland (about 90%) and Hungary (2.5%). An updated database of geothermal direct-use systems in the U.S. has been compiled and analyzed, building upon the Oregon Institute of Technology (OIT) Geo-Heat Center direct-use database. Types of direct-use applications examined include hot springs resorts and pools, aquaculture farms, greenhouses, and district heating systems, among others; power-generating facilities and ground-source heat pumps were excluded. Where possible, the current operation status, open and close dates, well data, and other technical data were obtained for each entry. The database contains 545 installations, of which 407 are open, 108 are closed, and 30 have an unknown status. Spas are the most common type of installation, accounting for 50% of installations by number. Aquaculture installations (46 out of 407 open installations) account for the largest percentage (26%) of installed capacity in operation (129 MWth out of 501 MWth). Historical deployment curves show the installed capacity significantly increased in the 1970s and 1980s, mainly due to development of geothermal district heating, aquaculture, and greenhouse systems. Since the 2000s, geothermal direct-use development appears to have slowed, and the number of sites in operation decreased due to closures. Case studies reveal multiple barriers to geothermal direct-use implementation and operation, including 1) the existence of an information gap among stakeholders, developers, and the general public, 2) competition from cheap natural gas, and 3) the family-owned, small-scale nature of businesses, which might result in discontinuation across generations.

  7. Historical and social aspects of halitosis.

    PubMed

    Elias, Marina Sá; Ferriani, Maria das Graças Carvalho

    2006-01-01

    Buccal odors have always been a factor of concern for society. This study aims to investigate the historical and social basis of halitosis through systematized research in the BVS database (Biblioteca Virtual em Saúde, the Virtual Health Library) and also in books. Lack of knowledge on how to prevent halitosis allows for its occurrence, limiting quality of life. As social relationships are one of the pillars of the quality-of-life concept, halitosis needs to be considered a factor of negative interference. Education in health should be accomplished with a view to a dynamic balance, involving human beings' physical and psychological aspects, as well as their social interactions, so that individuals do not become jigsaw puzzles of sick parts.

  8. KSC History Project

    NASA Technical Reports Server (NTRS)

    Moore, Patrick K.

    2002-01-01

    The 2002 NASA/ASEE KSC History Project focused on a series of seven history initiatives designed to acquire, preserve, and interpret the history of Kennedy Space Center. These seven projects included the co-authoring of Voices From the Cape, historical work with NASA historian Roger Launius, the completion of a series of oral histories with key KSC personnel, a monograph on Public Affairs, the development of a Historical Concept Map (CMap) for history knowledge preservation, advice on KSC history database and web interface capabilities, the development of a KSC oral history program and guidelines of training and collection, and the development of collaborative relationships between Kennedy Space Center, the University of West Florida, and the University of Central Florida.

  9. Heterogeneity of long-history migration predicts emotion recognition accuracy.

    PubMed

    Wood, Adrienne; Rychlowska, Magdalena; Niedenthal, Paula M

    2016-06-01

    Recent work (Rychlowska et al., 2015) demonstrated the power of a relatively new cultural dimension, historical heterogeneity, in predicting cultural differences in the endorsement of emotion expression norms. Historical heterogeneity describes the number of source countries that have contributed to a country's present-day population over the last 500 years. People in cultures originating from a large number of source countries may have historically benefited from greater and clearer emotional expressivity, because they lacked a common language and well-established social norms. We therefore hypothesized that in addition to endorsing more expressive display rules, individuals from heterogeneous cultures will also produce facial expressions that are easier to recognize by people from other cultures. By reanalyzing cross-cultural emotion recognition data from 92 papers and 82 cultures, we show that emotion expressions of people from heterogeneous cultures are more easily recognized by observers from other cultures than are the expressions produced in homogeneous cultures. Heterogeneity influences expression recognition rates alongside the individualism-collectivism of the perceivers' culture, as more individualistic cultures were more accurate in emotion judgments than collectivistic cultures. This work reveals the present-day behavioral consequences of long-term historical migration patterns and demonstrates the predictive power of historical heterogeneity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
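    A minimal sketch of the automated first pass such a data-mining process might use, in Python: tagging incident narratives with candidate natural-hazard triggers by keyword before manual peer review. The keyword lists and the narrative are illustrative only.

      import re

      # Keyword patterns per candidate trigger; illustrative, not the
      # study's actual classification rules.
      TRIGGERS = {
          "flood":      r"\b(flood\w*|heavy rain)\b",
          "earthquake": r"\b(earthquake\w*|seismic)\b",
          "lightning":  r"\blightning\b",
          "landslide":  r"\b(landslide\w*|slope failure)\b",
      }

      def classify(narrative):
          text = narrative.lower()
          return [name for name, pat in TRIGGERS.items() if re.search(pat, text)]

      report = ("Pipeline ruptured after floodwaters of the San Jacinto River "
                "undermined the crossing; released product ignited.")
      print(classify(report))   # -> ['flood']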

  11. Geolocation of man-made reservoirs across terrains of varying complexity using GIS

    USGS Publications Warehouse

    Mixon, D.M.; Kinner, D.A.; Stallard, R.F.; Syvitski, J.P.M.

    2008-01-01

    The Reservoir Sedimentation Survey Information System (RESIS) is one of the world's most comprehensive databases of reservoir sedimentation rates, comprising nearly 6000 surveys for 1819 reservoirs across the continental United States. Sediment surveys in the database date from 1904 to 1999, though more than 95% of surveys were entered prior to 1980, making RESIS largely a historical database. The use of this database for large-scale studies has been limited by the lack of precise coordinates for the reservoirs. Many of the reservoirs are relatively small structures and do not appear on current USGS topographic maps. Others have been renamed or have only approximate (i.e. township and range) coordinates. This paper presents a method scripted in ESRI's ARC Macro Language (AML) to locate the reservoirs on digital elevation models using information available in RESIS. The script also delineates the contributing watersheds and compiles several hydrologically important parameters for each reservoir. Evaluation of the method indicates that, for watersheds larger than 5 km2, the correct outlet is identified over 80% of the time. The importance of identifying the watershed outlet correctly depends on the application. Our intent is to collect spatial data for watersheds across the continental United States and describe the land use, soils, and topography for each reservoir's watershed. Because of local landscape similarity in these properties, we show that choosing the incorrect watershed does not necessarily mean that the watershed characteristics will be misrepresented. We present a measure termed terrain complexity and examine its relationship to geolocation success rate and its influence on the similarity of nearby watersheds. © 2008 Elsevier Ltd. All rights reserved.
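    One step of such geolocation can be illustrated in Python: snapping an approximate reservoir coordinate to the nearby cell with the highest flow accumulation, where the stream (and thus the outlet) runs. The grid and coordinates are invented, and the study's actual AML/GIS implementation differs.

      import numpy as np

      # Invented flow-accumulation grid; larger values mark the stream.
      flow_acc = np.array([
          [1,   2,   1, 1],
          [2, 150, 160, 3],   # a stream runs through this row
          [1,   3, 170, 4],
          [1,   1,   5, 2],
      ])

      def snap_to_stream(row, col, radius=1):
          """Return the cell with max flow accumulation within a square window."""
          r0, r1 = max(row - radius, 0), min(row + radius + 1, flow_acc.shape[0])
          c0, c1 = max(col - radius, 0), min(col + radius + 1, flow_acc.shape[1])
          window = flow_acc[r0:r1, c0:c1]
          dr, dc = np.unravel_index(np.argmax(window), window.shape)
          return int(r0 + dr), int(c0 + dc)

      print(snap_to_stream(1, 1))   # -> (2, 2), the 170-accumulation stream cell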

  12. Summary of hydrologic modeling for the Delaware River Basin using the Water Availability Tool for Environmental Resources (WATER)

    USGS Publications Warehouse

    Williamson, Tanja N.; Lant, Jeremiah G.; Claggett, Peter; Nystrom, Elizabeth A.; Milly, Paul C.D.; Nelson, Hugh L.; Hoffman, Scott A.; Colarullo, Susan J.; Fischer, Jeffrey M.

    2015-11-18

    The Water Availability Tool for Environmental Resources (WATER) is a decision support system for the nontidal part of the Delaware River Basin that provides a consistent and objective method of simulating streamflow under historical, forecasted, and managed conditions. In order to quantify the uncertainty associated with these simulations, however, streamflow and the associated hydroclimatic variables of potential evapotranspiration, actual evapotranspiration, and snow accumulation and snowmelt must be simulated and compared to long-term, daily observations from sites. This report details model development and optimization, statistical evaluation of simulations for 57 basins ranging from 2 to 930 km2 and 11.0 to 99.5 percent forested cover, and how this statistical evaluation of daily streamflow relates to simulating environmental changes and management decisions that are best examined at monthly time steps normalized over multiple decades. The decision support system provides a database of historical spatial and climatic data for simulating streamflow for 2001–11, in addition to land-cover and general circulation model forecasts that focus on 2030 and 2060. WATER integrates geospatial sampling of landscape characteristics, including topographic and soil properties, with a regionally calibrated hillslope-hydrology model, an impervious-surface model, and hydroclimatic models that were parameterized by using three hydrologic response units: forested, agricultural, and developed land cover. This integration enables the regional hydrologic modeling approach used in WATER without requiring site-specific optimization or those stationary conditions inferred when using a statistical model.

  13. An online analytical processing multi-dimensional data warehouse for malaria data

    PubMed Central

    Madey, Gregory R; Vyushkov, Alexander; Raybaud, Benoit; Burkot, Thomas R; Collins, Frank H

    2017-01-01

    Malaria is a vector-borne disease that contributes substantially to the global burden of morbidity and mortality. The management of malaria-related data from heterogeneous, autonomous, and distributed data sources poses unique challenges and requirements. Although online data storage systems exist that address specific malaria-related issues, a globally integrated online resource to address different aspects of the disease does not exist. In this article, we describe the design, implementation, and applications of a multi-dimensional, online analytical processing data warehouse, named the VecNet Data Warehouse (VecNet-DW). It is the first online, globally-integrated platform that provides efficient search, retrieval and visualization of historical, predictive, and static malaria-related data, organized in data marts. Historical and static data are modelled using star schemas, while predictive data are modelled using a snowflake schema. The major goals, characteristics, and components of the DW are described along with its data taxonomy and ontology, the external data storage systems and the logical modelling and physical design phases. Results are presented as screenshots of a Dimensional Data browser, a Lookup Tables browser, and a Results Viewer interface. The power of the DW emerges from integrated querying of the different data marts and structuring those queries to the desired dimensions, enabling users to search, view, analyse, and store large volumes of aggregated data, and responding better to the increasing demands of users. Database URL https://dw.vecnet.org/datawarehouse/ PMID:29220463
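    A minimal sketch of the star-schema modelling mentioned above, using Python's sqlite3 with invented table names (not the actual VecNet-DW schema): one fact table joined to dimension tables and aggregated along a dimension.

      import sqlite3

      # Star schema sketch: a fact table keyed to dimension tables.
      # Tables, columns and numbers are illustrative only.
      db = sqlite3.connect(":memory:")
      db.executescript("""
      CREATE TABLE dim_location (id INTEGER PRIMARY KEY, country TEXT);
      CREATE TABLE dim_time     (id INTEGER PRIMARY KEY, year INTEGER);
      CREATE TABLE fact_cases (
          location_id INTEGER REFERENCES dim_location(id),
          time_id     INTEGER REFERENCES dim_time(id),
          reported_cases INTEGER
      );
      """)
      db.executemany("INSERT INTO dim_location VALUES (?, ?)",
                     [(1, "Kenya"), (2, "Ghana")])
      db.executemany("INSERT INTO dim_time VALUES (?, ?)", [(1, 2015), (2, 2016)])
      db.executemany("INSERT INTO fact_cases VALUES (?, ?, ?)",
                     [(1, 1, 120), (1, 2, 95), (2, 1, 80), (2, 2, 88)])

      # Slice the fact table along the time dimension.
      for row in db.execute("""
          SELECT t.year, SUM(f.reported_cases)
            FROM fact_cases f JOIN dim_time t ON t.id = f.time_id
           GROUP BY t.year"""):
          print(row)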

  14. Temporal Wind Pairs for Space Launch Vehicle Capability Assessment and Risk Mitigation

    NASA Technical Reports Server (NTRS)

    Decker, Ryan K.; Barbre, Robert E., Jr.

    2015-01-01

    Space launch vehicles incorporate upper-level wind assessments to determine wind effects on the vehicle and for a commit to launch decision. These assessments make use of wind profiles measured hours prior to launch and may not represent the actual wind the vehicle will fly through. Uncertainty in the winds over the time period between the assessment and launch introduces uncertainty in assessment of vehicle controllability and structural integrity that must be accounted for to ensure launch safety. Temporal wind pairs are used in engineering development of allowances to mitigate uncertainty. Five sets of temporal wind pairs at various times (0.75, 1.5, 2, 3 and 4-hrs) at the United States Air Force Eastern Range and Western Range, as well as the National Aeronautics and Space Administration's Wallops Flight Facility are developed for use in upper-level wind assessments on vehicle performance. Historical databases are compiled from balloon-based and vertically pointing Doppler radar wind profiler systems. Various automated and manual quality control procedures are used to remove unacceptable profiles. Statistical analyses on the resultant wind pairs from each site are performed to determine if the observed extreme wind changes in the sample pairs are representative of extreme temporal wind change. Wind change samples in the Eastern Range and Western Range databases characterize extreme wind change. However, the small sample sizes in the Wallops Flight Facility databases yield low confidence that the sample population characterizes extreme wind change that could occur.
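    A minimal sketch of the underlying wind-change statistic, in Python with invented profiles: the vector wind change at each altitude between two profiles measured a few hours apart. Pooling such changes over many pairs gives the sample from which extreme temporal wind changes are estimated.

      import numpy as np

      # u/v wind components (m/s) at four altitudes; values are invented.
      u0 = np.array([12.0, 18.0, 25.0, 31.0])   # assessment profile, T-2 h
      v0 = np.array([ 3.0,  5.0,  8.0,  9.0])
      u1 = np.array([14.0, 22.0, 24.0, 36.0])   # profile at launch time
      v1 = np.array([ 2.0,  9.0,  7.0, 14.0])

      change = np.hypot(u1 - u0, v1 - v0)       # vector wind change per level
      print(change, change.max())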

  15. Temporal Wind Pairs for Space Launch Vehicle Capability Assessment and Risk Mitigation

    NASA Technical Reports Server (NTRS)

    Decker, Ryan K.; Barbre, Robert E., Jr.

    2014-01-01

    Space launch vehicles incorporate upper-level wind assessments to determine wind effects on the vehicle and for a commit to launch decision. These assessments make use of wind profiles measured hours prior to launch and may not represent the actual wind the vehicle will fly through. Uncertainty in the winds over the time period between the assessment and launch introduces uncertainty in assessment of vehicle controllability and structural integrity that must be accounted for to ensure launch safety. Temporal wind pairs are used in engineering development of allowances to mitigate uncertainty. Five sets of temporal wind pairs at various times (0.75, 1.5, 2, 3 and 4-hrs) at the United States Air Force Eastern Range and Western Range, as well as the National Aeronautics and Space Administration's Wallops Flight Facility are developed for use in upper-level wind assessments on vehicle performance. Historical databases are compiled from balloon-based and vertically pointing Doppler radar wind profiler systems. Various automated and manual quality control procedures are used to remove unacceptable profiles. Statistical analyses on the resultant wind pairs from each site are performed to determine if the observed extreme wind changes in the sample pairs are representative of extreme temporal wind change. Wind change samples in the Eastern Range and Western Range databases characterize extreme wind change. However, the small sample sizes in the Wallops Flight Facility databases yield low confidence that the sample population characterizes extreme wind change that could occur.

  16. HydroClim: a Continental-Scale Database of Contemporary and Future Streamflow and Stream Temperature Estimates for Aquatic Ecosystem Studies

    NASA Astrophysics Data System (ADS)

    Knouft, J.; Ficklin, D. L.; Bart, H. L.; Rios, N. E.

    2017-12-01

    Streamflow and water temperature are primary factors influencing the traits, distribution, and diversity of freshwater species. Ongoing changes in climate are causing directional alteration of these environmental conditions, which can impact local ecological processes. Accurate estimation of these variables is critical for predicting the responses of species to ongoing changes in freshwater habitat, yet ecologically relevant high-resolution data describing variation in streamflow and water temperature across North America are not available. Considering the vast amount of web-accessible freshwater biodiversity data, development and application of appropriate hydrologic data are critical to the advancement of our understanding of freshwater systems. To address this issue, we are developing the "HydroClim" database, which will provide web-accessible (www.hydroclim.org) historical and projected monthly streamflow and water temperature data for stream sections in all major watersheds across the United States and Canada from 1950-2099. These data will also be integrated with FishNet 2 (www.fishnet2.net), an online biodiversity database that provides open access to over 2 million localities of freshwater fish species in the United States and Canada, thus allowing for the characterization of the habitat requirements of freshwater species across this region. HydroClim should provide a vast array of opportunities for a greater understanding of water resources as well as information for the conservation of freshwater biodiversity in the United States and Canada in the coming century.

  17. PhOD - The Global Drifter Program

    Science.gov Websites

    [Website excerpt; navigation and download widgets removed.] The Global Drifter Program database, current through February 2018, is available for download in full or as subsets. Many historical drogue-off dates have been reevaluated, and updated data files (buoydata_15001_feb18.dat-gz, dirfl_15001_feb18.dat) have been released. A program milestone was announced on May 1, 2018.

  18. Building a Database for the Historical Analysis of the General Chemistry Curriculum Using ACS General Chemistry Exams as Artifacts

    ERIC Educational Resources Information Center

    Luxford, Cynthia J.; Linenberger, Kimberly J.; Raker, Jeffrey R.; Baluyut, John Y.; Reed, Jessica J.; De Silva, Chamila; Holme, Thomas A.

    2015-01-01

    As a discipline, chemistry enjoys a unique position. While many academic areas prepared "cooperative examinations" in the 1930s, only chemistry maintained the activity within what has become the ACS Examinations Institute. As a result, the long-term existence of community-built, norm-referenced, standardized exams provides a historical…

  19. Mortality Rates in the General Irish Population Compared to Those with an Intellectual Disability from 2003 to 2012

    ERIC Educational Resources Information Center

    McCarron, Mary; Carroll, Rachael; Kelly, Caraiosa; McCallion, Philip

    2015-01-01

    Background: Historically, there has been higher and earlier mortality among people with intellectual disability as compared to the general population, but there have also been methodological problems and differences in the available studies. Method: Data were drawn from the 2012 National Intellectual Disability Database and the Census in Ireland. A…

  20. Site characteristics of red spruce witness tree locations in the uplands of West Virginia, USA

    Treesearch

    Melissa Thomas-Van Gundy; Michael Strager; James. Rentch

    2012-01-01

    Knowledge, both of the historical range of spruce-dominated forests and associated site conditions, is needed by land managers to help define restoration goals and potential sites for restoration. We used an existing digital database of witness trees listed in deeds from 1752 to 1899 to compare characteristics of red spruce (Picea rubens Sarg.) sites...

  1. Computational Solutions for Today’s Navy: New Methods are Being Employed to Meet the Navy’s Changing Software-Development Environment

    DTIC Science & Technology

    2008-03-01

    software-development environment. Frank W. Bentrem, Ph.D., John T. Sample, Ph.D., and Michael M. Harris. The Naval Research Laboratory (NRL) is the...sonars (Through-the-Sensor technology), supercomputer-generated numerical models, and historical/climatological databases. It uses a variety of

  2. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    NASA Astrophysics Data System (ADS)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
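    The candidate-competition idea can be sketched in a few lines of Python: fit a simple kinetic model iteratively, retrieve the nearest curve from a historical reference database, and keep whichever candidate fits the measured data better. The one-exponential model, reference curves and noise below are invented for illustration; the paper's compartment models and reference database are far richer.

      import numpy as np
      from scipy.optimize import curve_fit

      t = np.linspace(0.5, 60, 24)                      # minutes

      def model(t, a, k):
          # Illustrative one-exponential washout curve, not a full PET model.
          return a * np.exp(-k * t)

      rng = np.random.default_rng(0)
      measured = model(t, 5.0, 0.05) + rng.normal(0, 0.4, t.size)  # noisy TAC

      # Candidate 1: iterative least-squares fit (may overfit noisy data).
      (a_fit, k_fit), _ = curve_fit(model, t, measured, p0=(1.0, 0.1))
      fit_curve = model(t, a_fit, k_fit)

      # Candidate 2: nearest neighbour from a historical reference database.
      reference_db = [model(t, a, k) for a in (3.0, 5.0, 7.0) for k in (0.03, 0.05)]
      ref_curve = min(reference_db, key=lambda c: np.sum((c - measured) ** 2))

      # Competition: keep the candidate with the smaller residual sum of squares.
      def rss(c):
          return np.sum((c - measured) ** 2)

      winner = "iterative fit" if rss(fit_curve) < rss(ref_curve) else "reference"
      print(winner, rss(fit_curve), rss(ref_curve))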

  3. Using Proxy Records to Document Gulf of Mexico Tropical Cyclones from 1820-1915

    PubMed Central

    Rohli, Robert V.; DeLong, Kristine L.; Harley, Grant L.; Trepanier, Jill C.

    2016-01-01

    Observations of pre-1950 tropical cyclones are sparse due to observational limitations; therefore, the hurricane database HURDAT2 (1851–present) maintained by the National Oceanic and Atmospheric Administration may be incomplete. Here we provide additional documentation for HURDAT2 from historical United States Army fort records (1820–1915) and other archived documents for 28 landfalling tropical cyclones, 20 of which are included in HURDAT2, along the northern Gulf of Mexico coast. One event that occurred in May 1863 is not currently documented in the HURDAT2 database but has been noted in other studies. We identify seven tropical cyclones that occurred before 1851, three of which are potential tropical cyclones. We corroborate the pre-HURDAT2 storms with a tree-ring reconstruction of hurricane impacts from the Florida Keys (1707–2009). Using this information, we suggest landfall locations for the July 1822 hurricane just west of Mobile, Alabama, and for the 18 August 1831 hurricane near Last Island, Louisiana. Furthermore, we model the probable track of the August 1831 hurricane using the weighted average distance grid method, which incorporates historical tropical cyclone tracks to supplement report locations. PMID:27898726

  4. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    PubMed

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, thanks to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and fitting the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  5. Ecology of Alpine Macrofungi - Combining Historical with Recent Data

    PubMed Central

    Brunner, Ivano; Frey, Beat; Hartmann, Martin; Zimmermann, Stephan; Graf, Frank; Suz, Laura M.; Niskanen, Tuula; Bidartondo, Martin I.; Senn-Irlet, Beatrice

    2017-01-01

    Historical datasets of living communities are important because they can be used to document creeping shifts in species compositions. Such a historical dataset exists for alpine fungi. From 1941 to 1953, the Swiss geologist Jules Favre visited the region of the Swiss National Park yearly and recorded the fruiting bodies of fungi >1 mm (so-called “macrofungi”) occurring in the alpine zone. Favre can be regarded as one of the pioneers of alpine fungal ecology, not least because he noted location, elevation, geology, and associated plants during his numerous excursions. However, some relevant information is only available in his unpublished field-book. Overall, Favre listed 204 fungal species in 26 sampling sites, with 46 species being previously unknown. The analysis of his data revealed that the macrofungi recorded belong to two major ecological groups: either they are symbiotrophs and live in ectomycorrhizal associations with alpine plant hosts, or they are saprotrophs and decompose plant litter and soil organic matter. The most frequent fungi were members of Inocybe and Cortinarius, which form ectomycorrhizas with Dryas octopetala or the dwarf alpine Salix species. The scope of the present study was to combine Favre's historical dataset with more recent data, either with the “SwissFungi” database or with data from major studies of the French and German Alps, and with data from novel high-throughput DNA sequencing of soils from the Swiss Alps. Results of the latter application revealed that the problems associated with these new techniques are manifold and that species determination often remains unclear. At this point, the fungal taxa collected by Favre and deposited as exsiccata at the “Conservatoire et Jardin Botaniques de la Ville de Genève” could be used as a reference sequence dataset for alpine fungal studies. In conclusion, new and improved databases are urgently needed, particularly with regard to investigating fungal communities of alpine regions using new techniques. PMID:29123508

  6. Ecology of Alpine Macrofungi - Combining Historical with Recent Data.

    PubMed

    Brunner, Ivano; Frey, Beat; Hartmann, Martin; Zimmermann, Stephan; Graf, Frank; Suz, Laura M; Niskanen, Tuula; Bidartondo, Martin I; Senn-Irlet, Beatrice

    2017-01-01

    Historical datasets of living communities are important because they can be used to document creeping shifts in species compositions. Such a historical dataset exists for alpine fungi. From 1941 to 1953, the Swiss geologist Jules Favre visited the region of the Swiss National Park yearly and recorded the fruiting bodies of fungi >1 mm (so-called "macrofungi") occurring in the alpine zone. Favre can be regarded as one of the pioneers of alpine fungal ecology, not least because he noted location, elevation, geology, and associated plants during his numerous excursions. However, some relevant information is only available in his unpublished field-book. Overall, Favre listed 204 fungal species in 26 sampling sites, with 46 species being previously unknown. The analysis of his data revealed that the macrofungi recorded belong to two major ecological groups: either they are symbiotrophs and live in ectomycorrhizal associations with alpine plant hosts, or they are saprotrophs and decompose plant litter and soil organic matter. The most frequent fungi were members of Inocybe and Cortinarius, which form ectomycorrhizas with Dryas octopetala or the dwarf alpine Salix species. The scope of the present study was to combine Favre's historical dataset with more recent data, either with the "SwissFungi" database or with data from major studies of the French and German Alps, and with data from novel high-throughput DNA sequencing of soils from the Swiss Alps. Results of the latter application revealed that the problems associated with these new techniques are manifold and that species determination often remains unclear. At this point, the fungal taxa collected by Favre and deposited as exsiccata at the "Conservatoire et Jardin Botaniques de la Ville de Genève" could be used as a reference sequence dataset for alpine fungal studies. In conclusion, new and improved databases are urgently needed, particularly with regard to investigating fungal communities of alpine regions using new techniques.

  7. Estimation Model of Spacecraft Parameters and Cost Based on a Statistical Analysis of COMPASS Designs

    NASA Technical Reports Server (NTRS)

    Gerberich, Matthew W.; Oleson, Steven R.

    2013-01-01

    The Collaborative Modeling for Parametric Assessment of Space Systems (COMPASS) team at Glenn Research Center has performed integrated system analysis of conceptual spacecraft mission designs since 2006 using a multidisciplinary concurrent engineering process. The set of completed designs was archived in a database to allow for the study of relationships between design parameters. Although COMPASS uses a parametric spacecraft costing model, this research investigated the possibility of using a top-down approach to rapidly estimate overall vehicle costs. This paper presents the relationships between significant design variables, including breakdowns of dry mass, wet mass, and cost. It also develops a model for a broad estimate of these parameters from basic mission characteristics, including the target location distance, the payload mass, the duration, the delta-v requirement, and the type of mission, propulsion, and electrical power. Finally, this paper examines the accuracy of this model with regard to past COMPASS designs, with an assessment of outlying spacecraft, and compares the results to historical data from completed NASA missions.
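
    A minimal sketch of the kind of top-down estimate described above: ordinary least squares in log space over a handful of mission characteristics. The driver variables follow the abstract, but the data points, functional form, and coefficients are invented for illustration.

        import numpy as np

        # Hypothetical design database: payload mass (kg), delta-v (km/s),
        # and duration (yr) per mission, with total cost in $M.
        X = np.array([[100, 3.0, 1], [250, 5.5, 2],
                      [500, 4.0, 3], [1200, 9.0, 5]], float)
        cost = np.array([150.0, 320.0, 480.0, 1100.0])

        # Fit log(cost) = b0 + b1*log(mass) + b2*dv + b3*duration.
        A = np.column_stack([np.ones(len(X)), np.log(X[:, 0]),
                             X[:, 1], X[:, 2]])
        coef, *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)

        # Rough cost estimate for a candidate mission (750 kg, 6 km/s, 4 yr).
        new = np.array([1.0, np.log(750), 6.0, 4.0])
        print(np.exp(new @ coef))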

  8. The cultural psychology endeavor to make culture central to psychology: Comment on Hall et al. (2016).

    PubMed

    Dvorakova, Antonie

    2016-12-01

    When Hall, Yip, and Zárate (2016) suggested that cultural psychology focused on reporting differences between groups, they described comparative research conducted in other fields, including cross-cultural psychology. Cultural psychology is a different discipline with methodological approaches reflecting its dissimilar goal, which is to highlight the cultural grounding of human psychological characteristics, and ultimately make culture central to psychology in general. When multicultural psychology considers, according to Hall et al., the mechanisms of culture's influence on behavior, it treats culture the same way as cross-cultural psychology does. In contrast, cultural psychology goes beyond treating culture as an external variable when it proposes that culture and psyche are mutually constitutive. True psychology of the human experience must encompass world populations through research of the ways in which (a) historically grounded sociocultural contexts enable the distinct meaning systems that people construct, and (b) these systems simultaneously guide the human formation of the environments. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. An informatics approach to assess pediatric pharmacotherapy: design and implementation of a hospital drug utilization system.

    PubMed

    Zuppa, Athena; Vijayakumar, Sundararajan; Jayaraman, Bhuvana; Patel, Dimple; Narayan, Mahesh; Vijayakumar, Kalpana; Mondick, John T; Barrett, Jeffrey S

    2007-09-01

    Drug utilization in the inpatient setting can provide a mechanism to assess drug prescribing trends, efficiency, and cost-effectiveness of hospital formularies and examine subpopulations for which prescribing habits may be different. Such data can be used to correlate trends with time-dependent or seasonal changes in clinical event rates or the introduction of new pharmaceuticals. It is now possible to provide a robust, dynamic analysis of drug utilization in a large pediatric inpatient setting through the creation of a Web-based hospital drug utilization system that retrieves source data from our accounting database. The production implementation provides a dynamic and historical account of drug utilization at the authors' institution. The existing application can easily be extended to accommodate a multi-institution environment. The creation of a national or even global drug utilization network would facilitate the examination of geographical and/or socioeconomic influences in drug utilization and prescribing practices in general.

  10. VizieR Online Data Catalog: Accurate astrometry & RVs of 4 multiple systems (Tokovinin+, 2017)

    NASA Astrophysics Data System (ADS)

    Tokovinin, A.; Latham, D. W.

    2017-10-01

    The outer subsystems are classical visual binaries. Historic micrometric measurements and modern speckle interferometric data have been obtained from the WDS database at our request. Additionally, we secured new speckle astrometry and relative photometry of two systems at the 4.1m SOAR telescope. Published radial velocities (RVs) are used here together with the new data. The RVs were measured with the CfA Digital Speedometers, initially using the 1.5m Wyeth Reflector at the Oak Ridge Observatory in the town of Harvard, Massachusetts, and subsequently with the 1.5m Tillinghast Reflector at the Whipple Observatory on Mount Hopkins, Arizona. Starting in 2009, the new fiber-fed Tillinghast Reflector Echelle Spectrograph (TRES) was used. The spectral resolution was 44000 for all three spectrographs. Two objects, HIP 101955 and 103987, were observed in 2015 with the CHIRON echelle spectrograph at the 1.5m telescope at CTIO with a spectral resolution of 80000. (4 data files).

  11. Fifteen hundred guidelines and growing: the UK database of clinical guidelines.

    PubMed

    van Loo, John; Leonard, Niamh

    2006-06-01

    The National Library for Health offers a comprehensive searchable database of nationally approved clinical guidelines, called the Guidelines Finder. This resource, commissioned in 2002, is managed and developed by the University of Sheffield Health Sciences Library. The authors introduce the historical and political dimension of guidelines and the nature of guidelines as a mechanism to ensure clinical effectiveness in practice. The article then outlines the maintenance and organisation of the Guidelines Finder database itself, the criteria for selection, who publishes guidelines and guideline formats, usage of the Guidelines Finder service and finally looks at some lessons learnt from a local library offering a national service. Clinical guidelines are central to effective clinical practice at the national, organisational and individual level. The Guidelines Finder is one of the most visited resources within the National Library for Health and is successful in answering information needs related to specific patient care, clinical research, guideline development and education.

  12. A national environmental monitoring system to support the Moroccan sustainable development strategy

    NASA Astrophysics Data System (ADS)

    Mourhir, A.; Rachidi, T.

    2010-12-01

    Morocco is a mountainous country, subject to both marine and Saharan influences. The increase in population has led to an increase in the gross domestic product (GDP), which, accentuated by inadequate resource management, has been accompanied by the degradation of the environment. The annual cost of environmental damage has been estimated at nearly eight percent of Morocco’s GDP. Morocco has scarce natural resources, especially arable land and water. In recent years, intensive agricultural production, large-scale irrigation schemes, industrialization, and urbanization have been creating serious problems: the country has faced severe air, water and soil pollution, environmental health problems, deforestation and soil erosion, and it is very vulnerable to the impacts of global climate change. Morocco’s approach to sustainable development (SD) is mainly environmental. The two main documents for Morocco’s SD strategy are the National Strategy for the Protection of the Environment and Sustainable Development (SNPEDD), 1995, and the National Plan of Action for the Environment (PANE), 1998. SNPEDD’s main objective is the integration and strengthening of environmental concerns in economic development activities. The activities for the formulation and implementation of the strategy include: (a) studies on the state of the Moroccan environment; (b) formulation of the PANE; (c) preparation of a sensitization program on environmental issues and the implementation of a database and information system on the environment; and (d) preparation of regional and local environmental monographies. The aim of the current work is to create an information system as an approach to complex sustainability analyses at the national level using GIS technologies. This information system includes the following: 1. Development of a database of SD indicators and historical data. Morocco has been involved in the working framework of the Mediterranean Commission for Sustainable Development to set up an indicator system (IDD) specific to Morocco. The National Committee for Sustainable Development Indicators was set up to create a program to test and validate the IDD. A number of indicators have been chosen, and the Moroccan government’s Environment Department has made the database available through a publication and via the internet; it will be updated regularly. The database will be organized to facilitate ad hoc query and analysis. 2. Development of a GIS structure to help map plans for achieving successful management strategies that are sustainable at both the regional and national levels. 3. Visualization and analysis tools for spatial and temporal changes of environmental indicators to help manage growth and change.

  13. Precise photorealistic visualization for restoration of historic buildings based on tacheometry data

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Sarri, Froso; Mania, Katerina

    2018-03-01

    This paper puts forward a 3D reconstruction methodology applied to the restoration of historic buildings taking advantage of the speed, range and accuracy of a total geodetic station. The measurements representing geo-referenced points produced an interactive and photorealistic geometric mesh of a monument named `Neoria.' `Neoria' is a Venetian building located by the old harbor at Chania, Crete, Greece. The integration of tacheometry acquisition and computer graphics puts forward a novel integrated software framework for the accurate 3D reconstruction of a historical building. The main technical challenge of this work was the production of a precise 3D mesh based on a sufficient number of tacheometry measurements acquired fast and at low cost, employing a combination of surface reconstruction and processing methods. A fully interactive application based on game engine technologies was developed. The user can visualize and walk through the monument and the area around it as well as photorealistically view it at different times of day and night. Advanced interactive functionalities are offered to the user in relation to identifying restoration areas and visualizing the outcome of such works. The user could visualize the coordinates of the points measured, calculate distances and navigate through the complete 3D mesh of the monument. The geographical data are stored in a database connected with the application. Features referencing and associating the database with the monument are developed. The goal was to utilize a small number of acquired data points and present a fully interactive visualization of a geo-referenced 3D model.

  14. Precise photorealistic visualization for restoration of historic buildings based on tacheometry data

    NASA Astrophysics Data System (ADS)

    Ragia, Lemonia; Sarri, Froso; Mania, Katerina

    2018-04-01

    This paper puts forward a 3D reconstruction methodology applied to the restoration of historic buildings taking advantage of the speed, range and accuracy of a total geodetic station. The measurements representing geo-referenced points produced an interactive and photorealistic geometric mesh of a monument named `Neoria.' `Neoria' is a Venetian building located by the old harbor at Chania, Crete, Greece. The integration of tacheometry acquisition and computer graphics puts forward a novel integrated software framework for the accurate 3D reconstruction of a historical building. The main technical challenge of this work was the production of a precise 3D mesh based on a sufficient number of tacheometry measurements acquired fast and at low cost, employing a combination of surface reconstruction and processing methods. A fully interactive application based on game engine technologies was developed. The user can visualize and walk through the monument and the area around it as well as photorealistically view it at different times of day and night. Advanced interactive functionalities are offered to the user in relation to identifying restoration areas and visualizing the outcome of such works. The user could visualize the coordinates of the points measured, calculate distances and navigate through the complete 3D mesh of the monument. The geographical data are stored in a database connected with the application. Features referencing and associating the database with the monument are developed. The goal was to utilize a small number of acquired data points and present a fully interactive visualization of a geo-referenced 3D model.

  15. Multimedia systems for art and culture: a case study of Brihadisvara Temple

    NASA Astrophysics Data System (ADS)

    Jain, Anil K.; Goel, Sanjay; Agarwal, Sachin; Mittal, Vipin; Sharma, Hariom; Mahindru, Ranjeev

    1997-01-01

    In India a temple is not only a structure of religious significance and celebration; it also plays an important role in the social, administrative and cultural life of the locality. Temples have served as centers for learning Indian scriptures, and music and dance were fostered and performed in their precincts. Built at the end of the 10th century, the Brihadisvara temple signified new design methodologies. We have access to a large number of images, audio and video recordings, architectural drawings and scholarly publications of this temple. A multimedia system for this temple is being designed for two purposes: (1) to inform and enrich the general public, and (2) to assist scholars in their research. Such a system will also preserve and archive old historical documents and images. The large database consists primarily of images that can be retrieved using keywords, but the emphasis here is largely on techniques that allow access using image content. Besides classifying images as either long shots or close-ups, the system uses deformable template matching for shape-based query by image content and digital video retrieval. Further, to exploit the non-linear accessibility of video sequences, key frames are determined to aid domain experts in getting a quick preview of a video. Our database also has images of several old and rare manuscripts, many of which are noisy and difficult to read; we have enhanced them to make them more legible. We are also investigating the optimal trade-off between image quality and compression ratio.

  16. Calculating length of gestation from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) database versus vital records may alter reported rates of prematurity.

    PubMed

    Stern, Judy E; Kotelchuck, Milton; Luke, Barbara; Declercq, Eugene; Cabral, Howard; Diop, Hafsatou

    2014-05-01

    To compare length of gestation after assisted reproductive technology (ART) as calculated by three methods from the Society for Assisted Reproductive Technology Clinic Outcome Reporting System (SART CORS) and from vital records (birth and fetal death) in the Massachusetts Pregnancy to Early Life Longitudinal Data System (PELL). Historical cohort study. Database linkage analysis. Live or stillborn deliveries. None. ART deliveries were linked to live birth or fetal death certificates. Length of gestation in 7,171 deliveries from fresh autologous ART cycles (2004-2008) was calculated from SART CORS using three methods: M1 = outcome date - cycle start date; M2 = outcome date - transfer date + 17 days; and M3 = outcome date - transfer date + 14 days + day of transfer. Generalized estimating equation models were used to compare methods. Singleton and multiple deliveries were included. Overall prematurity (delivery <37 weeks) varied by method of calculation: M1, 29.1%; M2, 25.6%; M3, 25.2%; and PELL, 27.2%. The SART CORS methods, M1-M3, differed from PELL by ≥3 days in >45% of deliveries and by more than 1 week in >22% of deliveries; each method also differed from the others. Estimates of preterm birth in ART vary depending on the source of data and the method of calculation, and some may overestimate preterm birth rates for ART conceptions. Copyright © 2014 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
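
    The three SART CORS calculations reduce to simple date arithmetic. The sketch below applies M1-M3 as defined above to one assumed example cycle; the dates themselves are invented.

        from datetime import date

        # Assumed example values for one fresh autologous cycle.
        cycle_start = date(2006, 3, 1)
        transfer = date(2006, 3, 18)
        day_of_transfer = 3                    # day-3 embryo transfer
        delivery = date(2006, 11, 20)          # outcome date

        m1 = (delivery - cycle_start).days                       # M1
        m2 = (delivery - transfer).days + 17                     # M2
        m3 = (delivery - transfer).days + 14 + day_of_transfer   # M3

        for name, days in (("M1", m1), ("M2", m2), ("M3", m3)):
            status = "preterm" if days < 37 * 7 else "term"
            print(name, days, "days,", status)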

  17. Water level ingest, archive and processing system - an integral part of NOAA's tsunami database

    NASA Astrophysics Data System (ADS)

    McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.

    2013-12-01

    The National Oceanic and Atmospheric Administration (NOAA), National Geophysical Data Center (NGDC) and collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunami (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrades to the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on operational data quality assessment. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the February 6, 2013, Solomon Islands tsunami, and the June 13, 2013, meteotsunami along the U.S. East Coast.

  18. [The theme of disaster in health care: profile of technical and scientific production in the specialized database on disasters of the Virtual Health Library - VHL].

    PubMed

    Rocha, Vania; Ximenes, Elisa Francioli; Carvalho, Mauren Lopes de; Alpino, Tais de Moura Ariza; Freitas, Carlos Machado de

    2014-09-01

    Among the specialized databases of the Virtual Health Library (VHL), the DISASTER database highlights the importance of the theme for the health sector. The scope of this article is to identify the profile of technical and scientific publications in this specialized database. Based on systematic searches and the analysis of results, it is possible to determine the type of publication, the main topics addressed, the types of disaster most often mentioned in published materials, the countries and regions used as subjects, the historic periods with the most publications, and the current trend of publications. When examining the specialized data in detail, it soon becomes clear that the number of major topics is very high, making a specific search in this database a challenging exercise. On the other hand, it is encouraging that the disaster topic is discussed and assessed in a broad and diversified manner, associated with different aspects of the natural and social sciences. The disaster issue requires interdisciplinary knowledge production to reduce the impacts of disasters and to support risk management. Since the health sector is an interdisciplinary area, it can contribute to this knowledge production.

  19. Cost estimating methods for advanced space systems

    NASA Technical Reports Server (NTRS)

    Cyr, Kelley

    1994-01-01

    NASA is responsible for developing much of the nation's future space technology. Cost estimates for new programs are required early in the planning process so that well-informed decisions can be made. Because of the long lead times required to develop space hardware, cost estimates are frequently required 10 to 15 years before the program delivers hardware. The system design in the conceptual phases of a program is usually only vaguely defined, and the technology used is often state-of-the-art or beyond. These factors combine to make cost estimating for conceptual programs very challenging. This paper describes an effort to develop parametric cost estimating methods for space systems in the conceptual design phase. The approach is to identify variables that drive cost, such as weight, quantity, development culture, design inheritance and time, and to characterize the nature of the relationships between these driver variables and cost. In particular, the relationship between weight and cost is examined in detail. A theoretical model of cost is developed and tested statistically against a historical database of major research and development projects.
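
    As a sketch of the weight-cost relationship discussed above, the snippet below fits a power-law cost-estimating relationship of the form cost = a * weight^b by linear regression in log-log space. The data points are hypothetical, not the paper's historical database.

        import numpy as np

        # Hypothetical historical projects: dry weight (kg) vs cost ($M).
        weight = np.array([100.0, 250.0, 500.0, 1000.0, 2000.0])
        cost = np.array([12.0, 25.0, 45.0, 80.0, 150.0])

        # Fit cost = a * weight^b; polyfit returns [slope, intercept].
        b, log_a = np.polyfit(np.log(weight), np.log(cost), 1)
        a = np.exp(log_a)

        print(f"cost ~= {a:.2f} * weight^{b:.2f}")
        print(a * 750.0 ** b)   # estimate for a 750 kg concept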

  20. Vehicle Integrated Prognostic Reasoner (VIPR) 2010 Annual Final Report

    NASA Technical Reports Server (NTRS)

    Hadden, George D.; Mylaraswamy, Dinkar; Schimmel, Craig; Biswas, Gautam; Koutsoukos, Xenofon; Mack, Daniel

    2011-01-01

    Honeywell's Central Maintenance Computer Function (CMCF) and Aircraft Condition Monitoring Function (ACMF) represent the state of the art in integrated vehicle health management (IVHM). Underlying these technologies is a fault propagation modeling system that provides nose-to-tail coverage and root cause diagnostics. The Vehicle Integrated Prognostic Reasoner (VIPR) extends this technology to interpret evidence generated by advanced diagnostic and prognostic monitors provided by component suppliers to detect, isolate, and predict adverse events that affect flight safety. This report describes year-one work that included defining the architecture and communication protocols and establishing the user requirements for such a system. Based on these and a set of ConOps scenarios, we designed and implemented a demonstration of the communication pathways and the associated three-tiered health management architecture. A series of scripted scenarios showed how VIPR would detect adverse events before they escalate into safety incidents through a combination of advanced reasoning and additional aircraft data collected from an aircraft condition monitoring system. Demonstrating VIPR capability for cases recorded in the ASIAS database and cross-linking them with historical aircraft data is planned for year two.

  1. The evolution of spinal instrumentation for the management of occipital cervical and cervicothoracic junctional injuries.

    PubMed

    Smucker, Joseph D; Sasso, Rick C

    2006-05-15

    Independent computer-based literature review of articles pertaining to instrumentation and fusion of junctional injuries of the cervical spine. To review and discuss the evolution of instrumentation techniques and systems used in the treatment of cervical spine junctional injuries. Instrumentation of junctional injuries of the cervical spine has been limited historically by failure to achieve rigid internal fixation in multiple planes. The evolution of these techniques has required increased insight into the morphology and unique biomechanics of the structures to be instrumented. Computer-based literature search of Ovid and PubMed databases. Extensive literature search yielded insights into the evolution of systems initially based on onlay bone graft combined with wiring techniques. Such techniques have come to include systems incorporating rigid, longitudinal struts that accommodate multiplanar screws placed in the lateral masses, pedicles, transarticular regions, and occipital bone. Despite a rapid evolution of techniques and instrumentation technologies, it remains incumbent on the physician to provide the patient with a surgical procedure that balances the likelihood of a favorable outcome with the risk inherent in the implementation of the procedure.

  2. Adherence to tuberculosis care in Canadian Aboriginal populations, Part 1: definition, measurement, responsibility, barriers.

    PubMed

    Orr, Pamela

    2011-04-01

    In a 2-part series, the current literature with respect to adherence to tuberculosis care among Canadian Aboriginal populations is reviewed. In the current paper, which comprises part 1 of this review, adherence is defined, and methods of measurement, issues of responsibility and potential barriers to adherence are explored. Study design: literature review. A systematic search and analytic review of relevant studies was undertaken, including an online search of electronic databases (PubMed, PsycINFO, MEDLINE, Native Health Database, Scopus, Social Science Citation Index) and publications by governmental and non-governmental agencies. Poor adherence to therapy for TB disease is the most common cause of initial treatment failure and of disease relapse worldwide. Adherence to care for TB disease is necessary for the health of both the affected individual and society as a whole. Adherence is a task-specific behaviour that is not inherent to ethnic identity. The term applies only when common agreement over a care plan has been reached between patient and provider. The International Standards for Tuberculosis Care and the Patients Charter outline the responsibilities for adherence on the part of both patients and providers. For Canadian Aboriginals, barriers to adherence may derive from a complex interaction between the health system, personal factors and social factors, which may include dysfunctional acute and public health systems, dissonant (between health care provider and patient) belief systems, concurrent co-morbidities and life stressors, poverty and social stigma. Adherence is a task-specific behaviour, not a personality trait. It is influenced by the interaction of systemic, personal and societal factors. These factors must be understood within the historical experience of TB and the cultural meaning of health and illness among Indigenous Canadians.

  3. MEDWISE: an innovative public health information system infrastructure.

    PubMed

    Sahin, Yasar Guneri; Celikkan, Ufuk

    2012-06-01

    In this paper, we present MedWise, a high-level design of a medical information infrastructure, and its architecture. The proposed system offers a comprehensive, modular, robust and extensible infrastructure to be used in public health care systems. The system gathers reliable and evidence-based health data, which it then classifies, interprets and stores in a dedicated database. It creates a healthcare ecosystem that aids the medical community by providing for less error-prone diagnoses and treatment of diseases. The system will be standards-compliant and therefore complementary to existing healthcare and clinical information systems. The key objective of the proposed system is to provide as much medical historical and miscellaneous data as possible about patients with minimal consultation, thus allowing physicians to easily access Patients' Ancillary Data (PAD) such as hereditary, residential, travel, custom, meteorological, biographical and demographical data before the consultation. In addition, the system can help to diminish problems and misdiagnoses caused by language barriers and disorders and by misinformation. MedWise can assist physicians in shortening the time needed for diagnosis and consultations, thereby dramatically improving the quality and quantity of the physical examinations of patients. Furthermore, since it supplies a significant amount of data, it may be used to improve the skills of students in medical education.

  4. Sovereign immunity: Principles and application in medical malpractice.

    PubMed

    Suk, Michael

    2012-05-01

    Tort law seeks accountability when parties engage in negligent conduct, and aims to compensate the victims of such conduct. An exception to this general rule governing medical negligence is the doctrine of sovereign immunity. Historically, individuals acting under the authority of the government or other sovereign entity had almost complete protection against tort liability. This article addressed the following: (1) the development of sovereign immunity in law, (2) the lasting impact of the Federal Tort Claims Act on sovereign immunity, and (3) the contemporary application of sovereign immunity to medical malpractice, using case examples from Virginia and Florida. I performed an Internet search to identify sources that addressed the concept of sovereign immunity, followed by a focused search for relevant articles in PubMed and LexisNexis, literature databases for medical and legal professionals, respectively. Historically, the doctrine conferred absolute immunity from lawsuits in favor of the sovereign (ie, the government). Practical considerations in our democratic system have contributed to an evolution of this doctrine. Understanding sovereign immunity and its contemporary application is of value for any physician interested in the debate concerning medical malpractice in the United States. Under certain circumstances, physicians working as employees of the federal or state government may be protected against individual liability if the government is substituted as the defendant.

  5. Avian Species Inventory at Manzanar National Historic Site, California - Final Report to the National Park Service

    USGS Publications Warehouse

    Hart, Jan; Drost, Charles

    2008-01-01

    We conducted a baseline inventory for avian species at Manzanar National Historic Site, Inyo County, Calif., from 2002 to 2005. Under the guidelines of the Mojave Network Biological Inventory Program, the primary objectives for this study were to (1) inventory and document the occurrence of avian species at Manzanar, with the goal of documenting at least 90 percent of the species present; (2) provide a geographic information system (GIS)-referenced list of sensitive species occurring at Manzanar that are rare, on Federal or State lists, or otherwise worthy of special consideration; and (3) enter all species data into the National Park Service NPSpecies database. Survey methods included general area searches, variable circular plot point-count censusing, nocturnal surveys, and nest searching. During 13 year-round survey sessions, we documented the occurrence of 132 bird species at Manzanar and confirmed breeding by 19 of these. Based on our findings, as well as review of the literature and searches for records of species occurrence, we estimate inventory completeness for regularly occurring bird species at Manzanar to be near 90 percent. No sensitive species on Federal or State lists were found. The distribution and relative abundance of common bird species at this site is now well enough known to begin development of a monitoring protocol for this group.

  6. Looking for an old aerial photograph

    USGS Publications Warehouse

    ,

    1997-01-01

    Attempts to photograph the surface of the Earth date from the 1800s, when photographers attached cameras to balloons, kites, and even pigeons. Today, aerial photographs and satellite images are commonplace, and the rate of acquiring them has increased rapidly in recent years. Views of the Earth obtained from aircraft or satellites have become valuable tools to Government resource planners and managers, land-use experts, environmentalists, engineers, scientists, and a wide variety of other users. Many people want historical aerial photographs for business or personal reasons. They may want to locate the boundaries of an old farm or a piece of family property. Or they may want a photograph as a record of changes in their neighborhood, or as a gift. The U.S. Geological Survey (USGS) maintains the Earth Science Information Centers (ESICs) to sell aerial photographs, remotely sensed images from satellites, and a wide array of digital geographic and cartographic data, as well as the Bureau's well-known maps. Declassified photographs from early spy satellites were recently added to the ESIC offerings of historical images. Using the Aerial Photography Summary Record System database, ESIC researchers can help customers find imagery in the collections of other Federal agencies and, in some cases, those of private companies that specialize in esoteric products.

  7. Integrated Digital Platform for the Valorization of a Cultural Landscape

    NASA Astrophysics Data System (ADS)

    Angheluţǎ, L. M.; Ratoiu, L.; Chelmus, A. I.; Rǎdvan, R.; Petculescu, A.

    2017-05-01

    This paper presents a newly started demonstration project regarding the implementation and validation of an interdisciplinary research model for the Aluniş-Bozioru (Romania) cultural landscape, with the development of an online interactive digital product. This digital product would provide complementary data about the historical monuments and their environment, as well as constant updates and statistical comparison, in order to generate an accurate evaluation of the state of conservation of this specific cultural landscape. Furthermore, the resulting information will contribute to the decision-making process for regional development policies. The project is developed by an interdisciplinary joint team of researchers consisting of technical scientists with extensive experience in advanced non-invasive characterization of cultural heritage (NIRD for Optoelectronics - INOE 2000) and a group of experts in geology and biology (Romanian Academy's "Emil Racoviţǎ" Institute of Speleology - ISER). The resulting scientific data will include: 3D digital models of the selected historical monuments, microclimate monitoring, Ground Penetrating Radar surveys, airborne LIDAR, multispectral and thermal imaging, soil and rock characterization, and environmental studies. The digital product consists of an intuitive website with a database that allows data corroboration, visualization and comparison of the 3D digital models, as well as digital mapping in the GIS system.

  8. The history of the CATH structural classification of protein domains.

    PubMed

    Sillitoe, Ian; Dawson, Natalie; Thornton, Janet; Orengo, Christine

    2015-12-01

    This article presents a historical review of the protein structure classification database CATH. Together with the SCOP database, CATH remains comprehensive and reasonably up-to-date with the now more than 100,000 protein structures in the PDB. We review the expansion of the CATH and SCOP resources to capture predicted domain structures in the genome sequence data and to provide information on the likely functions of proteins mediated by their constituent domains. The establishment of comprehensive function annotation resources has also meant that domain families can be functionally annotated allowing insights into functional divergence and evolution within protein families. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  9. Spatiotemporal database of US congressional elections, 1896–2014

    PubMed Central

    Wolf, Levi John

    2017-01-01

    High-quality historical data about US Congressional elections has long provided common ground for electoral studies. However, advances in geographic information science have recently made it efficient to compile, distribute, and analyze large spatio-temporal data sets on the structure of US Congressional districts. A single spatio-temporal data set that relates US Congressional election results to the spatial extent of the constituencies has not yet been developed. To address this, existing high-quality data sets of election returns were combined with a spatio-temporal data set on Congressional district boundaries to generate a new spatio-temporal database of US Congressional election results that are explicitly linked to the geospatial data about the districts themselves. PMID:28809849
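
    A minimal sketch of the linkage the database provides: joining election returns to district boundary records on a shared (congress, state, district) key. The column names and values are assumptions for illustration, not the published schema.

        import pandas as pd

        returns = pd.DataFrame({
            "congress": [55, 55], "state": ["NY", "NY"],
            "district": [1, 2], "dem_share": [0.48, 0.61]})
        districts = pd.DataFrame({
            "congress": [55, 55], "state": ["NY", "NY"],
            "district": [1, 2], "boundary_wkt": ["POLYGON((...))"] * 2})

        # Join each election result to the district geometry it occurred in.
        linked = returns.merge(districts, on=["congress", "state", "district"])
        print(linked)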

  10. Short term load forecasting using a self-supervised adaptive neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, H.; Pimmel, R.L.

    The authors developed a self-supervised adaptive neural network to perform short term load forecasts (STLF) for a large power system covering a wide service area with several heavy load centers. They used the self-supervised network to extract correlational features from temperature and load data. Using data from the calendar year 1993 as a test case, they found a 0.90 percent error for hour-ahead forecasting and a 1.92 percent error for day-ahead forecasting. These levels of error compare favorably with those obtained by other techniques. The algorithm ran in a couple of minutes on a PC containing an Intel Pentium 120 MHz CPU. Since the algorithm included searching the historical database, training the network, and actually performing the forecasts, this approach provides a real-time, portable, and adaptable STLF.
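
    The report quotes percentage errors without giving the formula; mean absolute percentage error (MAPE) is the standard STLF metric and is assumed in the sketch below, with invented load values.

        import numpy as np

        def mape(actual, forecast):
            """Mean absolute percentage error between two load series."""
            actual, forecast = np.asarray(actual), np.asarray(forecast)
            return 100.0 * np.mean(np.abs((actual - forecast) / actual))

        # Hypothetical hourly loads (MW) and hour-ahead forecasts.
        actual = np.array([980.0, 1020.0, 1100.0, 1050.0])
        forecast = np.array([972.0, 1031.0, 1079.0, 1068.0])
        print(mape(actual, forecast))   # on the order of 1 percent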

  11. The history of nursing services and education in Sri Lanka and the effects on developing professionalism.

    PubMed

    Jayasekara, Rasika S; McCutcheon, Helen

    2006-10-01

    Understanding the evolution of nursing in a country provides perspective on the origins of current successes and dilemmas and enables the development of strategies and plans for future trends in the profession. This article explores the evolution of nursing services and education in Sri Lanka and the effects on developing professionalism in nursing. Internet database searches, personal communication, and published and unpublished literature and reports were reviewed to obtain historical information on nursing services and education in Sri Lanka. The Sri Lankan health system is reviewed, and the establishment of Western medicine in Sri Lanka and its effects on developing institutionalized nursing education is presented, with a focus on the evolution of nursing education. Major challenges for the nursing profession in Sri Lanka are discussed, and some recommendations are shared.

  12. Coupled ocean-atmosphere models feature systematic delay in Indian monsoon onset compared to their atmosphere-only component

    NASA Astrophysics Data System (ADS)

    Turner, Andrew

    2014-05-01

    In this study we examine monsoon onset characteristics in 20th century historical and AMIP integrations of the CMIP5 multi-model database, using the period 1979-2005, which is common to both sets of integrations. While all available observed boundary conditions, including sea-surface temperature (SST), are prescribed in the AMIP integrations, the historical integrations feature ocean-atmosphere models that generate SSTs via air-sea coupled processes. The onset of Indian monsoon rainfall is shown to be systematically earlier in the AMIP integrations, both when comparing groups of models that provide both experiments and in the multi-model ensemble means for each experiment. We also test some common circulation indices of the monsoon onset, including the horizontal shear in the lower troposphere and wind kinetic energy. Since AMIP integrations are forced by observed SSTs, and CMIP5 models are known to have large cold SST biases in the northern Arabian Sea during winter and spring that limit their monsoon rainfall, we relate the delayed onset in the coupled historical integrations to cold Arabian Sea SST biases. This study provides further motivation for solving cold SST biases in the Arabian Sea in coupled models.
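
    One of the circulation onset indices mentioned above, lower-tropospheric wind kinetic energy, can be sketched as follows. The winds, averaging box, and onset threshold are synthetic assumptions for illustration, not the study's actual criteria.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic 850 hPa winds with shape (day, lat, lon) over an
        # assumed Arabian Sea box.
        u = rng.normal(5.0, 2.0, (120, 8, 12))
        v = rng.normal(1.0, 2.0, (120, 8, 12))

        ke = 0.5 * (u ** 2 + v ** 2)      # kinetic energy per unit mass
        ke_index = ke.mean(axis=(1, 2))   # daily area average

        threshold = 20.0                  # assumed onset criterion
        above = np.nonzero(ke_index > threshold)[0]
        print("onset day:", above[0] if above.size else "not reached")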

  13. Genotator: a disease-agnostic tool for genetic annotation of disease.

    PubMed

    Wall, Dennis P; Pivovarov, Rimma; Tong, Mark; Jung, Jae-Yoon; Fusaro, Vincent A; DeLuca, Todd F; Tonellato, Peter J

    2010-10-29

    Disease-specific genetic information has been increasing at rapid rates as a consequence of recent improvements and massive cost reductions in sequencing technologies. Numerous systems designed to capture and organize this mounting sea of genetic data have emerged, but these resources differ dramatically in their disease coverage and genetic depth. With few exceptions, researchers must manually search a variety of sites to assemble a complete set of genetic evidence for a particular disease of interest, a process that is both time-consuming and error-prone. We designed a real-time aggregation tool that provides both comprehensive coverage and reliable gene-to-disease rankings for any disease. Our tool, called Genotator, automatically integrates data from 11 externally accessible clinical genetics resources and uses these data in a straightforward formula to rank genes in order of disease relevance. We tested the accuracy of coverage of Genotator in three separate diseases for which there exist specialty curated databases: Autism Spectrum Disorder, Parkinson's Disease, and Alzheimer Disease. Genotator is freely available at http://genotator.hms.harvard.edu. Genotator demonstrated that most of the 11 selected databases contain unique information about the genetic composition of disease, with 2514 genes found in only one of the 11 databases. These findings confirm that the integration of these databases provides a more complete picture than would be possible from any one database alone. Genotator successfully identified at least 75% of the top-ranked genes for all three of our use cases, including a 90% concordance with the top 40 ranked candidates for Alzheimer Disease. As a meta-query engine, Genotator provides high coverage of both historical genetic research and recent advances in the genetic understanding of specific diseases. As such, Genotator provides a real-time aggregation of ranked data that remains current with the pace of research in the disease fields. Genotator's algorithm appropriately transforms query terms to match the input requirements of each targeted database and accurately resolves named synonyms to ensure full coverage of the genetic results with official nomenclature. Genotator generates an Excel-style output that is consistent across disease queries and readily importable to other applications.
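
    The abstract describes the ranking only as a straightforward aggregation across sources, so the sketch below stands in a simple per-database vote: genes supported by more of the queried databases rank higher. The database names and gene hits are invented, and this scoring rule is an assumption, not Genotator's published formula.

        from collections import Counter

        # Hypothetical per-database query results for one disease.
        hits = {
            "db_one": {"APOE", "PSEN1", "APP"},
            "db_two": {"APOE", "APP", "TREM2"},
            "db_three": {"APOE", "PSEN2"},
        }

        # Score each gene by how many sources report it, then rank.
        score = Counter(g for genes in hits.values() for g in genes)
        ranked = sorted(score.items(), key=lambda kv: (-kv[1], kv[0]))
        print(ranked)   # genes supported by more sources rank first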

  14. Remembering historical victimization: collective guilt for current ingroup transgressions.

    PubMed

    Wohl, Michael J A; Branscombe, Nyla R

    2008-06-01

    The authors examined the consequences of remembering historical victimization for emotional reactions to a current adversary. In Experiment 1, Jewish Canadians who were reminded of the Holocaust accepted less collective guilt for their group's harmful actions toward the Palestinians than those not reminded of their ingroup's past victimization. The extent to which the conflict was perceived to be due to Palestinian terrorism mediated this effect. Experiment 2 illustrated that reminding Jewish people, but not non-Jewish people, of the Holocaust decreased collective guilt for current harm doing compared with when the reminder concerned genocide committed against another group (i.e., Cambodians). In Experiments 3 and 4, Americans experienced less collective guilt for their group's harm doing in Iraq following reminders of either the attacks on September 11th, 2001 or the 1941 Japanese attack on Pearl Harbor compared with a historical victimization reminder that was irrelevant to the ingroup. The authors discuss why remembering the ingroup's past affects responses to outgroups in the present. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  15. Race and ethnicity in the workplace: spotlighting the perspectives of historically stigmatized groups.

    PubMed

    Plaut, Victoria C; Thomas, Kecia M; Hebl, Michelle R

    2014-10-01

    Racial and ethnic identity matter and are salient for people in the workplace--a place where people spend a substantial amount of their time. This special issue brings the workplace into the domain of racial and ethnic minority psychology. It also brings to the study of the workplace a relatively neglected perspective: that of people from historically stigmatized racial and ethnic groups. Though there is, of course, need for more work with different themes, outcomes, and populations, this special issue takes us an important step in the direction of understanding better and giving voice to the experiences of racial and ethnic minorities in the workplace. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  16. WOVOdat as a worldwide resource to improve eruption forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, Christina; Costa, Fidel; Zar Win Nang, Thin; Tan, Karine; Newhall, Chris; Ratdomopurbo, Antonius

    2015-04-01

    During periods of volcanic unrest, volcanologists need to interpret signs of unrest to be able to forecast whether an eruption is likely to occur. Some volcanic eruptions are preceded by signs such as seismic activity, surface deformation, or gas emissions; but not all volcanoes give signs, and not all signs are followed by an eruption. Volcanoes behave differently. Precursory signs of an eruption are sometimes very short, less than an hour, but can also extend over weeks, months, or even years. Some volcanoes are regularly active and closely monitored, while others aren't. Often, the record of precursors to historical eruptions of a volcano isn't enough to allow a forecast of its future activity. Therefore, volcanologists must refer to monitoring data of unrest and eruptions at similar volcanoes. WOVOdat is the World Organization of Volcano Observatories' Database of volcanic unrest - an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. We have so far incorporated about 15% of worldwide unrest data into WOVOdat, covering more than 100 eruption episodes, including: volcanic background data, eruptive histories, monitoring data (seismic, deformation, gas, hydrology, thermal, fields, and meteorology), monitoring metadata, and supporting data such as reports, images, maps and videos. Nearly all data in WOVOdat are time-stamped and geo-referenced. Along with building the database, WOVOdat is also developing web tools to help users query, visualize, and compare data, which can further be used for probabilistic eruption forecasting. Reference to WOVOdat will be especially helpful at volcanoes that have not erupted in historical or 'instrumental' time and for which no previous data therefore exist. The more data in WOVOdat, the more useful it will be. We actively solicit relevant data contributions from volcano observatories, other institutions, and individual researchers. Detailed information and documentation about the database and how to use it can be found at www.wovodat.org.

  17. Fossil-Fuel CO2 Emissions Database and Exploration System

    NASA Astrophysics Data System (ADS)

    Krassovski, M.; Boden, T.; Andres, R. J.; Blasing, T. J.

    2012-12-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) quantifies the release of carbon from fossil-fuel use and cement production at global, regional, and national spatial scales. The CDIAC emission time series estimates are based largely on annual energy statistics published at the national level by the United Nations (UN). CDIAC has developed a relational database to house the collected data and information, and a web-based interface to help users worldwide identify, explore and download desired emission data. The available information is divided into two major groups: time series and gridded data. The time series data are offered at global, regional and national scales. Publications containing historical energy statistics make it possible to estimate fossil-fuel CO2 emissions back to 1751. Etemad et al. (1991) published a summary compilation that tabulates coal, brown coal, peat, and crude oil production by nation and year. Footnotes in the Etemad et al. (1991) publication extend the energy statistics time series back to 1751. Summary compilations of fossil fuel trade were published by Mitchell (1983, 1992, 1993, 1995). Mitchell's work tabulates solid and liquid fuel imports and exports by nation and year. These pre-1950 production and trade data were digitized, and CO2 emission calculations were made following the procedures discussed in Marland and Rotty (1984) and Boden et al. (1995). The gridded data comprise annual and monthly estimates. The annual data present a time series recording 1° latitude by 1° longitude CO2 emissions in units of million metric tons of carbon per year from anthropogenic sources for 1751-2008. The monthly fossil-fuel CO2 emission estimates for 1950-2008 provided in this database are derived from time series of global, regional, and national fossil-fuel CO2 emissions (Boden et al. 2011), the references therein, and the methodology described in Andres et al. (2011). The data accessible here take these tabular, national, mass-emissions data and distribute them spatially on a one degree latitude by one degree longitude grid. The within-country spatial distribution is achieved through a fixed population distribution as reported in Andres et al. (1996). This presentation introduces the newly built database and web interface, reflects the present state and functionality of the Fossil-Fuel CO2 Emissions Database and Exploration System, and outlines future plans for expansion.
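
    A sketch of the within-country gridding step described above: a national emissions total is spread over a one-degree grid in proportion to a fixed population field, conserving the national total. The arrays and numbers are synthetic; the real system uses the population distribution of Andres et al. (1996).

        import numpy as np

        national_total = 1500.0                 # national emissions, e.g. kt C
        rng = np.random.default_rng(2)
        population = rng.random((180, 360))     # 1-degree population proxy
        population[population < 0.2] = 0.0      # zero out cells outside the country

        # Mass-conserving split of the national total by population share.
        grid = national_total * population / population.sum()
        assert np.isclose(grid.sum(), national_total)
        print(grid.max())                       # emissions in the densest cell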

  18. Visualization Component of Vehicle Health Decision Support System

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph; Turmon, Michael; Stough, Timothy; Siegel, Herbert; Walter, Patrick; Kurt, Cindy

    2008-01-01

    This software is the visualization front-end of a Decision Support System (DSS) that also includes an analysis engine linked to vehicle telemetry and a database of learned models for known behaviors. Because the display is graphical rather than text-based, the summarization it provides has a greater information density on one screen for evaluation by a flight controller. The tool provides a system-level visualization of the state of a vehicle, with drill-down capability for more details and interfaces to separate analysis algorithms and sensor data streams. The system-level view is a 3D rendering of the vehicle, with sensors represented as icons tied to appropriate positions within the vehicle body and colored to indicate sensor state (e.g., normal, warning, or anomalous). The sensor data are received via an Information Sharing Protocol (ISP) client that connects to an external server for real-time telemetry. Users can interactively pan, zoom, and rotate this 3D view, as well as select sensors for a detail plot of the associated time series data. Subsets of the plotted data can be selected and sent to an external analysis engine, either to search for a similar time series in a historical database or to detect anomalous events. The system overview and plotting capabilities are completely general in that they can be applied to any vehicle instrumented with a collection of sensors. This visualization component can interface with the ISP for data streams used by NASA's Mission Control Center at Johnson Space Center. In addition, it can connect to, and display results from, separate analysis engine components that identify anomalies or that search for past instances of similar behavior. This software supports NASA's Software, Intelligent Systems, and Modeling element in the Exploration Systems Research and Technology Program by augmenting the capability of human flight controllers to make correct decisions, thus increasing safety and reliability. It was designed specifically as a tool for NASA's flight controllers to monitor the International Space Station and a future Crew Exploration Vehicle.
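
    One small but central piece of the system-level view is the mapping from a sensor's state to an icon color at a position in the vehicle model. The sketch below shows one plausible shape for that structure; the state names, colors, and sensors are assumptions for illustration, not taken from the actual flight-control software.

        # Hypothetical sketch of the sensor-state -> icon-color mapping behind
        # a system-level vehicle view. States and colors are assumed.
        from dataclasses import dataclass
        from enum import Enum

        class SensorState(Enum):
            NORMAL = "green"
            WARNING = "yellow"
            ANOMALOUS = "red"

        @dataclass
        class SensorIcon:
            name: str
            x: float          # position within the 3D vehicle model
            y: float
            z: float
            state: SensorState = SensorState.NORMAL

            def color(self) -> str:
                return self.state.value

        sensors = [SensorIcon("cabin_pressure", 1.2, 0.4, 2.0),
                   SensorIcon("coolant_temp", -0.5, 1.1, 0.8,
                              SensorState.WARNING)]
        for s in sensors:
            print(f"{s.name}: render at ({s.x}, {s.y}, {s.z}) in {s.color()}")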

  19. United States Army Medical Materiel Development Activity: 1997 Annual Report.

    DTIC Science & Technology

    1997-01-01

    business planning and execution information management system (Project Management Division Database (PMDD) and Product Management Database System (PMDS...MANAGEMENT • Project Management Division Database (PMDD), Product Management Database System (PMDS), and Special Users Database System: The existing...System (FMS), were investigated. New Product Managers and Project Managers were added into PMDS and PMDD. A separate division, Support, was

  20. Estimating historical anthropogenic global sulfur emission patterns for the period 1850-1990

    NASA Astrophysics Data System (ADS)

    Lefohn, Allen S.; Husar, Janja D.; Husar, Rudolf B.

    Establishing a reliable regional emission inventory of sulfur as a function of time is important when assessing the possible effects of global change and acid rain. This study developed a database of annual estimates of national sulfur emissions from 1850 to 1990. A common methodology was applied across all years and countries, allowing global totals to be produced by summing the national estimates. The consistent approach facilitates modification of the database and observation of changes at national, regional, or global levels. The emission estimates were based on net production (i.e., production plus imports minus exports), sulfur content, and sulfur retention for each country's production activities. Because the estimates were built from these common inputs, our database offers an opportunity to independently compare our results with estimates compiled country by country. The fine temporal resolution clearly shows emission changes associated with specific historical events (e.g., wars and depressions) on a regional, national, or global basis. The spatial pattern of emissions shows that the US, the USSR, and China were the world's main sulfur emitters in 1990, together accounting for approximately 50% of the total. The USSR and the US appear to have stabilized their sulfur emissions over the past 20 years, and the recent increases in global sulfur emissions are linked to rapid emission growth in China. Sulfur emissions have been reduced in some cases by switching from high- to low-sulfur coals. Flue gas desulfurization (FGD) has apparently made important contributions to emission reductions in only a few countries, such as Germany.
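
    The per-country accounting described above reduces to a one-line formula, emissions = (production + imports - exports) x sulfur content x (1 - retention). The sketch below applies it with invented figures, purely to make the arithmetic concrete.

        # Minimal sketch of the per-country emission estimate described above:
        # E = (production + imports - exports) * sulfur_content * (1 - retention).
        # All figures below are invented for illustration.

        def sulfur_emissions(production_t, imports_t, exports_t,
                             sulfur_content, retention):
            """Annual sulfur emissions (tonnes S) from one fuel in one country."""
            net_production = production_t + imports_t - exports_t  # net consumption
            return net_production * sulfur_content * (1.0 - retention)

        # Example: 10 Mt coal consumed net, 1.5% sulfur by mass, 5% retained in ash.
        print(sulfur_emissions(9_000_000, 2_000_000, 1_000_000,
                               sulfur_content=0.015, retention=0.05))
        # -> 142500.0 tonnes of sulfur emitted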
