A User's Applications of Imaging Techniques: The University of Maryland Historic Textile Database.
ERIC Educational Resources Information Center
Anderson, Clarita S.
1991-01-01
Describes the incorporation of textile images into the University of Maryland Historic Textile Database by a computer user rather than a computer expert. Selection of a database management system is discussed, and PICTUREPOWER, a system that integrates photographic quality images with text and numeric information in databases, is described. (three…
New Technology, New Questions: Using an Internet Database in Chemistry.
ERIC Educational Resources Information Center
Hayward, Roger
1996-01-01
Describes chemistry software that is part of a balanced educational program. Provides several applications including graphs of various relationships among the elements. Includes a brief historical treatment of the periodic table and compares the traditional historical approach with perspectives gained by manipulating an electronic database. (DDR)
NASA Astrophysics Data System (ADS)
Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.
2014-12-01
"Prediflood" is a database of historical floods that occurred in Catalonia (NE Iberian Peninsula), between the 11th century and the 21st century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, suitable for use in multidisciplinary flood analysis techniques, such as meteorological or hydraulic reconstructions.
Spatial cyberinfrastructures, ontologies, and the humanities.
Sieber, Renee E; Wellen, Christopher C; Jin, Yuan
2011-04-05
We report on research into building a cyberinfrastructure for Chinese biographical and geographic data. Our cyberinfrastructure contains (i) the McGill-Harvard-Yenching Library Ming Qing Women's Writings database (MQWW), the only online database on historical Chinese women's writings, (ii) the China Biographical Database, the authority for Chinese historical people, and (iii) the China Historical Geographical Information System, one of the first historical geographic information systems. Key to this integration is that linked databases retain separate identities as bases of knowledge, while they possess sufficient semantic interoperability to allow for multidatabase concepts and to support cross-database queries on an ad hoc basis. Computational ontologies create underlying semantics for database access. This paper focuses on the spatial component in a humanities cyberinfrastructure, which includes issues of conflicting data, heterogeneous data models, disambiguation, and geographic scale. First, we describe the methodology for integrating the databases. Then we detail the system architecture, which includes a tier of ontologies and schema. We describe the user interface and applications that allow for cross-database queries. For instance, users should be able to analyze the data, examine hypotheses on spatial and temporal relationships, and generate historical maps with datasets from MQWW for research, teaching, and publication on Chinese women writers, their familial relations, publishing venues, and the literary and social communities. Last, we discuss the social side of cyberinfrastructure development, as people are considered to be as critical as the technical components for its success.
Repurposing historical control clinical trial data to provide safety context.
Bhuyan, Prakash; Desai, Jigar; St Louis, Matthew; Carlsson, Martin; Bowen, Edward; Danielson, Mark; Cantor, Michael N
2016-02-01
Billions of dollars spent, millions of subject-hours of clinical trial experience and an abundance of archived study-level data, yet why are historical data underutilized? We propose that historical data can be aggregated to provide safety, background incidence rate and context to improve the evaluation of new medicinal products. Here, we describe the development and application of the eControls database, which is derived from the control arms of studies of licensed products, and discuss the challenges and potential solutions to the proper application of historical data to help interpret product safety.
NASA Astrophysics Data System (ADS)
Barriendos, Mariano; Carles Balasch Solanes, Josep; Tuset, Jordi; Lluís Ruiz-Bellet, Josep
2014-05-01
Available information on historical floods can improve the management of hydroclimatic hazards. This approach is useful in ungauged basins or where instrumental data series are short. At the same time, flood risk is increasing due to both the expansion of human land occupation and the modification of rainfall patterns under the present global climate change scenario. Within the Prediflood Project, we have designed an integrated database of historical floods in Catalonia with the aim of feeding data to: 1) meteorological reconstruction and modelling; 2) hydrological and hydraulic reconstruction; and 3) evaluation of the human impacts of these floods. The first steps of the database design focus on spatial location and on the quality of the data sources, at three levels: 1) historical documentary sources and newspapers contemporary with the floods; 2) local historiography; 3) technical reports. After the application of historiographical methodologies, more than 2300 flood records have been added to the database so far. Although the completion of the database is still a work in progress, the first analyses are already underway and focus on the largest floods, those with catastrophic effects simultaneously on more than 15 catchments: November 1617, October 1787, September 1842, May 1853, September 1874, January 1898, October 1907, October 1940, September 1962, November 1982, October 1994 and others.
ERIC Educational Resources Information Center
Nadal, Gloria Claveria; Lancis, Carlos Sanchez
1997-01-01
Notes that the use of databases in studying the history of a language is a method that allows for substantial improvement in investigative quality. Illustrates this with the application of the method to two studies of the history of Spanish developed in the Language and Information Seminar of the Independent University…
AR Based App for Tourist Attraction in ESKİ ÇARŞI (Safranbolu)
NASA Astrophysics Data System (ADS)
Polat, Merve; Rakıp Karaş, İsmail; Kahraman, İdris; Alizadehashrafi, Behnam
2016-10-01
This research deals with 3D modeling of historical and heritage landmarks of Safranbolu that are registered by UNESCO. It is an Augmented Reality (AR) based project that triggers virtual three-dimensional (3D) models, cultural music, historical photos, artistic features and animated text information. The aim is to propose a GIS-based approach in which these features are added to the system as attribute data in a relational database. The database will be available in an AR-based application to provide information for tourists.
A Support Database System for Integrated System Health Management (ISHM)
NASA Technical Reports Server (NTRS)
Schmalzel, John; Figueroa, Jorge F.; Turowski, Mark; Morris, John
2007-01-01
The development, deployment, operation and maintenance of Integrated Systems Health Management (ISHM) applications require the storage and processing of tremendous amounts of low-level data. This data must be shared in a secure and cost-effective manner between developers, and processed within several heterogeneous architectures. Modern database technology allows this data to be organized efficiently, while ensuring the integrity and security of the data. The extensibility and interoperability of the current database technologies also allows for the creation of an associated support database system. A support database system provides additional capabilities by building applications on top of the database structure. These applications can then be used to support the various technologies in an ISHM architecture. This presentation and paper propose a detailed structure and application description for a support database system, called the Health Assessment Database System (HADS). The HADS provides a shared context for organizing and distributing data as well as a definition of the applications that provide the required data-driven support to ISHM. This approach provides another powerful tool for ISHM developers, while also enabling novel functionality. This functionality includes: automated firmware updating and deployment, algorithm development assistance and electronic datasheet generation. The architecture for the HADS has been developed as part of the ISHM toolset at Stennis Space Center for rocket engine testing. A detailed implementation has begun for the Methane Thruster Testbed Project (MTTP) in order to assist in developing health assessment and anomaly detection algorithms for ISHM. The structure of this implementation is shown in Figure 1. The database structure consists of three primary components: the system hierarchy model, the historical data archive and the firmware codebase. The system hierarchy model replicates the physical relationships between system elements to provide the logical context for the database. The historical data archive provides a common repository for sensor data that can be shared between developers and applications. The firmware codebase is used by the developer to organize the intelligent element firmware into atomic units which can be assembled into complete firmware for specific elements.
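A minimal sketch of the three-component structure described above (system hierarchy model, historical data archive, firmware codebase) is given below in Python with sqlite3; every table and column name is an illustrative assumption, not the actual HADS schema.

```python
# A toy schema mirroring the three components described above. Table and
# column names are hypothetical; the real HADS schema is not reproduced here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- System hierarchy model: each element points to its physical parent.
CREATE TABLE element (
    element_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    parent_id  INTEGER REFERENCES element(element_id)
);
-- Historical data archive: sensor readings in the context of the hierarchy.
CREATE TABLE sensor_reading (
    element_id INTEGER REFERENCES element(element_id),
    ts         TEXT,    -- ISO-8601 timestamp
    value      REAL
);
-- Firmware codebase: atomic units assembled into element-specific firmware.
CREATE TABLE firmware_unit (
    unit_id    INTEGER PRIMARY KEY,
    element_id INTEGER REFERENCES element(element_id),
    version    TEXT,
    blob       BLOB
);
""")

# Hypothetical example: a thruster element under a test stand, with one reading.
conn.execute("INSERT INTO element VALUES (1, 'test_stand', NULL)")
conn.execute("INSERT INTO element VALUES (2, 'methane_thruster', 1)")
conn.execute("INSERT INTO sensor_reading VALUES (2, '2007-05-01T12:00:00', 101.3)")
```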
A Database of Historical Information on Landslides and Floods in Italy
NASA Astrophysics Data System (ADS)
Guzzetti, F.; Tonelli, G.
2003-04-01
For the past 12 years we have maintained and updated a database of historical information on landslides and floods in Italy, known as the National Research Council's AVI (Damaged Urban Areas) Project archive. The database was originally designed to respond to a specific request of the Minister of Civil Protection, and was aimed at helping the regional assessment of landslide and flood risk in Italy. The database was first constructed in 1991-92 to cover the period 1917 to 1990. Information on damaging landslide and flood events was collected by searching archives, by screening thousands of newspaper issues, by reviewing the existing technical and scientific literature on landslides and floods in Italy, and by interviewing landslide and flood experts. The database was then updated chiefly through the analysis of hundreds of newspaper articles, and it now covers systematically the period 1917 to 1998, and non-systematically the periods 1900 to 1916 and 1999 to 2002. Non-systematic information on landslide and flood events predating the 20th century is also present in the database. The database currently contains information on more than 32,000 landslide events that occurred at more than 25,700 sites, and on more than 28,800 flood events that occurred at more than 15,600 sites. After a brief outline of the history and evolution of the AVI Project archive, we present and discuss: (a) the present structure of the database, including the hardware and software solutions adopted to maintain, manage, use and disseminate the information stored in the database; (b) the type and amount of information stored in the database, including an estimate of its completeness; and (c) examples of recent applications of the database, including a web-based GIS system that shows the location of sites historically affected by landslides and floods, and an estimate of geo-hydrological (i.e., landslide and flood) risk in Italy based on the available historical information.
Margaret R. Holdaway
1994-01-01
Describes Geo-CLM, a computer application (for Mac or DOS) whose primary aim is to perform multiple kriging runs to interpolate the historic climatic record at research plots in the Lake States. It is an exploration and analysis tool. Additional capabilities include climatic databases, a flexible test mode, cross validation, lat/long conversion, English/metric units,...
A dynamic clinical dental relational database.
Taylor, D; Naguib, R N G; Boulton, S
2004-09-01
The traditional approach to relational database design is based on the logical organization of data into a number of related normalized tables. One assumption is that the nature and structure of the data are known at the design stage. In the case of designing a relational database to store historical dental epidemiological data from individual clinical surveys, the structure of the data is not known until the data is presented for inclusion into the database. This paper addresses the issues concerned with the theoretical design of a dynamic clinical database capable of adapting its internal table structure to accommodate clinical survey data, and presents a prototype database application capable of processing, displaying, and querying the dental data.
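The core idea, deriving the table structure from each survey at load time rather than at design time, can be sketched as follows (Python with sqlite3; the survey fields and the crude type inference are hypothetical, not taken from the paper).

```python
# Sketch: create a table whose columns are inferred from an incoming survey,
# since the structure is unknown until the data is presented. All names and
# the simple type inference below are illustrative only.
import sqlite3

def load_survey(conn, survey_name, records):
    """records: list of dicts with heterogeneous, survey-specific fields."""
    columns = sorted({key for rec in records for key in rec})
    def sql_type(col):
        vals = [r[col] for r in records if col in r]
        return "REAL" if all(isinstance(v, (int, float)) for v in vals) else "TEXT"
    ddl = ", ".join(f'"{c}" {sql_type(c)}' for c in columns)
    conn.execute(f'CREATE TABLE "{survey_name}" ({ddl})')
    for rec in records:
        cols = ", ".join(f'"{c}"' for c in rec)
        marks = ", ".join("?" for _ in rec)
        conn.execute(f'INSERT INTO "{survey_name}" ({cols}) VALUES ({marks})',
                     list(rec.values()))

conn = sqlite3.connect(":memory:")
load_survey(conn, "survey_1998", [
    {"subject": "A01", "dmft": 3, "fluoride_area": "yes"},  # hypothetical fields
    {"subject": "A02", "dmft": 0},
])
```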
The HISTMAG database: combining historical, archaeomagnetic and volcanic data
NASA Astrophysics Data System (ADS)
Arneitz, Patrick; Leonhardt, Roman; Schnepp, Elisabeth; Heilig, Balázs; Mayrhofer, Franziska; Kovacs, Peter; Hejda, Pavel; Valach, Fridrich; Vadasz, Gergely; Hammerl, Christa; Egli, Ramon; Fabian, Karl; Kompein, Niko
2017-09-01
Records of the past geomagnetic field can be divided into two main categories: instrumental historical observations on the one hand, and field estimates based on the magnetization acquired by rocks, sediments and archaeological artefacts on the other. In this paper, a new database combining historical, archaeomagnetic and volcanic records is presented. HISTMAG is a relational database, implemented in MySQL, and can be accessed via a web-based interface (http://www.conrad-observatory.at/zamg/index.php/data-en/histmag-database). It combines available global historical data compilations covering the last ∼500 yr as well as archaeomagnetic and volcanic data collections from the last 50 000 yr. Furthermore, new historical and archaeomagnetic records, mainly from central Europe, have been acquired. In total, 190 427 records are currently available in the HISTMAG database, the majority of which are historical declination measurements (155 525). The original database structure was complemented by new fields that allow for a detailed description of the different data types, and a user-comment function provides the possibility of scientific discussion about individual records. The HISTMAG database therefore supports thorough reliability and uncertainty assessments of the widely differing data sets, which are an essential basis for geomagnetic field reconstructions. A database analysis comparing declination records derived from compass roses on historical geographical maps with other historical records revealed a systematic offset, whereas maps created for mining activities proved to be a reliable source.
Overview of Historical Earthquake Document Database in Japan and Future Development
NASA Astrophysics Data System (ADS)
Nishiyama, A.; Satake, K.
2014-12-01
In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not used effectively by researchers, because they are contaminated by low-reliability historical records and are difficult to search by keyword and date. To overcome these problems and to promote historical earthquake studies in Japan, construction of text databases started in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its authors investigated the source books or original texts of historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another project compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast in Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century) and constructed text and seismic-intensity databases. These are now published on the web (in Japanese only). However, only about 9% of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents under a research program that started in 2014. The specification of the database will be similar to that of the previous ones. We also plan to combine this database with a liquefaction-traces database, to be constructed by another research program, by adding the location information described in the historical documents. The resulting database would be used to estimate the distributions of seismic intensities and tsunami heights.
Applications of Historical Analyses in Combat Modelling
2011-12-01
Examines applications of historical analyses to combat modelling drawing on a database of historical battles, including the effect of incorporating a fractal model of spatial dispersion on casualty values [6], and contrasts this with a reductionist view in which a system is seen as no more than "the sum of its parts", where all phenomena can be explained in terms of other, more fundamental, phenomena. (Recoverable citation fragments: P. Dexter, Journal of Battlefield Technology 6, 33-39, 2003; long-term behaviour of solutions of the Lotka-Volterra system under small perturbations.)
Service Management Database for DSN Equipment
NASA Technical Reports Server (NTRS)
Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Wolgast, Paul; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed
2009-01-01
This data- and event-driven persistent storage system leverages commercial software provided by Oracle for portability, ease of maintenance, scalability, and ease of integration with embedded, client-server, and multi-tiered applications. In this role, the Service Management Database (SMDB) is a key component of the overall end-to-end process involved in the scheduling, preparation, and configuration of the Deep Space Network (DSN) equipment needed to perform the various telecommunication services the DSN provides to its customers worldwide. SMDB makes efficient use of triggers, stored procedures, queuing functions, e-mail capabilities, data management, and Java integration features provided by the Oracle relational database management system. SMDB uses a third-normal-form schema design that allows for simple data maintenance procedures and thin layers of integration with client applications. The software provides an integrated event logging system with the ability to publish events to a JMS messaging system for synchronous and asynchronous delivery to subscribed applications. It provides a structured classification of events and application-level messages stored in database tables that are accessible by monitoring applications for real-time monitoring or for troubleshooting and analysis over historical archives.
NASA Technical Reports Server (NTRS)
1993-01-01
All the options in the NASA VEGetation Workbench (VEG) make use of a database of historical cover types. This database contains results from experiments by scientists on a wide variety of different cover types. The learning system uses the database to provide positive and negative training examples of classes that enable it to learn distinguishing features between classes of vegetation. All the other VEG options use the database to estimate the error bounds involved in the results obtained when various analysis techniques are applied to the sample of cover type data that is being studied. In the previous version of VEG, the historical cover type database was stored as part of the VEG knowledge base. This database has been removed from the knowledge base and is now stored as a series of flat files that are external to VEG. An interface between VEG and these files was provided. The interface allows the user to select which files of historical data to use. The files are then read, and the data are stored in Knowledge Engineering Environment (KEE) units using the same organization of units as in the previous version of VEG. The interface also allows the user to delete some or all of the historical database units from VEG and load new historical data from a file. This report summarizes the use of the historical cover type database in VEG, describes the new interface to the files containing the historical data, and describes the minor changes that were made to VEG to enable the externally stored database to be used. Test runs exercising the new interface and the operation of VEG with historical data loaded from external files are described. Task F was completed. A Sun cartridge tape containing the KEE and Common Lisp code for the new interface and the modified version of the VEG knowledge base was delivered to the NASA GSFC technical representative.
NASA Astrophysics Data System (ADS)
Feller, Jens; Feller, Sebastian; Mauersberg, Bernhard; Mergenthaler, Wolfgang
2009-09-01
Many applications in plant management require close monitoring of equipment performance, in particular with the objective of preventing certain critical events. At each point in time, the information available to classify the criticality of the process is represented by the historical signal database together with the current measurement. This paper presents an approach to detecting and predicting critical events based on pattern recognition and discriminant analysis.
The Ogallala Agro-Climate Tool (Technical Description)
USDA-ARS?s Scientific Manuscript database
A Visual Basic agro-climate application capable of estimating irrigation demand and crop water use over the Ogallala Aquifer region is described here. The application’s meteorological database consists of daily precipitation and temperature data from 141 U.S. Historical Climatology Network stations ...
Using Geocoded Databases in Teaching Urban Historical Geography.
ERIC Educational Resources Information Center
Miller, Roger P.
1986-01-01
Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)
NASA Astrophysics Data System (ADS)
Palumbo, Gaetano; Powlesland, Dominic
1996-12-01
The Getty Conservation Institute is exploring the feasibility of using remote sensing associated with a geographic database management system (GDBMS) in order to provide archaeological and historic site managers with sound evaluations of the tools available for site and information management. The World Heritage Site of Chaco Canyon, New Mexico, a complex of archaeological sites dating to the 10th to 13th centuries AD, was selected as a test site. Information from excavations conducted there since the 1930s, and a range of documentation generated by the National Park Service, was gathered. NASA's John C. Stennis Space Center contributed multispectral data of the area, and the Jet Propulsion Laboratory contributed data from the ATLAS (airborne terrestrial applications sensor) and CAMS (calibrated airborne multispectral scanner) scanners. Initial findings show that, while 'automatic monitoring systems' will probably never be a reality, excellent results are possible through careful comparison of historic and modern photographs and digital analysis of remotely sensed data.
A Global Geospatial Database of 5000+ Historic Flood Event Extents
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.
2017-12-01
A key dataset that is missing for global flood model validation and for understanding historic spatial flood vulnerability is a global geo-database of historical flood event extents. Decades of Earth-observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using the MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also suggest how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. The authors are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.
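A minimal sketch of querying such a collection by area and date through the Earth Engine Python API follows; the asset ID and region below are assumptions for illustration and should be checked against the Earth Engine catalogue.

```python
# Query flood events by region and period, as the abstract describes.
# The asset ID is assumed to be the Global Flood Database's catalogue entry;
# verify it in the Earth Engine catalogue before relying on it.
import ee

ee.Initialize()  # requires an authenticated Earth Engine account

floods = ee.ImageCollection('GLOBAL_FLOOD_DB/MODIS_EVENTS/V1')  # assumed asset ID

# Region and period of interest (hypothetical example values).
roi = ee.Geometry.Rectangle([-60.0, -36.0, -57.0, -33.0])
subset = (floods
          .filterDate('2000-01-01', '2017-12-31')
          .filterBounds(roi))

print('Events found:', subset.size().getInfo())

# Maximum observed flood extent over the period; 'flooded' is assumed to be
# the band flagging inundated pixels in each event image.
max_extent = subset.select('flooded').max()
```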
The New Zealand Tsunami Database: historical and modern records
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.
2016-12-01
A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years, there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project, with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event have been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.
Historical hydrology and database on flood events (Apulia, southern Italy)
NASA Astrophysics Data System (ADS)
Lonigro, Teresa; Basso, Alessia; Gentile, Francesco; Polemio, Maurizio
2014-05-01
Historical data about floods represent an important tool for the comprehension of hydrological processes, for the estimation of hazard scenarios for Civil Protection purposes, and as a basis for rational land use management, especially in karstic areas where time series of river flows are not available and river drainage is rare. The research shows the importance of improving an existing flood database with a historical approach, aimed at collecting past or historical flood events, in order to better assess the occurrence trend of floods, in this case for the Apulia region (southern Italy). The main source of records of flood events for Apulia was the AVI database (the acronym refers to Italian damaged areas), an existing Italian database that collects data concerning damaging floods from 1918 to 1996. The database was expanded by consulting newspapers, publications, and technical reports from 1996 to 2006. To extend the temporal range, further data were collected by searching the archives of regional libraries. About 700 useful news items from 17 different local newspapers were found for the period 1876 to 1951. From a critical analysis of the news items collected from 1876 to 1952, only 437 were useful for the implementation of the Apulia database. The screening of these items showed the occurrence of about 122 flood events in the entire region. The district of Bari, the regional main town, represents the area in which the greatest number of events occurred; the historical analysis confirms this area as flood-prone. There is an overlapping period (from 1918 to 1952) between the old AVI database and the new historical dataset obtained from newspapers. With regard to this period, the historical research has highlighted new flood events not reported in the existing AVI database, and it has also allowed more details to be added to the events already recorded. This study shows that the database is a dynamic instrument, which allows a continuous implementation of data, even in real time. More details on previous results of this research activity were recently published (Polemio, 2010; Basso et al., 2012; Lonigro et al., 2013).
References: Basso A., Lonigro T. and Polemio M. (2012) "The improvement of historical database on damaging hydrogeological events in the case of Apulia (Southern Italy)". Rendiconti online della Società Geologica Italiana, 21: 379-380; Lonigro T., Basso A. and Polemio M. (2013) "Historical database on damaging hydrogeological events in Apulia region (Southern Italy)". Rendiconti online della Società Geologica Italiana, 24: 196-198; Polemio M. (2010) "Historical floods and a recent extreme rainfall event in the Murgia karstic environment (Southern Italy)". Zeitschrift für Geomorphologie, 54(2): 195-219.
Keeping Track of Our Treasures: Managing Historical Data with Relational Database Software.
ERIC Educational Resources Information Center
Gutmann, Myron P.; And Others
1989-01-01
Describes the way a relational database management system manages a large historical data collection project. Shows that such databases are practical to construct. States that the programming tasks involved are not for beginners, but the rewards of having the data organized are worthwhile. (GG)
Labay, Ben; Cohen, Adam E; Sissel, Blake; Hendrickson, Dean A; Martin, F Douglas; Sarkar, Sahotra
2011-01-01
Accurate establishment of baseline conditions is critical to successful management and habitat restoration. We demonstrate the ability to robustly estimate historical fish community composition and assess the current status of the urbanized Barton Creek watershed in central Texas, U.S.A. Fish species were surveyed in 2008 and the resulting data compared to three sources of fish occurrence information: (i) historical records from a museum specimen database and literature searches; (ii) a nearly identical survey conducted 15 years earlier; and (iii) a modeled historical community constructed with species distribution models (SDMs). This holistic approach, and especially the application of SDMs, allowed us to discover that the fish community in Barton Creek was more diverse than the historical data and survey methods alone indicated. Sixteen native species with high modeled probability of occurrence within the watershed were not found in the 2008 survey; seven of these were not found in either survey or in any of the historical collection records. Our approach allowed us to more rigorously establish the true baseline for the pre-development fish fauna and then to more accurately assess trends and develop hypotheses regarding factors driving current fish community composition to better inform management decisions and future restoration efforts. Smaller, urbanized freshwater systems, like Barton Creek, typically have a relatively poor historical biodiversity inventory coupled with long histories of alteration, and thus there is a propensity for land managers and researchers to apply inaccurate baseline standards. Our methods provide a way around that limitation by using SDMs derived from larger and richer biodiversity databases of a broader geographic scope. Broadly applied, we propose that this technique has potential to overcome limitations of popular bioassessment metrics (e.g., IBI) to become a versatile and robust management tool for determining status of freshwater biotic communities.
2006 Compilation of Alaska Gravity Data and Historical Reports
Saltus, Richard W.; Brown, Philip J.; Morin, Robert L.; Hill, Patricia L.
2008-01-01
Gravity anomalies provide fundamental geophysical information about Earth structure and dynamics. To increase geologic and geodynamic understanding of Alaska, the U.S. Geological Survey (USGS) has collected and processed Alaska gravity data for the past 50 years. This report introduces and describes an integrated, State-wide gravity database and provides accompanying gravity calculation tools to assist in its application. Additional information includes gravity base station descriptions and digital scans of historical USGS reports. The gravity calculation tools enable the user to reduce new gravity data in a consistent manner for combination with the existing database. This database has sufficient resolution to define the regional gravity anomalies of Alaska. Interpretation of regional gravity anomalies in parts of the State is hampered by the lack of local isostatic compensation in both southern and northern Alaska. However, when filtered appropriately, the Alaska gravity data show regional features having geologic significance. These features include gravity lows caused by low-density rocks of Cenozoic basins, flysch belts, and felsic intrusions, as well as many gravity highs associated with high-density mafic and ultramafic complexes.
NASA Astrophysics Data System (ADS)
Maurer, Joshua; Rupper, Summer
2015-10-01
Declassified historical imagery from the Hexagon spy satellite database has near-global coverage, yet remains a largely untapped resource for geomorphic change studies. Unavailable satellite ephemeris data make DEM (digital elevation model) extraction difficult in terms of time and accuracy. A new fully-automated pipeline for DEM extraction and image orthorectification is presented which yields accurate results and greatly increases efficiency over traditional photogrammetric methods, making the Hexagon image database much more appealing and accessible. A 1980 Hexagon DEM is extracted and geomorphic change computed for the Thistle Creek Landslide region in the Wasatch Range of North America to demonstrate an application of the new method. Surface elevation changes resulting from the landslide show an average elevation decrease of 14.4 ± 4.3 m in the source area, an increase of 17.6 ± 4.7 m in the deposition area, and a decrease of 30.2 ± 5.1 m resulting from a new roadcut. Two additional applications of the method include volume estimates of material excavated during the Mount St. Helens volcanic eruption and the volume of net ice loss over a 34-year period for glaciers in the Bhutanese Himalayas. These results show the value of Hexagon imagery in detecting and quantifying historical geomorphic change, especially in regions where other data sources are limited.
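The elevation-change computation behind such figures reduces to differencing two co-registered DEMs over a mask of the area of interest; a minimal sketch follows (Python with NumPy; the rasters, mask and uncertainty estimate are placeholders).

```python
# Difference two co-registered DEMs and report the mean change over a region.
# dem_1980 / dem_2014 and the source-area mask are placeholders for real rasters.
import numpy as np

rng = np.random.default_rng(1)
dem_1980 = rng.normal(2500, 50, size=(200, 200))          # placeholder raster (m)
dem_2014 = dem_1980 + rng.normal(-2, 1, size=(200, 200))  # placeholder raster (m)
source_mask = np.zeros((200, 200), dtype=bool)
source_mask[50:100, 50:100] = True                        # hypothetical source area

dh = dem_2014 - dem_1980              # elevation change per pixel
mean_change = dh[source_mask].mean()  # average change in the masked area
# A simple spread estimate over terrain outside the area of interest can serve
# as a rough stand-in for the +/- uncertainties quoted in the text.
sigma = dh[~source_mask].std()
print(f"mean change: {mean_change:.1f} +/- {sigma:.1f} m")
```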
Fast Demand Forecast of Electric Vehicle Charging Stations for Cell Phone Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majidpour, Mostafa; Qiu, Charlie; Chung, Ching-Yen
This paper describes the core cellphone application algorithm implemented for the prediction of energy consumption at Electric Vehicle (EV) Charging Stations at UCLA. For this interactive user application, the total time for accessing the database, processing the data and making the prediction needs to be within a few seconds. We analyze four relatively fast machine-learning-based time series prediction algorithms for our prediction engine: Historical Average, k-Nearest Neighbor, Weighted k-Nearest Neighbor, and Lazy Learning. The Nearest Neighbor algorithm (k-Nearest Neighbor with k=1) shows better performance and was selected as the prediction algorithm implemented for the cellphone application. Two applications have been designed on top of the prediction algorithm: one predicts the expected available energy at the station and the other predicts the expected charging finishing time. The total time, including accessing the database, data processing, and prediction, is about one second for both applications.
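A minimal sketch of the selected 1-nearest-neighbor time-series prediction follows (Python with NumPy; the window length, horizon and data are illustrative assumptions): find the historical window closest to the recent consumption pattern and use what followed it as the forecast.

```python
# 1-NN time-series forecast: locate the historical window most similar to the
# recent readings and return the samples that followed it.
import numpy as np

def nn_forecast(history: np.ndarray, recent: np.ndarray, horizon: int) -> np.ndarray:
    """Predict `horizon` future samples from `history` given the `recent` window."""
    w = len(recent)
    best_start, best_dist = 0, np.inf
    # Slide a window over the history, leaving room for the forecast horizon.
    for s in range(len(history) - w - horizon + 1):
        d = np.linalg.norm(history[s:s + w] - recent)
        if d < best_dist:
            best_start, best_dist = s, d
    # The samples that followed the closest match are the prediction.
    return history[best_start + w : best_start + w + horizon]

# Hypothetical usage: hourly kWh readings.
rng = np.random.default_rng(0)
history = np.abs(rng.normal(5, 2, size=24 * 90))    # 90 days of hourly data
recent = history[-6:]                               # the last six hours
print(nn_forecast(history[:-6], recent, horizon=3)) # next three hours
```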
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.
2016-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.
Documentary evidence of past floods in Europe and their utility in flood frequency estimation
NASA Astrophysics Data System (ADS)
Kjeldsen, T. R.; Macdonald, N.; Lang, M.; Mediero, L.; Albuquerque, T.; Bogdanowicz, E.; Brázdil, R.; Castellarin, A.; David, V.; Fleig, A.; Gül, G. O.; Kriauciuniene, J.; Kohnová, S.; Merz, B.; Nicholson, O.; Roald, L. A.; Salinas, J. L.; Sarauskiene, D.; Šraj, M.; Strupczewski, W.; Szolgay, J.; Toumazis, A.; Vanneuville, W.; Veijalainen, N.; Wilson, D.
2014-09-01
This review outlines the use of documentary evidence of historical flood events in contemporary flood frequency estimation in European countries. The study shows that despite widespread consensus in the scientific literature on the utility of documentary evidence, the actual migration from academic to practical application has been limited. A detailed review of flood frequency estimation guidelines from different countries showed that the value of historical data is generally recognised, but practical methods for systematic and routine inclusion of this type of data into risk analysis are in most cases not available. Studies of historical events were identified in most countries, and good examples of national databases attempting to collate the available information were identified. The conclusion is that there is considerable potential for improving the reliability of the current flood risk assessments by harvesting the valuable information on past extreme events contained in the historical data sets.
NOAA Data Rescue of Key Solar Databases and Digitization of Historical Solar Images
NASA Astrophysics Data System (ADS)
Coffey, H. E.
2006-08-01
Over a number of years, the staff at the NOAA National Geophysical Data Center (NGDC) have worked to rescue key solar databases by converting them to digital format and making them available via the World Wide Web. NOAA has had several data rescue programs in which staff compete for funds to rescue important historical data that are languishing in archives and at risk of being lost due to deteriorating condition, loss of the metadata or descriptive text that describe the databases, or lack of interest or funding to maintain them. The Solar-Terrestrial Physics Division at NGDC obtained funds to key in some critical historical tabular databases. Recently, the NOAA Climate Database Modernization Program (CDMP) funded a project to digitize historical solar images, producing a large online database of historical daily full-disk solar images. The images include Calcium K, Hydrogen Alpha, and white-light photographs, as well as sunspot drawings and comprehensive drawings of a multitude of solar phenomena on one daily map (Fraunhofer maps and Wendelstein drawings). Included in the digitization are high-resolution solar H-alpha images taken at the Boulder Solar Observatory 1967-1984. The scanned daily images document many phases of solar activity, from decadal variation to rotational variation to daily changes. Smaller versions are available online; larger versions are available on request. See http://www.ngdc.noaa.gov/stp/SOLAR/ftpsolarimages.html. The tabular listings and solar imagery will be discussed.
Timothy G.F. Kittel; Nan. A. Rosenbloom; J.A. Royle; C. Daly; W.P. Gibson; H.H. Fisher; P. Thornton; D.N. Yates; S. Aulenbach; C. Kaufman; R. McKeown; Dominque Bachelet; David S. Schimel
2004-01-01
Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the...
An Earthquake Information Service with Free and Open Source Tools
NASA Astrophysics Data System (ADS)
Schroeder, M.; Stender, V.; Jüngling, S.
2015-12-01
At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research, however, is to reduce the geological risks that emanate from such natural hazards. To meet these objectives and to provide a quick overview of the seismicity of a particular region that can be compared with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. This web service integrates historical and current earthquake information from the USGS earthquake database, and further historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is temporal filtering via a time-shift tool: users can interactively vary the visualization by moving a time slider. Furthermore, the application was built with current JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.
Visualization of historical data for the ATLAS detector controls - DDV
NASA Astrophysics Data System (ADS)
Maciejewski, J.; Schlenker, S.
2017-10-01
The ATLAS experiment is one of four detectors located on the Large Hadron Collider (LHC) at CERN. Its detector control system (DCS) stores the slow control data acquired by the back-end of distributed WinCC OA applications in an Oracle relational database, which enables the data to be retrieved for future analysis, debugging and detector development. The ATLAS DCS Data Viewer (DDV) is a client-server application providing access to the historical data from outside the experiment network. The server builds optimized SQL queries, retrieves the data from the database and serves it to the clients via HTTP connections; it also implements protection methods to prevent malicious use of the database. The client is an AJAX-type web application based on Vaadin (a framework built around the Google Web Toolkit, GWT) which gives users the possibility to access the data with ease. The DCS metadata can be selected using column-tree navigation or a search engine supporting regular expressions. The data is visualized by a selection of output modules such as JavaScript value-over-time plots or a lazy-loading table widget. Additional plugins give users the possibility to retrieve the data in ROOT format or as an ASCII file. Control system alarms can also be visualized in a dedicated table if necessary. Python mock-up scripts can be generated by the client, allowing the user to query the pythonic DDV server directly, so that the scripts can be embedded into more complex analysis programs. Users are also able to store searches and output configurations as XML on the server, to share with others via URL or to embed in HTML.
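In the spirit of the generated mock-up scripts, a hypothetical query script might look as follows; the server URL, endpoint path and parameter names are placeholders, not the real DDV API.

```python
# Hypothetical DDV-style query over HTTP. Every URL, path and parameter below
# is a placeholder for illustration; consult the actual DDV documentation.
import json
import urllib.parse
import urllib.request

BASE = "https://example.cern.ch/ddv"  # placeholder server address

params = urllib.parse.urlencode({
    "element": "ATLAS.DCS.SOME.DATAPOINT",  # hypothetical datapoint name
    "from": "2017-01-01T00:00:00",
    "to":   "2017-01-02T00:00:00",
})
with urllib.request.urlopen(f"{BASE}/data?{params}") as resp:
    series = json.load(resp)  # assumed shape: [[timestamp, value], ...]

for ts, value in series[:5]:
    print(ts, value)
```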
The market value of cultural heritage in urban areas: an application of spatial hedonic pricing
NASA Astrophysics Data System (ADS)
Lazrak, Faroek; Nijkamp, Peter; Rietveld, Piet; Rouwendal, Jan
2014-01-01
The current literature often values intangible goods like cultural heritage by applying stated preference methods. In recent years, however, the increasing availability of large databases on real estate transactions and listed prices has opened up new research possibilities and has reduced various existing barriers to applications of conventional (spatial) hedonic analysis to the real estate market. The present paper provides one of the first applications using a spatial autoregressive model to investigate the impact of cultural heritage—in particular, listed buildings and historic-cultural sites (or historic landmarks)—on the value of real estate in cities. In addition, this paper suggests a novel way of specifying the spatial weight matrix—only the prices of previously sold houses influence the current price—in identifying the spatial dependency effects between sold properties. The empirical application in the present study concerns the Dutch urban area of Zaanstad, a historic area for which detailed information on individual dwellings and their market prices is available in a GIS context over a long period of more than 20 years. In this paper, the effect of cultural heritage is analysed in three complementary ways. First, we measure the effect of listed status on a building's own market price. Secondly, we investigate the value that listed heritage has on nearby property. And finally, we estimate the effect of historic-cultural sites on real estate prices. We find that buyers are willing to pay an additional 26.9 % to purchase a listed building, while surrounding houses are worth an extra 0.28 % for each additional listed building within a 50-m radius. Houses sold within a conservation area appear to gain a premium of 26.4 %, which confirms the existence of a 'historic ensemble' effect.
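The weight-matrix idea can be sketched as follows (Python with NumPy; the data and radius are illustrative): when transactions are sorted by sale date, a weight is assigned only to earlier sales within a given radius, making the matrix lower-triangular.

```python
# Build a row-standardized spatial weight matrix in which a transaction is
# only influenced by houses sold *before* it, as the abstract describes.
import numpy as np

def temporal_spatial_weights(coords, sale_dates, radius):
    """w[i, j] > 0 only if sale j predates sale i and lies within radius."""
    order = np.argsort(sale_dates)          # sort transactions by sale date
    pts = np.asarray(coords, dtype=float)[order]
    n = len(pts)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i):                  # only earlier sales qualify
            if np.linalg.norm(pts[i] - pts[j]) <= radius:
                W[i, j] = 1.0
        s = W[i].sum()
        if s > 0:
            W[i] /= s                       # row-standardize
    return W, order

# Hypothetical usage: three sales with out-of-order dates.
coords = [(0.0, 0.0), (0.1, 0.0), (1.0, 1.0)]
dates = [3, 1, 2]
W, order = temporal_spatial_weights(coords, dates, radius=0.5)
print(W)
```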
Precise photorealistic visualization for restoration of historic buildings based on tacheometry data
NASA Astrophysics Data System (ADS)
Ragia, Lemonia; Sarri, Froso; Mania, Katerina
2018-03-01
This paper puts forward a 3D reconstruction methodology applied to the restoration of historic buildings, taking advantage of the speed, range and accuracy of a total geodetic station. The measurements, representing geo-referenced points, produced an interactive and photorealistic geometric mesh of a monument named 'Neoria', a Venetian building located by the old harbor at Chania, Crete, Greece. The integration of tacheometry acquisition and computer graphics yields a novel integrated software framework for the accurate 3D reconstruction of a historical building. The main technical challenge of this work was the production of a precise 3D mesh based on a sufficient number of tacheometry measurements acquired quickly and at low cost, employing a combination of surface reconstruction and processing methods. A fully interactive application based on game engine technologies was developed. The user can visualize and walk through the monument and the area around it, as well as view it photorealistically at different times of day and night. Advanced interactive functionalities are offered to the user for identifying restoration areas and visualizing the outcome of such works. The user can also view the coordinates of the measured points, calculate distances and navigate through the complete 3D mesh of the monument. The geographical data are stored in a database connected with the application, and features for referencing and associating the database with the monument are developed. The goal was to utilize a small number of acquired data points and present a fully interactive visualization of a geo-referenced 3D model.
NASA Astrophysics Data System (ADS)
Giampa', Vincenzo; Pasqua, A. Aurora; Petrucci, Olga
2015-04-01
The paper first presents the historical archive of the Cosenza IRPI Section and the historical database that has been built from the data it contains. An application of these data to Catanzaro, the administrative center of the Calabria region (southern Italy), is then presented. The gathering of historical data on past floods and landslides at the Cosenza IRPI Section began in 1996 and is still in progress. In 2005, donations from regional and municipal Public Works offices greatly increased the documental corpus and required more systematic classification and management, which led us to organize the documents into a true historical archive. Documents were sorted according to the municipalities they concerned, so that for each of the 409 municipalities of Calabria a set of documents, maps and images was available. The collected documents mainly concern damage caused by the occurrence, since the 19th century, of phenomena such as floods, flash floods and landslides triggered by extreme meteorological events, as well as damage caused by strong earthquakes. At the beginning of 2014, the central office of IRPI (Perugia) funded a project aimed at the digitization of the archive and its subsequent publication on a web platform. In this paper, the procedure adopted to build the archive and implement the database is described. The analysis of the historical data series for Catanzaro, a town frequently damaged by rainfall-induced landslides and floods, is also presented. Based on the documents from the archive of the Ministry of Public Works stored in our historical archive, the costs of damage to the town's houses during the 20th century have been assessed. The research identified the types of the most damaging phenomena, the municipal sectors most frequently damaged, and the evolution of the damaged areas over the years with increasing urbanization.
Chirila: Contemporary and Historical Resources for the Indigenous Languages of Australia
ERIC Educational Resources Information Center
Bowern, Claire
2016-01-01
Here I present the background to, and a description of, a newly developed database of historical and contemporary lexical data for Australian languages (Chirila), concentrating on the Pama-Nyungan family (the largest family in the country). While the database was initially developed in order to facilitate research on cognate words and…
NASA Astrophysics Data System (ADS)
Bono, Andrea
2007-01-01
The recovery and preservation of the patrimony of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest, besides being purely historical, must also be scientific: the availability of a large amount of parametric information on the seismic activity of a given area is an unquestionable help to seismological research. This article presents the new database project of the Sismos group of the National Institute of Geophysics and Volcanology of Rome. The structure of the new scheme distils the experience matured over five years of activity, and we consider it useful for those approaching computer-based "recover and reprocess" facilities. In past years, several attempts concerning Italian seismicity have followed one another, but these were almost never true databases. Some succeeded because they were well conceived and organized; others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the managed information. For example, it will be possible to view the hypocentral information for a given historical earthquake, and to search for seismograms in raster, digital or digitized format, the arrival times of the phases at the various stations, the instrumental characteristics, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of the organizational principles of this work are similar to those behind the databases used for real-time monitoring of seismicity in the principal international research centers: a modern design logic is introduced into a distinctly historical context. The various design phases are then described, from the conceptual level to the physical implementation of the scheme, highlighting at each step the principal instructions, rules and technical-scientific considerations that lead to the final result: a state-of-the-art relational scheme for historical data.
NASA Astrophysics Data System (ADS)
Daniell, James; Skapski, Jens-Udo; Vervaeck, Armand; Wenzel, Friedemann; Schaefer, Andreas
2015-04-01
Over the past 12 years, an in-depth database has been constructed for socio-economic losses from earthquakes and volcanoes. The effects of earthquakes and volcanic eruptions have been documented in many databases; however, many errors and incorrect details are often encountered. To combat this, the database was formed with socioeconomic checks against GDP, capital stock, population and other elements, as well as upper and lower bounds for each available event loss. The definition of economic losses within the CATDAT Damaging Earthquakes Database (Daniell et al., 2011a), as of v6.1, has been redefined to provide three options of natural disaster loss pricing (reconstruction cost, replacement cost and actual loss) in order to better define the impact of historical disasters. For volcanoes, as for earthquakes, a reassessment has been undertaken of the historical net and gross capital stock and GDP at the time of each event, including the depreciated stock, in order to calculate the actual loss. A normalisation has then been undertaken using updated population, GDP and capital stock. The difference between depreciated and gross capital can thus be removed from the historical loss estimates, which were all calculated without taking depreciation of the building stock into account. The combination of 1900-2014 time series of net and gross capital stock, GDP and direct economic loss data with detailed studies of infrastructure age and existing damage surveys has allowed the first estimate of this nature. The death tolls from earthquakes from 1900-2014 are presented in various forms, showing around 2.32 million deaths due to earthquakes (with a range of 2.18 to 2.63 million), around 59% of them due to masonry buildings and 28% due to secondary effects. For the volcanic eruption database, around 98,000 deaths, with a range of roughly 83,000 to 107,000, are seen from 1900-2014. The application of VSL life costing to death and injury tolls from historic events is discussed. The CATDAT socioeconomic databases of parameters such as disaggregated population, GDP, capital stock, building typologies, food security and inter-country export interactions are used to create a current exposure view of the world. The potential for losses globally is discussed through a re-creation of each damaging event since 1900, with well in excess of 10 trillion USD in normalised losses over the 115 years of events. Potential worst-case volcano and earthquake events around the globe are discussed in terms of their potential for damage and huge economic loss today and over the next century, using SSP projections adjusted on a country basis including inter-country effects.
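As a rough illustration of the normalisation step described above (a minimal sketch under stated assumptions, not the CATDAT implementation), a historical loss can be scaled by population and real per-capita wealth growth, with an optional factor converting a gross-capital estimate to a depreciated (net) basis. All names and figures below are hypothetical placeholders.

    # Sketch of loss normalisation: scale a historical, inflation-adjusted
    # loss by population growth and real per-capita wealth growth.
    def normalise_loss(loss_hist, pop_hist, pop_now, wealthpc_hist, wealthpc_now,
                       net_to_gross=1.0):
        growth = (pop_now / pop_hist) * (wealthpc_now / wealthpc_hist)
        # net_to_gross < 1 removes the depreciation component when the
        # historical estimate was quoted on a gross-capital basis (assumption).
        return loss_hist * growth * net_to_gross

    # Placeholder values only; not figures from the paper.
    print(normalise_loss(2.0e9, 5.0e6, 12.0e6, 3.0e3, 4.5e4))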
NASA Astrophysics Data System (ADS)
Paprotny, Dominik; Morales-Nápoles, Oswaldo; Jonkman, Sebastiaan N.
2018-03-01
The influence of social and economic change on the consequences of natural hazards has been a matter of much interest recently. However, there is a lack of comprehensive, high-resolution data on historical changes in land use, population, or assets available to study this topic. Here, we present the Historical Analysis of Natural Hazards in Europe (HANZE) database, which contains two parts: (1) HANZE-Exposure with maps for 37 countries and territories from 1870 to 2020 in 100 m resolution and (2) HANZE-Events, a compilation of past disasters with information on dates, locations, and losses, currently limited to floods only. The database was constructed using high-resolution maps of present land use and population, a large compilation of historical statistics, and relatively simple disaggregation techniques and rule-based land use reallocation schemes. Data encompassed in HANZE allow one to "normalize" information on losses due to natural hazards by taking into account inflation as well as changes in population, production, and wealth. This database of past events currently contains 1564 records (1870-2016) of flash, river, coastal, and compound floods. The HANZE database is freely available at https://data.4tu.nl/repository/collection:HANZE.
Cooperative organic mine avoidance path planning
NASA Astrophysics Data System (ADS)
McCubbin, Christopher B.; Piatko, Christine D.; Peterson, Adam V.; Donnald, Creighton R.; Cohen, David
2005-06-01
The JHU/APL Path Planning team has developed path planning techniques to look for paths that balance the utility and risk associated with different routes through a minefield. Extending previous years' efforts, we investigated real-world Naval mine avoidance requirements and developed a tactical decision aid (TDA) that satisfies those requirements. APL has developed new mine path planning techniques using graph-based and genetic algorithms which quickly produce near-minimum-risk paths for complicated fitness functions incorporating risk, path length, ship kinematics, and naval doctrine. The TDA user interface, a Java Swing application that obtains data via CORBA interfaces to path planning databases, allows the operator to explore a fusion of historic and in situ minefield data, control the path planner, and display the planning results. To provide a context for the minefield data, the user interface also renders data from the Digital Nautical Chart database, a database created by the National Geospatial-Intelligence Agency containing charts of the world's ports and coastal regions. This TDA has been developed in conjunction with the COMID (Cooperative Organic Mine Defense) system. This paper presents a description of the algorithms, architecture, and application produced.
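The abstract does not give the algorithms themselves, so the following is only a minimal graph-based sketch of the general idea: a Dijkstra-style search over a transit graph whose edge cost is a weighted sum of length and mine risk. The weights, the graph encoding and the cost function are illustrative assumptions, far simpler than the multi-term fitness described above.

    import heapq

    # graph: {node: [(neighbour, length, risk), ...]}
    def min_risk_path(graph, start, goal, w_risk=0.7, w_len=0.3):
        pq = [(0.0, start, [start])]          # (cost so far, node, path)
        best = {start: 0.0}
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return cost, path
            for nbr, length, risk in graph.get(node, []):
                c = cost + w_risk * risk + w_len * length
                if c < best.get(nbr, float("inf")):
                    best[nbr] = c
                    heapq.heappush(pq, (c, nbr, path + [nbr]))
        return None

    grid = {"A": [("B", 1.0, 0.1), ("C", 1.5, 0.0)],
            "B": [("D", 1.0, 0.8)], "C": [("D", 1.2, 0.1)], "D": []}
    print(min_risk_path(grid, "A", "D"))  # prefers the lower-risk route via C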
Statewide Inventories of Heritage Resources: Macris and the Experience in Massachusetts
NASA Astrophysics Data System (ADS)
Stott, P. H.
2017-08-01
The Massachusetts Historical Commission (MHC) is the State Historic Preservation Office for Massachusetts. Established in 1963, MHC has been inventorying historic properties for over half a century. Since 1987, it has maintained a heritage database, the Massachusetts Cultural Resource Information System, or MACRIS. Today MACRIS holds over 206,000 records from the 351 towns and cities across the Commonwealth. Since 2004, a selection of the more than 150 MACRIS fields has been available online at mhcmacris.net. MACRIS is widely used by independent consultants preparing project review files, by MHC staff in its regulatory responsibilities, by local historical commissions monitoring threats to their communities, and by scholars, historical organizations, genealogists, property owners, reporters, and the general public interested in the history of the built environment. In 2016 MACRIS began migrating from its three-decade-old Pick multivalue database to SQL Server, and in 2017 the first redesign of its thirteen-year-old web interface should start to improve usability. Longer-term improvements aim to standardize terminology and ultimately bring interoperability with other heritage databases closer to reality.
NASA Astrophysics Data System (ADS)
Pennington, Catherine; Dashwood, Claire; Freeborough, Katy
2014-05-01
The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus of national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain, with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture; and crowd-sourcing of information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP, and data collected for the National Landslide Database are used widely in the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.
NASA Astrophysics Data System (ADS)
Vileikis, O.; Escalante Carrillo, E.; Allayarov, S.; Feyzulayev, A.
2017-08-01
The historic cities of Uzbekistan are an irreplaceable legacy of the Silk Roads. Uzbekistan currently has four UNESCO World Heritage properties, with hundreds of historic monuments and traditional historic houses. However, the lack of documentation, systematic monitoring and a digital database of the historic buildings and dwellings within the historic centers is threatening the World Heritage properties and delaying the development of a proper management mechanism for the preservation of the heritage and of an interwoven urban development. Unlike the monuments, the traditional historic houses are being demolished without any enforced legal protection, leaving no documentation from which to understand the city's history and urban fabric, or the way of life, traditions and customs of the past centuries. To fill this gap, from 2008 to 2015 the Principal Department for Preservation and Utilization of Cultural Objects of the Ministry of Culture and Sports of Uzbekistan, with support from the UNESCO Office in Tashkent and in collaboration with several international and local universities and institutions, carried out a survey of the Historic Centre of Bukhara, Itchan Kala and Samarkand Crossroad of Cultures. The collaborative work over these years has helped to consolidate a methodology and to integrate a GIS database that is currently contributing to the understanding of the outstanding heritage values of these cities, as well as to the development of preservation and management strategies with a solid base of heritage documentation.
NASA Astrophysics Data System (ADS)
Bliefernicht, Jan; Waongo, Moussa; Annor, Thompson; Laux, Patrick; Lorenz, Manuel; Salack, Seyni; Kunstmann, Harald
2017-04-01
West Africa is a data-sparse region. High-quality, long-term precipitation data are often not readily available for applications in hydrology, agriculture, meteorology and other fields. To close this gap, we use multiple data sources to develop a precipitation database with long-term daily and monthly time series. This database was compiled from 16 archives, including global databases (e.g. the Global Historical Climatology Network, GHCN), databases from research projects (e.g. the AMMA database) and databases of the national meteorological services of some West African countries. The collection consists of more than 2000 precipitation gauges with measurements dating from 1850 to 2015. Due to erroneous measurements (e.g. temporal offsets, unit conversion errors), missing values and inconsistent metadata, the merging of this precipitation dataset is not straightforward and requires thorough quality control and harmonization. To this end, we developed geostatistical algorithms for the quality control of individual databases and their harmonization into a joint database. The algorithms are based on a pairwise comparison of the correspondence of precipitation time series as a function of the distance between stations. They were tested on precipitation time series from gauges located in a rectangular domain covering Burkina Faso, Ghana, Benin and Togo. This harmonized and quality-controlled precipitation database was recently used for several applications, such as the validation of a high-resolution regional climate model and the bias correction of precipitation projections provided by the Coordinated Regional Climate Downscaling Experiment (CORDEX). In this presentation, we will give an overview of the novel daily and monthly precipitation database and the algorithms used for quality control and harmonization. We will also compare the quality of the global and regional archives (e.g. GHCN, GSOD, AMMA database) with the precipitation databases provided by the national meteorological services.
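The paper's geostatistical algorithms are not reproduced in the abstract; the snippet below is only a toy version of the underlying pairwise idea: a station whose correlation with its close neighbours is anomalously low is flagged for inspection. Thresholds, array layouts and the use of a median are illustrative assumptions.

    import numpy as np

    # series: (n_days, n_stations) daily precipitation
    # coords: (n_stations, 2) station coordinates in km
    def flag_suspect_stations(series, coords, max_km=50.0, min_corr=0.4):
        n = series.shape[1]
        suspect = []
        for i in range(n):
            corrs = []
            for j in range(n):
                if i == j:
                    continue
                if np.linalg.norm(coords[i] - coords[j]) <= max_km:
                    corrs.append(np.corrcoef(series[:, i], series[:, j])[0, 1])
            # Nearby gauges should correlate; a low median correlation is suspect.
            if corrs and np.nanmedian(corrs) < min_corr:
                suspect.append(i)
        return suspect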
NASA Astrophysics Data System (ADS)
Das, I.; Oberai, K.; Sarathi Roy, P.
2012-07-01
Landslides manifest themselves in different mass movement processes and are considered among the most complex natural hazards occurring on the Earth's surface. Making landslide databases available online via the WWW (World Wide Web) promotes the spread of landslide information to all stakeholders. The aim of this research is to present a comprehensive database for generating landslide hazard scenarios with the help of available historical records of landslides and geo-environmental factors, and to make them available over the Web using geospatial Free & Open Source Software (FOSS). FOSS reduces the cost of the project drastically, as proprietary software is very costly. Landslide data generated for the period 1982 to 2009 were compiled along the national highway road corridor in the Indian Himalayas. All the geo-environmental datasets, along with the landslide susceptibility map, were served through a WebGIS client interface. The open source University of Minnesota (UMN) MapServer was used as the GIS server software for developing the web-enabled landslide geospatial database. A PHP/MapScript server-side application serves as the front end, and PostgreSQL with the PostGIS extension serves as the back end for the web-enabled landslide spatio-temporal databases. This dynamic virtual visualization process through a web platform brings an insight into the understanding of landslides and the resulting damage closer to the affected people and the user community. The landslide susceptibility dataset is also made available as an Open Geospatial Consortium (OGC) Web Feature Service (WFS), which can be accessed through any OGC-compliant open source or proprietary GIS software.
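By way of illustration, the data behind such a PostGIS-backed WebGIS can be queried directly; the following hedged sketch assumes a hypothetical landslide_events table with a geometry column in EPSG:4326 (the table, columns and coordinates are not from the paper).

    import psycopg2

    conn = psycopg2.connect(dbname="landslides", user="webgis")
    cur = conn.cursor()
    # Fetch events in the 1982-2009 window that fall inside a bounding box.
    cur.execute("""
        SELECT event_id, event_date, ST_AsGeoJSON(geom)
        FROM landslide_events
        WHERE event_date BETWEEN %s AND %s
          AND ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326));
    """, ("1982-01-01", "2009-12-31", 77.9, 30.1, 78.3, 30.5))
    for event_id, event_date, geojson in cur.fetchall():
        print(event_id, event_date, geojson)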
Research on Historic Bim of Built Heritage in Taiwan - a Case Study of Huangxi Academy
NASA Astrophysics Data System (ADS)
Lu, Y. C.; Shih, T. Y.; Yen, Y. N.
2018-05-01
Digital archiving technology for conserving cultural heritage is an important subject nowadays. The Taiwanese Ministry of Culture continues to align the concepts and technology of conservation with international conventions. However, the products of these different technologies are not yet integrated, due to the lack of research and development in this field, and there is currently no effective HBIM schema for Taiwanese cultural heritage. The aim of this research is to establish an HBIM schema for Chinese built heritage in Taiwan. The proposed method starts from the perspective of the components of built heritage buildings and investigates the important properties of the components through key international charters and Taiwanese cultural heritage conservation laws. An object-oriented class diagram and an ontology were then defined at the component scale to clarify the concepts and increase interoperability. A historical database was established for the historical information of the components and brought into the BIM concept in order to build a 3D model of heritage objects that can be used for visualization. An integration platform was developed that lets users browse and manipulate the database and the 3D model simultaneously. In addition, this research evaluated the feasibility of the method in a case study of the Huangxi Academy in Taiwan. The conclusion showed that the class diagram could support the establishment of the database and its application to different Chinese built heritage objects, and that the ontology helped to convey knowledge and increase interoperability. In comparison to traditional documentation methods, the querying results of the platform were more accurate and less prone to human error.
Dcs Data Viewer, an Application that Accesses ATLAS DCS Historical Data
NASA Astrophysics Data System (ADS)
Tsarouchas, C.; Schlenker, S.; Dimitrov, G.; Jahn, G.
2014-06-01
The ATLAS experiment at CERN is one of the four Large Hadron Collider experiments. The Detector Control System (DCS) of ATLAS is responsible for the supervision of the detector equipment, the reading of operational parameters, the propagation of alarms and the archiving of important operational data in a relational database (DB). DCS Data Viewer (DDV) is an application that provides access to the ATLAS DCS historical data through a web interface. Its design follows a client-server architecture. The Python-based server connects to the DB and fetches the data using optimized SQL requests. It communicates with the outside world by accepting HTTP requests, and it can also be used standalone. The client is an AJAX (Asynchronous JavaScript and XML) interactive web application developed with the Google Web Toolkit (GWT) framework. Its web interface is user friendly and platform and browser independent. The selection of metadata is done via a column-tree view or with a powerful search engine. The final visualization of the data is done using Java applets or JavaScript applications as plugins. The default output is a value-over-time chart, but other types of outputs, such as tables, ASCII or ROOT files, are supported too. Excessive access or malicious use of the database is prevented by a dedicated protection mechanism, and protection against web security attacks and authentication constraints have been taken into account, allowing the exposure of the tool to hundreds of users worldwide, including inexperienced ones. The current configuration of the client and of the outputs can be saved in an XML file. Due to its flexible interface and its generic and modular approach, DDV could easily be used for other experiment control systems.
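The DDV code itself is not shown in the abstract; as a toy stand-in for the server side of such an architecture, the sketch below accepts an HTTP request naming a DCS element and returns its value-over-time series as JSON, using SQLite in place of the actual relational archive. The file, table and parameter names are assumptions.

    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class DataHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            qs = parse_qs(urlparse(self.path).query)
            name = qs.get("element", [""])[0]
            con = sqlite3.connect("dcs_history.db")   # hypothetical archive
            rows = con.execute(
                "SELECT ts, value FROM history WHERE element = ? ORDER BY ts",
                (name,)).fetchall()
            con.close()
            body = json.dumps(rows).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # e.g. GET http://localhost:8080/?element=pixel_temperature
        HTTPServer(("", 8080), DataHandler).serve_forever()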
Database assessment of CMIP5 and hydrological models to determine flood risk areas
NASA Astrophysics Data System (ADS)
Limlahapun, Ponthip; Fukui, Hiromichi
2016-11-01
Water-related disasters may not be solved with a single scientific method. Based on this premise, we combined logical frameworks, sequential links between model results, and database applications in an attempt to analyse historical and future flooding scenarios. The three main models used in this study are (1) the fifth phase of the Coupled Model Intercomparison Project (CMIP5), to derive precipitation; (2) the Integrated Flood Analysis System (IFAS), to extract the amount of discharge; and (3) the Hydrologic Engineering Center (HEC) model, to generate inundated areas. This research notably focused on integrating data regardless of system-design complexity; database approaches are flexible, manageable, and well suited to system data transfer, which makes them suitable for monitoring a flood. The resulting flood map, together with real-time stream data, can help local communities identify areas at risk of flooding in advance.
The CATDAT damaging earthquakes database
NASA Astrophysics Data System (ADS)
Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.
2011-08-01
The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the view of the authors, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data from over 12,200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, and the trend should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.
S.I.I.A for monitoring crop evolution and anomaly detection in Andalusia by remote sensing
NASA Astrophysics Data System (ADS)
Rodriguez Perez, Antonio Jose; Louakfaoui, El Mostafa; Munoz Rastrero, Antonio; Rubio Perez, Luis Alberto; de Pablos Epalza, Carmen
2004-02-01
A new remote sensing application was developed and incorporated into the Agrarian Integrated Information System (S.I.I.A.), a project that integrates the regional farming databases from a geographical point of view, adding new value and uses to the original information. The project is supported by the Studies and Statistical Service of the Regional Government Ministry of Agriculture and Fisheries (CAP). The process integrates NDVI values from daily NOAA-AVHRR and monthly IRS-WIFS images with crop class location maps. Local agrarian information and meteorological information are being included in the working process to produce a synergistic effect. An updated crop-growing evaluation state is obtained by 10-day periods, crop class, sensor type (including data fusion) and administrative geographical borders. The crop database for the last ten years (1992-2002) has been organized according to these variables. The crop class database can be accessed through an application that helps users with crop statistical analysis. Multi-temporal and multi-geographical comparative analyses can be performed by the user, not only for a single year but also from a historical point of view. Moreover, real-time crop anomalies can be detected and analyzed. Most of the output products will be available on the Internet in the near future through an online application.
NASA Astrophysics Data System (ADS)
Cuttler, R. T. H.; Tonner, T. W. W.; Al-Naimi, F. A.; Dingwall, L. M.; Al-Hemaidi, N.
2013-07-01
The development of the Qatar National Historic Environment Record (QNHER) by the Qatar Museums Authority and the University of Birmingham in 2008 was based on a customised, bilingual Access database and ArcGIS. While both platforms are stable and well supported, neither was designed for the documentation and retrieval of cultural heritage data. As a result, it was decided to develop a custom application using open source code. The core module of this application is now complete and is oriented towards the storage and retrieval of geospatial heritage data for the curation of heritage assets. Based on MIDAS Heritage data standards and regionally relevant thesauri, it is a truly bilingual system. Significant attention has been paid to the user interface, which is user-friendly and intuitive. Built on a suite of web services and accessed through a web browser, the system makes full use of internet resources such as Google Maps and Bing Maps. The application avoids long-term vendor "tie-ins" and, as a fully integrated data management system, is now an important tool for both cultural resource managers and heritage researchers in Qatar.
Use of national clinical databases for informing and for evaluating health care policies.
Black, Nick; Tan, Stefanie
2013-02-01
Policy-makers and analysts could make use of national clinical databases either to inform or to evaluate meso-level (organisation and delivery of health care) and macro-level (national) policies. Reviewing the use of 15 of the best established databases in England, we identify and describe four published examples of each use. These show that policy-makers can either make use of the data itself or of research based on the database. For evaluating policies, the major advantages are the huge sample sizes available, the generalisability of the data, its immediate availability and historic information. The principal methodological challenges involve the need for risk adjustment and time-series analysis. Given their usefulness in the policy arena, there are several reasons why national clinical databases have not been used more, some due to a lack of 'push' by their custodians and some to the lack of 'pull' by policy-makers. Greater exploitation of these valuable resources would be facilitated by policy-makers' and custodians' increased awareness, minimisation of legal restrictions on data use, improvements in the quality of databases and a library of examples of applications to policy. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.
2016-12-01
There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated against flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, both traditional physically based flood prediction models and data-driven techniques, such as machine learning, require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7 and 8 imagery in Google Earth Engine to map discrete flood events. The resulting database will be available in the Earth Engine Catalog for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
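As a hedged sketch of this kind of water mapping (not the authors' algorithm), an NDWI threshold can be applied to a Landsat 8 composite for one flood window in the Earth Engine Python API; the collection ID, band names, threshold and region below are assumptions to verify against current Earth Engine documentation.

    import ee
    ee.Initialize()

    region = ee.Geometry.Rectangle([-16.0, 14.0, -15.0, 15.0])  # illustrative AOI
    composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                 .filterBounds(region)
                 .filterDate("2015-08-01", "2015-09-15")
                 .median())
    # NDWI = (green - NIR) / (green + NIR); positive values suggest open water.
    ndwi = composite.normalizedDifference(["SR_B3", "SR_B5"])
    water = ndwi.gt(0.1).selfMask().rename("water")
    print(water.getInfo()["bands"])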
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.; Locarnini, R. A.; Mishonov, A. V.; Hall, N.; Ouellet, M.
2016-02-01
The World Ocean Database (WOD) contains systematically quality-controlled historical and recent ocean profile data (temperature, salinity, oxygen, nutrients, carbon cycle variables, biological variables) ranging from Captain Cook's second voyage (1773) to this year's Argo floats. The US National Centers for Environmental Information (NCEI) also hosts the Global Temperature and Salinity Profile Program (GTSPP) Continuously Managed Database (CMD), which provides quality-controlled near-real-time ocean profile data and higher-level quality-controlled temperature and salinity profiles from 1990 to the present. Both databases are used extensively for ocean and climate studies. Synchronization of these two databases will allow easier access to and use of comprehensive regional and global ocean profile data sets for ocean and climate studies. Synchronizing consists of two distinct phases: (1) a retrospective comparison of data in WOD and GTSPP to ensure that the most comprehensive and highest-quality data set is available to researchers without the need to individually combine and contrast the two datasets, and (2) web services to allow the constantly accruing near-real-time data in the GTSPP CMD and the continuous addition and quality control of historical data in WOD to be made available to researchers together, seamlessly.
Auchincloss, Amy H; Moore, Kari A B; Moore, Latetia V; Diez Roux, Ana V
2012-11-01
Access to healthy foods has received increasing attention due to the growing prevalence of obesity and diet-related health conditions, yet there are major obstacles to characterizing the local food environment. This study developed a method to retrospectively characterize supermarkets for a single historic year, 2005, in 19 counties in 6 states in the USA, using a supermarket chain-name list and two business databases. Data preparation, merging, overlaps, the added value of the various approaches, and differences by census-tract socio-demographic characteristics are described. Agreement between the two food store databases was modest: 63%. Only 55% of the final list of supermarkets were identified by a single business database and selection criteria that included industry classification codes and sales revenue ≥$2 million. The added value of using a supermarket chain-name list and a second business database was the identification of an additional 14% and 30% of supermarkets, respectively. These methods are particularly useful for retrospectively characterizing access to supermarkets during a historic period, when field observations are not feasible and business databases are used. Copyright © 2012 Elsevier Ltd. All rights reserved.
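A minimal sketch of the selection logic described above, under stated assumptions (the column names, file names and the NAICS supermarket code 445110 are illustrative, not taken from the paper): pool the two business databases, deduplicate, then keep stores matching either the industry-code-plus-revenue criterion or the chain-name list.

    import pandas as pd

    db_a = pd.read_csv("business_db_a_2005.csv")   # hypothetical extracts
    db_b = pd.read_csv("business_db_b_2005.csv")
    chains = {line.strip().lower() for line in open("chain_names.txt")}

    stores = pd.concat([db_a, db_b], ignore_index=True)
    stores["name_norm"] = stores["name"].str.strip().str.lower()
    stores = stores.drop_duplicates(subset=["name_norm", "address"])

    by_code = (stores["naics"] == 445110) & (stores["sales_usd"] >= 2_000_000)
    by_chain = stores["name_norm"].isin(chains)
    supermarkets = stores[by_code | by_chain]
    print(len(supermarkets), "candidate supermarkets")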
"Mr. Database" : Jim Gray and the History of Database Technologies.
Hanwahr, Nils C
2017-12-01
Although the widespread use of the term "Big Data" is comparatively recent, it invokes a phenomenon in the development of database technology with distinct historical contexts. The database engineer Jim Gray, known as "Mr. Database" in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of transaction processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g., leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush's idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed "Fourth Paradigm Science". This article gives an overview of Gray's contributions to the development of database technology as well as his research agendas, and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.
Dangers of Noncritical Use of Historical Plague Data
Roosen, Joris
2018-01-01
Researchers have published several articles on historical plague epidemics using impressive digital databases that contain thousands of recorded outbreaks across Europe over the past several centuries. Through the digitization of preexisting data sets, scholars have unprecedented access to the historical record of plague occurrences. However, although these databases offer new research opportunities, noncritical use and reproduction of preexisting data sets can also limit our understanding of how infectious diseases evolved. Many scholars have performed investigations using Jean-Noël Biraben's data, which contain information on mentions of plague from various kinds of sources, many of which were not cited. When scholars fail to apply source criticism or do not reflect on the content of the data they use, the reliability of their results becomes highly questionable. Researchers using these databases going forward need to verify the content and restrict it spatially and temporally, and historians should be encouraged to take part in compiling such data.
An Examination of Selected Software Testing Tools: 1992
1992-12-01
[Only fragments of this report's abstract survive; table-of-contents residue has been removed.] ...the test management and problem reporting tools were examined using the sample test database provided by each supplier... Metrics Manager is supported by an industry database that allows users to track the impact of new methods, organizational structures, and technologies.
NASA Astrophysics Data System (ADS)
Akristiniy, Vera A.; Dikova, Elena A.
2018-03-01
The article is devoted to one type of urban planning study: visual-landscape analysis during the integration of high-rise buildings into a historic urban environment, for the purposes of pre-design and design studies aimed at preserving the historic urban environment and realising the reconstruction potential of the area. The article forms and systematizes the stages and methods of conducting visual-landscape analysis, taking into account the influence of high-rise buildings on objects of cultural heritage and on the valuable historic buildings of the city. In practice, visual-landscape analysis provides an opportunity to assess the influence of a hypothetical location of high-rise buildings on the perception of the historically developed environment, and to determine optimal building parameters. The contents of the main stages of visual-landscape analysis are set out, together with their key aspects concerning the construction of predicted visibility zones for significant, historically valuable urban objects and for hypothetically planned high-rise buildings. The resulting data support the consistent development of the planning and typological structure of the city territory and the preservation of the compositional influence of valuable fragments of the historic environment within the urban landscape. On this basis, an information database is formed to determine the permissible urban development parameters of high-rise buildings that preserve the compositional integrity of the urban area.
Nolte, Thomas; Rittinghausen, Susanne; Kellner, Rupert; Karbe, Eberhard; Kittel, Birgit; Rinke, Matthias; Deschl, Ulrich
2011-11-01
Historical data for Leydig cell tumors from untreated or vehicle treated rats from carcinogenicity studies collected in the RITA database are presented. Examples are given for analyses of these data for dependency on variables considered to be of possible influence on the spontaneous incidence of Leydig cell tumors. In the 7453 male rats available for analysis, only one case of a Leydig cell carcinoma was identified. The incidence of Leydig cell adenomas differed markedly between strains. High incidences of close to 100% have been found in F344 rats, while the mean incidence was 4.2% in Sprague-Dawley rats and 13.7% in Wistar rats. Incidences in Wistar rats were highly variable, primarily caused by different sources of animals. Mean incidences per breeder varied from 2.8 to 39.9%. Analyses for the dependency on further parameters have been performed in Wistar rats. In breeders G and I, the Leydig cell tumor incidence decreased over the observation period and with increasing mean terminal body weight. The incidence of Leydig cell tumors increased with mean age at necropsy and was higher in studies with dietary admixture compared to gavage studies. These parameters had no effect on Leydig cell tumor incidence in breeders A and B. Animals from almost all breeders had a considerably higher mean age at necropsy when bearing a Leydig cell adenoma than animals without a Leydig cell adenoma. Studies with longitudinal trimming of the testes had a higher incidence than studies with transverse trimming. The observed dependencies and breeder differences are discussed and explanations are given. Consequences for the use of historical control data are outlined. With the retrospective analyses presented here we were able to confirm the published features of Leydig cell adenomas and carcinomas. This indicates that the RITA database is a valuable tool for analyses of tumors for their biological features. Furthermore, it demonstrates that the RITA database is highly beneficial for the definition of reliable historical control data for carcinogenicity studies on a scientifically solid basis. Copyright © 2010 Elsevier GmbH. All rights reserved.
Retrieving Historical Electrorefining Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wheeler, Meagan Daniella
Pyrochemical operations began at Los Alamos National Laboratory (LANL) in 1962 (1). Electrorefining (ER) has been implemented as a routine process since the 1980s. The process data from the ER operation were recorded but had never been logged in an online database; without one, new staff members are hindered in their work by the lack of information. To address this issue, an Access database was created to collect the historical data. The years from 2000 onward were entered, and queries were created to analyze trends. These trends will aid engineering and operations staff in reaching optimal performance for the startup of the new lines.
Peter U. Kennedy; Victor B. Shelburne
2002-01-01
Geographic Information Systems (GIS) data and historical plats ranging from 1716 to 1894 in the Coastal Flatwoods Region of South Carolina were used to quantify changes on a temporal scale. Combining the historic plats and associated witness trees (trees marking the boundaries of historic plats) with an existing database of the soils and other attributes was the basis...
De Natale, Antonino; Pezzatti, Gianni Boris; Pollio, Antonino
2009-01-01
Background: Ethnobotanical studies generally describe the traditional knowledge of a territory according to a "hic et nunc" principle. The need to approach this field by also embedding historical data has been frequently acknowledged. With their long history of civilization, some regions of the Mediterranean basin seem particularly suited to a historical approach. Campania, a region of southern Italy, was selected for the implementation of a database containing present and past information on plant uses. Methods: A relational database was built on the basis of information gathered from different historical sources, including diaries, travel accounts, and treatises on medicinal plants, written by explorers, botanists and physicians who travelled in Campania during the last three centuries. Moreover, ethnobotanical uses described in historical herbal collections and in ancient and medieval texts from the Mediterranean region were included in the database. Results: 1672 different uses, ranging from medicinal to alimentary, ceremonial and veterinary, have been recorded for the 474 species listed in the database. Information is not uniformly spread over the Campanian territory, Sannio being the most studied geographical area and Cilento the least. About 50 plants have been used continuously over the last three centuries to treat the same ailments. A comparison with the uses reported for the same species in ancient treatises shows that the derivation of present ethnomedicine from old learned medical doctrines needs case-by-case confirmation. Conclusion: The database is flexible enough to be a useful tool for researchers who need to store and compare present and previous ethnobotanical uses from Mediterranean countries. PMID:19228384
Analyzing Historical Primary Source Open Educational Resources: A Blended Pedagogical Approach
ERIC Educational Resources Information Center
Oliver, Kevin M.; Purichia, Heather R.
2018-01-01
This qualitative case study addresses the need for pedagogical approaches to working with open educational resources (OER). Drawing on a mix of historical thinking heuristics and case analysis approaches, a blended pedagogical strategy and primary source database were designed to build student understanding of historical records with transfer of…
An Extensible Information Grid for Risk Management
NASA Technical Reports Server (NTRS)
Maluf, David A.; Bell, David G.
2003-01-01
This paper describes recent work on developing an extensible information grid for risk management at NASA - a RISK INFORMATION GRID. This grid is being developed by integrating information grid technology with risk management processes for a variety of risk related applications. To date, RISK GRID applications are being developed for three main NASA processes: risk management - a closed-loop iterative process for explicit risk management, program/project management - a proactive process that includes risk management, and mishap management - a feedback loop for learning from historical risks that escaped other processes. This is enabled through an architecture involving an extensible database, structuring information with XML, schemaless mapping of XML, and secure server-mediated communication using standard protocols.
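One way to picture the "schemaless mapping of XML" mentioned above (a minimal sketch, assuming nothing about NASA's actual design) is to flatten each XML record into (record, path, value) rows, so that documents with arbitrary structure share one table.

    import sqlite3
    import xml.etree.ElementTree as ET

    def store_xml(con, record_id, xml_text):
        con.execute("CREATE TABLE IF NOT EXISTS nodes (rec TEXT, path TEXT, value TEXT)")
        def walk(elem, prefix):
            path = f"{prefix}/{elem.tag}"
            if elem.text and elem.text.strip():
                con.execute("INSERT INTO nodes VALUES (?, ?, ?)",
                            (record_id, path, elem.text.strip()))
            for child in elem:
                walk(child, path)
        walk(ET.fromstring(xml_text), "")
        con.commit()

    con = sqlite3.connect(":memory:")
    store_xml(con, "risk-001",
              "<risk><title>Valve failure</title><score>7</score></risk>")
    print(con.execute("SELECT * FROM nodes WHERE path LIKE '%/score'").fetchall())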
NASA Technical Reports Server (NTRS)
Reid, John; Egge, Robert; McAfee, Nancy
2000-01-01
This document summarizes the feedback gathered during the user-testing phase in the development of an electronic library application: the Aeronautics and Space Access Pages (ASAP). It first provides some historical background on the NASA Scientific and Technical Information (STI) program and its efforts to enhance the services it offers the aerospace community. Following a brief overview of the ASAP project, it reviews the results of an online user survey, and from the lessons learned therein, outlines direction for future development of the project.
Brimhall, Bradley B; Hall, Timothy E; Walczak, Steven
2006-01-01
A hospital laboratory relational database, developed over eight years, has demonstrated significant cost savings and a substantial financial return on investment (ROI). In addition, the database has been used to measurably improve laboratory operations and the quality of patient care.
A Database Evaluation Based on Information Needs of Academic Social Scientists.
ERIC Educational Resources Information Center
Buterbaugh, Nancy Toth
This study evaluates two databases, "Historical Abstracts" and REESWeb, to determine their effectiveness in supporting academic social science research. While many performance evaluations gather quantitative data from isolated query and response transactions, this study is a qualitative evaluation of the databases in the context of…
EPA’s RadNet data are available for viewing in a searchable database or as PDF reports. Historical and current RadNet monitoring data are used to estimate long-term trends in environmental radiation levels.
Specific character of citations in historiography (using the example of Polish history).
Kolasa, Władysław Marek
2012-03-01
The first part of the paper deals with the assessment of international databases with respect to the number of historical publications they cover (representation and relevance in comparison with a model database). The second part focuses on answering the question of whether historiography is governed by bibliometric rules similar to those of the exact sciences or whether it has its own specific character. The empirical basis for this part of the research was a database prepared ad hoc: the Citation Index of the History of Polish Media (CIHPM). Among the numerous typically historical features, the main focus was placed on linguistic localism, the specific character of publishing forms, differences in the citation of various sources (contributions and syntheses) and the specific character of authorship (the Lorenz curve and Lotka's Law). Somewhat more attention was devoted to the half-life indicator and its role in the diachronic study of a scientific field; a new indicator (HL14), depicting the distribution of citations younger than the half-life, was also introduced. Additionally, comparisons and correlations of selected parameters for the body of historical science (citations, HL14, the Hirsch index, number of publications, volume and others) were conducted.
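For readers unfamiliar with the bibliometric baseline being tested, Lotka's Law in its classical form states that the number of authors $f(n)$ making $n$ contributions to a field falls off as an inverse power of $n$:

$f(n) = \dfrac{C}{n^{\alpha}}, \qquad \alpha \approx 2,$

where $C$ is a normalising constant, so roughly a quarter as many authors publish two papers as publish one. Whether authorship and citation data in historiography follow this distribution or depart from it is precisely the kind of question the CIHPM database allows the author to test.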
Standardization of milk infrared spectra for the retroactive application of calibration models.
Bonfatti, V; Fleming, A; Koeck, A; Miglior, F
2017-03-01
The objective of this study was to standardize the infrared spectra obtained over time and across 2 milk laboratories of Canada to create a uniform historical database and allow (1) the retroactive application of calibration models for prediction of fine milk composition; and (2) the direct use of spectral information for the development of indicators of animal health and efficiency. Spectral variation across laboratories and over time was inspected by principal components analysis (PCA). Shifts in the PCA scores were detected over time, leading to the definition of different subsets of spectra having homogeneous infrared signal. To evaluate the possibility of using common equations on spectra collected by the 2 instruments and over time, we developed a standardization (STD) method. For each subset of data having homogeneous infrared signal, a total of 99 spectra corresponding to the percentiles of the distribution of the absorbance at each wavenumber were created and used to build the STD matrices. Equations predicting contents of saturated fatty acids, short-chain fatty acids, and C18:0 were created and applied on different subsets of spectra, before and after STD. After STD, bias and root mean squared error of prediction decreased by 66% and 32%, respectively. When calibration equations were applied to the historical nonstandardized database of spectra, shifts in the predictions could be observed over time for all investigated traits. Shifts in the distribution of the predictions over time corresponded to the shifts identified by the inspection of the PCA scores. After STD, shifts in the predicted fatty acid contents were greatly reduced. Standardization reduced spectral variability between instruments and over time, allowing the merging of milk spectra data from different instruments into a common database, the retroactive use of calibrations equations, or the direct use of the spectral data without restrictions. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
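The STD matrices themselves are not specified in the abstract; the following is only an illustrative sketch of a percentile-based standardisation consistent with the description above: for each wavenumber, the slave instrument's absorbance percentiles are mapped onto the master's by a per-wavenumber linear fit. The array shapes and the linear form are assumptions.

    import numpy as np

    # master, slave: (n_spectra, n_wavenumbers) absorbance matrices
    def build_std_matrices(master, slave, n_pct=99):
        q = np.linspace(1, 99, n_pct)
        pm = np.percentile(master, q, axis=0)   # (n_pct, n_wavenumbers)
        ps = np.percentile(slave, q, axis=0)
        slope = np.empty(master.shape[1])
        offset = np.empty(master.shape[1])
        for w in range(master.shape[1]):
            # Map slave percentiles onto master percentiles, one wavenumber at a time.
            slope[w], offset[w] = np.polyfit(ps[:, w], pm[:, w], 1)
        return slope, offset

    def standardise(spectra, slope, offset):
        return spectra * slope + offset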
Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda
2017-10-01
The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
Lund, Jennifer L.; Richardson, David B.; Stürmer, Til
2016-01-01
Better understanding of biases related to selective prescribing of, and adherence to, preventive treatments has led to improvements in the design and analysis of pharmacoepidemiologic studies. One influential development has been the “active comparator, new user” study design, which seeks to emulate the design of a head-to-head randomized controlled trial. In this review, we first discuss biases that may affect pharmacoepidemiologic studies and describe their direction and magnitude in a variety of settings. We then present the historical foundations of the active comparator, new user study design and explain how this design conceptually mitigates biases leading to a paradigm shift in pharmacoepidemiology. We offer practical guidance on the implementation of the study design using administrative databases. Finally, we provide an empirical example in which the active comparator, new user study design addresses biases that have previously impeded pharmacoepidemiologic studies. PMID:26954351
Creating a FIESTA (Framework for Integrated Earth Science and Technology Applications) with MagIC
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Tauxe, L.; Constable, C.
2017-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC) has recently developed a containerized web application to considerably reduce the friction in contributing, exploring and combining valuable and complex datasets for the paleo-, geo- and rock magnetic scientific community. The data produced in this scientific domain are inherently hierarchical, and the community's evolving approaches to the scientific workflow, from sampling to taking measurements to multiple levels of interpretation, require a large and flexible data model to adequately annotate the results and ensure reproducibility. Historically, contributing such detail in a consistent format has been prohibitively time consuming and often resulted in publishing only the highly derived interpretations. The new open-source (https://github.com/earthref/MagIC) application provides a flexible upload tool integrated with the data model to easily create a validated contribution, and a powerful search interface for discovering datasets and combining them to enable transformative science. MagIC is hosted at EarthRef.org along with several interdisciplinary geoscience databases. A FIESTA (Framework for Integrated Earth Science and Technology Applications) is being created by generalizing MagIC's web application for reuse in other domains. The application relies on a single configuration document that describes the routing, data model, component settings and external service integrations. The container hosts an isomorphic Meteor JavaScript application, a MongoDB database and an ElasticSearch search engine. Multiple containers can be configured as microservices to serve portions of the application, or can rely on externally hosted MongoDB, ElasticSearch, or third-party services to efficiently scale computational demands. FIESTA is particularly well suited to many Earth Science disciplines with its flexible data model, mapping, account management, upload tool with private workspaces, reference metadata, image galleries, full-text searches and detailed filters. EarthRef's Seamount Catalog of bathymetry and morphology data, EarthRef's Geochemical Earth Reference Model (GERM) databases, and Oregon State University's Marine and Geology Repository (http://osu-mgr.org) will benefit from custom adaptations of FIESTA.
The Rigid Pavement Database: Overview and Data Collection Plan
DOT National Transportation Integrated Search
1998-06-01
The rigid pavement (RP) database contains historical distress data obtained from more than 400 continuously reinforced concrete pavements(CRCP) and jointed concrete pavements (JCP) across the state of Texas. Data collection efforts began in 1974 and ...
NASA Astrophysics Data System (ADS)
Tracey, Emily; Smith, Nichola; Lawrie, Ken
2017-04-01
The principles behind, and the methods of, digital data capture can be applied across many scientific and other disciplines, as demonstrated by the use of a custom-modified version of the British Geological Survey's System for Integrated Geoscience Mapping (BGS·SIGMA) for the capture of data used in the conservation of Scottish built heritage. Historic Environment Scotland (HES), an executive agency of the Scottish Government charged with safeguarding the nation's historic environment, is directly responsible for 345 sites of national significance, most of which are built from stone. In common with many other heritage organisations, HES needs a system that can capture, store and present conservation, maintenance and condition-indicator information for single or multiple historic sites; this system would then be used to better target and plan effective programmes of maintenance and repair. To meet this need, the British Geological Survey (BGS) has worked with HES to develop an integrated digital site assessment system that provides a refined survey process for stone-built (and other) historic sites. Based on BGS·SIGMA—an integrated workflow underpinned by a geospatial platform for data capture and interpretation—the new system is built on top of ESRI's ArcGIS software and underpinned by a relational database. Users can, in the field or in the office, populate custom-built data entry forms to record maintenance issues and repair specifications for architectural elements ranging from individual blocks of stone to entire building elevations. Photographs, sketches and digital documents can be linked to architectural elements to enhance the usability of the data. Predetermined data fields and supporting dictionaries constrain the input parameters, ensuring a high degree of standardisation in the datasets and therefore enabling highly consistent data extraction and querying. The GIS presentation of the data provides a powerful and versatile planning tool for scheduling works, specifying materials, identifying the skills needed for repairs, and allocating resources more effectively and efficiently. Physical alterations and changes in the overall condition of a single site, or a group of sites, can be monitored accurately over time by repeating the original survey (e.g. every 5 years). Other datasets can be linked to the database, and other geospatially referenced datasets can be superimposed in GIS, adding considerably to the scope and utility of the system. The system can be applied to any geospatially referenced object in a wide range of situations, thus offering many potential applications in conservation, archaeology and other related fields.
Perryman, Sarah A M; Castells-Brooke, Nathalie I D; Glendining, Margaret J; Goulding, Keith W T; Hawkesford, Malcolm J; Macdonald, Andy J; Ostler, Richard J; Poulton, Paul R; Rawlings, Christopher J; Scott, Tony; Verrier, Paul J
2018-05-15
The electronic Rothamsted Archive, e-RA (www.era.rothamsted.ac.uk) provides a permanent managed database to both securely store and disseminate data from Rothamsted Research's long-term field experiments (since 1843) and meteorological stations (since 1853). Both historical and contemporary data are made available via this online database, which provides the scientific community with access to a unique continuous record of agricultural experiments and weather measured since the mid-19th century. Qualitative information, such as treatment and management practices, plans and soil information, accompanies the data and is made available on the e-RA website. e-RA was released externally to the wider scientific community in 2013 and this paper describes its development, content, curation and the access process for data users. Case studies illustrate the diverse applications of the data, including its original intended purposes and recent unforeseen applications. Usage monitoring demonstrates the data are of increasing interest. Future developments, including adopting FAIR data principles, are proposed as the resource is increasingly recognised as a unique archive of data relevant to sustainable agriculture, agroecology and the environment.
Content-based retrieval of historical Ottoman documents stored as textual images.
Saykol, Ediz; Sinop, Ali Kemal; Güdükbay, Ugur; Ulusoy, Ozgür; Cetin, A Enis
2004-03-01
There is an accelerating demand to access the visual content of documents stored in historical and cultural archives. The availability of electronic imaging tools and effective image processing techniques makes it feasible to process the multimedia data in large databases. In this paper, a framework for content-based retrieval of historical documents in the Ottoman Empire archives is presented. The documents are stored as textual images, which are compressed by constructing a library of symbols occurring in a document; the symbols in the original image are then replaced with pointers into the codebook to obtain a compressed representation of the image. Features in the wavelet and spatial domains, based on the angular and distance span of shapes, are used to extract the symbols. For content-based retrieval in historical archives, a query is specified as a rectangular region in an input image, and the same symbol-extraction process is applied to the query region. The queries are processed on the codebook of documents, and the query images are identified in the resulting documents using the pointers in the textual images. The querying process does not require decompression of the images. The new content-based retrieval framework is also applicable to many other document archives using different scripts.
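A minimal sketch of the codebook idea follows. The data structures are assumed for illustration: real symbol features would be wavelet- and shape-based descriptors with approximate matching, not the exact-match tuples used here.

```python
# Codebook-style compression of a "textual image": store each distinct symbol
# once and replace occurrences with (codebook index, position) pointers.
def compress(symbols):
    """symbols: iterable of (feature_tuple, (x, y)) pairs, one per glyph."""
    codebook, pointers, index = [], [], {}
    for sym, pos in symbols:
        if sym not in index:
            index[sym] = len(codebook)
            codebook.append(sym)
        pointers.append((index[sym], pos))
    return codebook, pointers

def query(codebook, pointers, query_sym):
    """Content-based retrieval on the compressed form: no decompression needed."""
    hits = {i for i, sym in enumerate(codebook) if sym == query_sym}
    return [pos for code, pos in pointers if code in hits]

codebook, pointers = compress([(("a",), (0, 0)), (("b",), (5, 0)), (("a",), (10, 0))])
print(query(codebook, pointers, ("a",)))  # [(0, 0), (10, 0)]
```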
Manheim, F.T.; Buchholtz ten Brink, Marilyn R.; Mecray, E.L.
1998-01-01
A comprehensive database of sediment chemistry and environmental parameters has been compiled for Boston Harbor and Massachusetts Bay. This work illustrates methodologies for rescuing and validating sediment data from heterogeneous historical sources. It greatly expands spatial and temporal data coverage of estuarine and coastal sediments. The database contains about 3500 samples containing inorganic chemical, organic, texture and other environmental data dating from 1955 to 1994. Cooperation with local and federal agencies as well as universities was essential in locating and screening documents for the database. More than 80% of references utilized came from sources with limited distribution (gray literature). Task sharing was facilitated by a comprehensive and clearly defined data dictionary for sediments. It also served as a data entry template and flat file format for data processing and as a basis for interpretation and graphical illustration. Standard QA/QC protocols are usually inapplicable to historical sediment data. In this work outliers and data quality problems were identified by batch screening techniques that also provide visualizations of data relationships and geochemical affinities. No data were excluded, but qualifying comments warn users of problem data. For Boston Harbor, the proportion of irreparable or seriously questioned data was remarkably small (<5%), although concentration values for metals and organic contaminants spanned 3 orders of magnitude for many elements or compounds. Data from the historical database provide alternatives to dated cores for measuring changes in surficial sediment contamination level with time. The data indicate that spatial inhomogeneity in harbor environments can be large with respect to sediment-hosted contaminants. Boston Inner Harbor surficial sediments showed decreases in concentrations of Cu, Hg, and Zn of 40 to 60% over a 17-year period.
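The abstract does not detail the batch-screening method; as one plausible illustration of screening concentrations that span several orders of magnitude, a robust log-scale outlier flag of the kind often used for trace-metal data:

```python
import numpy as np

def flag_outliers(values, z=3.5):
    """Robust screen on log10 concentrations using the modified z-score
    (median/MAD). Assumed approach for illustration; not the paper's method."""
    x = np.log10(np.asarray(values, dtype=float))
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    if mad == 0:                       # degenerate case: all values identical
        return np.zeros(len(x), dtype=bool)
    score = 0.6745 * (x - med) / mad
    return np.abs(score) > z

print(flag_outliers([12.0, 15.0, 11.0, 14.0, 9000.0]))  # last value flagged
```

As in the compilation itself, flagged values would be annotated with qualifying comments rather than excluded.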
Downscaling climate information for local disease mapping.
Bernardi, M; Gommes, R; Grieser, J
2006-06-01
The study of the impacts of climate on human health requires the interdisciplinary efforts of health professionals, climatologists, biologists, and social scientists to analyze the relationships among physical, biological, ecological, and social systems. Disease dynamics respond to variations in regional and local climate; because climate variability affects every region of the world, diseases are not necessarily limited to specific regions, and vectors may become endemic in new regions. Local climate data are thus essential for evaluating the dynamics of vector-borne disease through health-climate models, yet existing climatological databases are often inadequate. Climate data at high spatial resolution can be derived by statistical downscaling using historical observations, but the method is limited by the availability of historical data at the local level. Since the 1990s, the statistical interpolation of climate data has been an important priority of the Agrometeorology Group of the Food and Agriculture Organization of the United Nations (FAO), as such data are required for agricultural planning and operational activities at the local level. Since 1995, when FAO released its first spatial interpolation software for climate data, more advanced applications have been developed, such as SEDI (Satellite Enhanced Data Interpolation) for the downscaling of climate data, and LOCCLIM (Local Climate Estimator) and NEW_LOCCLIM, developed in collaboration with the Deutscher Wetterdienst (German Weather Service), to estimate climatic conditions at locations for which no observations are available. In parallel, an important effort has been made to improve the FAO climate database, which at present includes more than 30,000 stations worldwide, expanding coverage from developing countries to the entire globe.
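A common baseline for this kind of local estimation is inverse-distance weighting of nearby station observations. The sketch below is a generic illustration, not FAO's actual SEDI/LOCCLIM algorithms, which also draw on satellite data and other predictors:

```python
import numpy as np

def idw_estimate(stations, values, point, power=2.0):
    """Inverse-distance-weighted estimate at `point` from station observations.

    stations: (n, 2) array of (x, y) coordinates; values: (n,) observed values.
    """
    stations = np.asarray(stations, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(stations - np.asarray(point, dtype=float), axis=1)
    if np.any(d == 0):                 # target coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

# e.g. monthly mean temperature at an ungauged location (hypothetical numbers)
print(idw_estimate([(0, 0), (10, 0), (0, 10)], [21.5, 23.0, 20.0], (2, 3)))
```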
Sovereign immunity: Principles and application in medical malpractice.
Suk, Michael
2012-05-01
Tort law seeks accountability when parties engage in negligent conduct, and aims to compensate the victims of such conduct. An exception to this general rule governing medical negligence is the doctrine of sovereign immunity. Historically, individuals acting under the authority of the government or other sovereign entity had almost complete protection against tort liability. This article addressed the following: (1) the development of sovereign immunity in law, (2) the lasting impact of the Federal Tort Claims Act on sovereign immunity, and (3) the contemporary application of sovereign immunity to medical malpractice, using case examples from Virginia and Florida. I performed an Internet search to identify sources that addressed the concept of sovereign immunity, followed by a focused search for relevant articles in PubMed and LexisNexis, literature databases for medical and legal professionals, respectively. Historically, the doctrine conferred absolute immunity from lawsuits in favor of the sovereign (i.e., the government). Practical considerations in our democratic system have contributed to an evolution of this doctrine. Understanding sovereign immunity and its contemporary application is of value for any physician interested in the debate concerning medical malpractice in the United States. Under certain circumstances, physicians working as employees of the federal or state government may be protected against individual liability if the government is substituted as the defendant.
NASA Astrophysics Data System (ADS)
Daniell, James; Wenzel, Friedemann
2014-05-01
Over the past decade, the production of the economic indices behind the CATDAT Damaging Earthquakes Database has allowed historical earthquake economic losses and costs to be converted into today's terms using long-term spatio-temporal series of consumer price index (CPI), construction costs, wage indices, and GDP from 1900-2013. As part of the doctoral thesis of Daniell (2014), databases and GIS layers at country and sub-country level have been produced for population, GDP per capita, and net and gross capital stock (depreciated and non-depreciated) using studies, census information and the perpetual inventory method. In addition, a detailed study has been undertaken to collect and reproduce as many historical isoseismal maps, macroseismic intensity results and reproductions of earthquakes as possible out of the 7208 damaging events in the CATDAT database from 1900 onwards. a) The isoseismal database and population bounds from 3000+ collected damaging events were compared with the output parameters of GDP and net and gross capital stock per intensity bound and administrative unit, creating a spatial join for analysis. b) The historical costs were divided into shaking/direct ground motion costs and secondary-effects costs. The shaking costs were further divided into gross-capital-stock-related and GDP-related costs for each administrative unit and intensity bound couplet. c) Costs were then estimated by regressing and optimising functions of costs vs. gross capital stock and costs vs. GDP. Losses were estimated based on net capital stock, looking at the infrastructure age and value at the time of the event. This dataset was then used to develop an economic exposure for each historical earthquake, in comparison with the loss recorded in the CATDAT Damaging Earthquakes Database. The production of economic fragility functions for each country was possible using a temporal regression based on the parameters of macroseismic intensity, capital stock estimate, GDP estimate, year and a combined seismic building index (a created combination of the global seismic code index, building practice factor, building age and infrastructure vulnerability). The analysis provided three key results: a) The economic fragility functions produced from the 1900-2008 events showed very good correlation with the economic losses and costs from earthquakes from 2009-2013, in real time. This methodology has been extended to other natural disaster types (typhoon, flood, drought). b) The reanalysis of historical earthquake events checked associated historical losses and costs against the expected exposure in terms of intensities. The 1939 Chillan, 1948 Turkmenistan, 1950 Iran, 1972 Managua, 1980 Western Nepal and 1992 Erzincan earthquake events were huge outliers compared with the modelled capital stock and GDP, and thus additional studies were undertaken to check the original loss results. c) A worldwide GIS layer database of capital stock (gross and net), GDP, infrastructure age and economic indices over the period 1900-2013 has been created in conjunction with the CATDAT database in order to define correct economic losses and costs.
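The index-based conversion at the heart of such databases is straightforward; a minimal sketch with illustrative index values (CATDAT combines CPI with construction-cost, wage and GDP series per country, which this one-line version omits):

```python
# Convert a historical loss into today's terms with a price index.
def to_todays_terms(loss_then, cpi_then, cpi_today):
    return loss_then * cpi_today / cpi_then

# e.g. a 1950 loss of $10 million, with hypothetical CPI values of
# 24.1 (1950) and 296.8 (today):
print(to_todays_terms(10e6, 24.1, 296.8))   # ~1.23e8, i.e. about $123 million
```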
PropertyQuest provides easy access to a range of site-related information, especially for historic resources, drawing from databases provided by other DC agencies, including the Office of Planning (historic resources, census information, and boundaries of Chinatown). Information is presented here for planning purposes only.
Information categorization approach to literary authorship disputes
NASA Astrophysics Data System (ADS)
Yang, Albert C.-C.; Peng, C.-K.; Yien, H.-W.; Goldberger, Ary L.
2003-11-01
Scientific analysis of the linguistic styles of different authors has generated considerable interest. We present a generic approach to measuring the similarity of two symbolic sequences that requires minimal background knowledge about a given human language. Our analysis is based on word rank order-frequency statistics and phylogenetic tree construction. We demonstrate the applicability of this method to historic authorship questions related to the classic Chinese novel “The Dream of the Red Chamber,” to the plays of William Shakespeare, and to the Federalist papers. This method may also provide a simple approach to other large databases based on their information content.
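A minimal sketch of a rank-order-based similarity measure in the spirit of the paper follows; this is a generic implementation for illustration, not the authors' exact metric.

```python
from collections import Counter

def rank_distance(text_a, text_b):
    """Average difference in frequency rank of words shared by two texts;
    smaller values suggest more similar word-usage profiles."""
    def ranks(text):
        freqs = Counter(text.lower().split())
        return {w: r for r, (w, _) in enumerate(freqs.most_common())}
    ra, rb = ranks(text_a), ranks(text_b)
    shared = ra.keys() & rb.keys()
    if not shared:
        return float("inf")
    return sum(abs(ra[w] - rb[w]) for w in shared) / len(shared)

print(rank_distance("the cat sat on the mat", "the dog sat on the log"))
```

Pairwise distances of this kind can then feed a phylogenetic-tree construction (e.g. neighbour joining) to group texts by putative author.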
Ecology of Alpine Macrofungi - Combining Historical with Recent Data
Brunner, Ivano; Frey, Beat; Hartmann, Martin; Zimmermann, Stephan; Graf, Frank; Suz, Laura M.; Niskanen, Tuula; Bidartondo, Martin I.; Senn-Irlet, Beatrice
2017-01-01
Historical datasets of living communities are important because they can be used to document creeping shifts in species compositions. Such a historical dataset exists for alpine fungi. From 1941 to 1953, the Swiss geologist Jules Favre visited the region of the Swiss National Park every year and recorded the fruiting bodies of fungi >1 mm (so-called “macrofungi”) occurring in the alpine zone. Favre can be regarded as one of the pioneers of alpine fungal ecology, not least because he noted location, elevation, geology, and associated plants during his numerous excursions. However, some relevant information is only available in his unpublished field-book. Overall, Favre listed 204 fungal species in 26 sampling sites, with 46 species being previously unknown. The analysis of his data revealed that the macrofungi recorded belong to two major ecological groups: either they are symbiotrophs and live in ectomycorrhizal associations with alpine plant hosts, or they are saprotrophs and decompose plant litter and soil organic matter. The most frequent fungi were members of Inocybe and Cortinarius, which form ectomycorrhizas with Dryas octopetala or the dwarf alpine Salix species. The scope of the present study was to combine Favre's historical dataset with more recent data, either from the “SwissFungi” database or from major studies of the French and German Alps, and with data from novel high-throughput DNA sequencing of soils from the Swiss Alps. Results of the latter application revealed that the problems associated with these new techniques are manifold and that species determination often remains unclear. At this point, the fungal taxa collected by Favre and deposited as exsiccata at the “Conservatoire et Jardin Botaniques de la Ville de Genève” could be used as a reference sequence dataset for alpine fungal studies. In conclusion, it can be postulated that new, improved databases are urgently needed, particularly with regard to investigating fungal communities from alpine regions using new techniques. PMID:29123508
Ecology of Alpine Macrofungi - Combining Historical with Recent Data.
Brunner, Ivano; Frey, Beat; Hartmann, Martin; Zimmermann, Stephan; Graf, Frank; Suz, Laura M; Niskanen, Tuula; Bidartondo, Martin I; Senn-Irlet, Beatrice
2017-01-01
Historical datasets of living communities are important because they can be used to document creeping shifts in species compositions. Such a historical dataset exists for alpine fungi. From 1941 to 1953, the Swiss geologist Jules Favre visited the region of the Swiss National Park every year and recorded the fruiting bodies of fungi >1 mm (so-called "macrofungi") occurring in the alpine zone. Favre can be regarded as one of the pioneers of alpine fungal ecology, not least because he noted location, elevation, geology, and associated plants during his numerous excursions. However, some relevant information is only available in his unpublished field-book. Overall, Favre listed 204 fungal species in 26 sampling sites, with 46 species being previously unknown. The analysis of his data revealed that the macrofungi recorded belong to two major ecological groups: either they are symbiotrophs and live in ectomycorrhizal associations with alpine plant hosts, or they are saprotrophs and decompose plant litter and soil organic matter. The most frequent fungi were members of Inocybe and Cortinarius, which form ectomycorrhizas with Dryas octopetala or the dwarf alpine Salix species. The scope of the present study was to combine Favre's historical dataset with more recent data, either from the "SwissFungi" database or from major studies of the French and German Alps, and with data from novel high-throughput DNA sequencing of soils from the Swiss Alps. Results of the latter application revealed that the problems associated with these new techniques are manifold and that species determination often remains unclear. At this point, the fungal taxa collected by Favre and deposited as exsiccata at the "Conservatoire et Jardin Botaniques de la Ville de Genève" could be used as a reference sequence dataset for alpine fungal studies. In conclusion, it can be postulated that new, improved databases are urgently needed, particularly with regard to investigating fungal communities from alpine regions using new techniques.
NASA Astrophysics Data System (ADS)
Steigies, C. T.
2015-12-01
Since the International Geophysical Year (IGY) in 1957-58, cosmic rays have been routinely measured by many ground-based Neutron Monitors (NM) around the world. The World Data Center for Cosmic Rays (WDCCR) was established as a part of this activity and provides a database of cosmic-ray neutron observations in unified formats. However, that standard data comprises only one-hour averages, whereas most NM stations were enhanced at the end of the 20th century to provide data in one-minute resolution or even better. These data were only available on the websites of the institutes operating each station, and every station invented its own data format for the high-resolution measurements. There were some efforts to collect data from several stations and make them available on FTP servers; however, none of these efforts could provide real-time data for all stations. The EU FP7 project NMDB (real-time database for high-resolution Neutron Monitor measurements, http://nmdb.eu) was funded by the European Commission, and a new database was set up by several Neutron Monitor stations in Europe and Asia to store high-resolution data and to provide access to the data in real time (i.e. with less than five minutes' delay). By storing the measurements in a database, a standard format for the high-resolution measurements is enforced. This database is complementary to the WDCCR, as it does not (yet) provide all historical data, but its creation has spurred a new collaboration between Neutron Monitor scientists worldwide: (new) stations have gone online (again), new projects are building on the results of NMDB, and new users outside of the Cosmic Ray community are starting to use NM data for new applications such as soil moisture measurements using cosmic rays. These applications are facilitated by the easy access to the data through the http://nest.nmdb.eu interface, which offers access to all NMDB data for all users.
A k-Vector Approach to Sampling, Interpolation, and Approximation
NASA Astrophysics Data System (ADS)
Mortari, Daniele; Rogers, Jonathan
2013-12-01
The k-vector search technique is a method designed to perform extremely fast range searching of large databases at computational cost independent of the size of the database. k-vector search algorithms have historically found application in satellite star-tracker navigation systems which index very large star catalogues repeatedly in the process of attitude estimation. Recently, the k-vector search algorithm has been applied to numerous other problem areas including non-uniform random variate sampling, interpolation of 1-D or 2-D tables, nonlinear function inversion, and solution of systems of nonlinear equations. This paper presents algorithms in which the k-vector search technique is used to solve each of these problems in a computationally-efficient manner. In instances where these tasks must be performed repeatedly on a static (or nearly-static) data set, the proposed k-vector-based algorithms offer an extremely fast solution technique that outperforms standard methods.
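A minimal NumPy sketch of the k-vector idea for a static 1-D dataset follows; this is a simplified illustration, and the published algorithm includes refinements not shown here.

```python
import numpy as np

def build_kvector(data):
    """Precompute the k-vector for fast range searching of a static dataset."""
    s = np.sort(np.asarray(data, dtype=float))
    n = len(s)
    eps = 1e-9 * (s[-1] - s[0])
    m = (s[-1] - s[0] + 2 * eps) / (n - 1)      # line through the two extremes
    q = s[0] - eps - m
    # k[i-1] = number of sorted elements at or below the line value at index i
    k = np.searchsorted(s, m * np.arange(1, n + 1) + q, side="right")
    return s, k, m, q

def range_search(s, k, m, q, lo, hi):
    """All elements in [lo, hi]; inverting the line replaces a binary search,
    leaving only a short trim at the interval edges."""
    n = len(s)
    i = min(max(int(np.floor((lo - q) / m)), 1), n)
    j = min(max(int(np.ceil((hi - q) / m)), 1), n)
    a, b = int(k[i - 1]), int(k[j - 1])
    while a > 0 and s[a - 1] >= lo:   # trim edges so the slice is exact
        a -= 1
    while a < n and s[a] < lo:
        a += 1
    while b < n and s[b] <= hi:
        b += 1
    while b > a and s[b - 1] > hi:
        b -= 1
    return s[a:b]

s, k, m, q = build_kvector(np.random.default_rng(1).uniform(0, 100, 10_000))
print(range_search(s, k, m, q, 42.0, 42.1))
```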
Prediction of pelvic organ prolapse using an artificial neural network.
Robinson, Christopher J; Swift, Steven; Johnson, Donna D; Almeida, Jonas S
2008-08-01
The objective of this investigation was to test the ability of a feedforward artificial neural network (ANN) to differentiate patients who have pelvic organ prolapse (POP) from those who retain good pelvic organ support. Following institutional review board approval, patients with POP (n = 87) and controls with good pelvic organ support (n = 368) were identified from the urogynecology research database. Historical and clinical information was extracted from the database. Data analysis included the training of a feedforward ANN, variable selection, and external validation of the model with an independent data set. Twenty variables were used. The median-performing ANN model used a median of 3 (quartile 1:3 to quartile 3:5) variables and achieved an area under the receiver operating characteristic curve of 0.90 (external, independent validation set). Ninety percent sensitivity and 83% specificity were obtained in the external validation by ANN classification. Feedforward ANN modeling is applicable to the identification and prediction of POP.
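A minimal sketch of this kind of workflow with scikit-learn follows; the paper's actual network architecture, variable-selection procedure, and clinical data are not reproduced, and the random data below are purely a placeholder.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# hypothetical feature matrix X (20 clinical/historical variables) and
# labels y (POP vs control) standing in for the study database
rng = np.random.default_rng(0)
X, y = rng.normal(size=(455, 20)), rng.integers(0, 2, size=455)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```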
Hedefalk, Finn; Svensson, Patrick; Harrie, Lars
2017-01-01
This paper presents datasets that enable historical longitudinal studies of micro-level geographic factors in a rural setting. These types of datasets are new, as historical demography studies have generally failed to properly include micro-level geographic factors. Our datasets describe the geography of five Swedish rural parishes, and by linking them to a longitudinal demographic database, we obtain a geocoded population (at the property-unit level) for this area for the period 1813–1914. The population is a subset of the Scanian Economic Demographic Database (SEDD). The geographic information includes the following feature types: property units, wetlands, buildings, roads and railroads. The property units and wetlands are stored in object-lifeline time representations (information about the creation, changes and end of objects is recorded over time), whereas the other feature types are stored as snapshots in time. Thus, the datasets present one of the first opportunities to study historical spatio-temporal patterns at the micro-level. PMID:28398288
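A minimal sketch of an object-lifeline record contrasted with a snapshot query follows; the field names and the example geometry are hypothetical, but the idea, each object version carrying its own valid-time interval, is the one the abstract describes.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PropertyUnit:
    """Object-lifeline representation: one row per version of the object."""
    unit_id: str
    geometry_wkt: str           # polygon of the property unit (placeholder)
    valid_from: int             # year this version came into existence
    valid_to: Optional[int]     # year it changed or ended; None = still current

# a unit created in 1813 and modified in 1875 (hypothetical example)
history = [
    PropertyUnit("Kavlinge_12", "POLYGON((...))", 1813, 1875),
    PropertyUnit("Kavlinge_12", "POLYGON((...))", 1875, None),
]

def as_of(units, year):
    """Snapshot query: the versions valid in a given year."""
    return [u for u in units
            if u.valid_from <= year and (u.valid_to is None or year < u.valid_to)]

print(len(as_of(history, 1850)))  # 1
```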
Gregoriano cadastre (1818-35) from old maps to a GIS of historical landscape data
NASA Astrophysics Data System (ADS)
Frazzica, V.; Galletti, F.; Orciani, M.; Colosi, L.; Cartaro, A.
2009-04-01
Our analysis specifically covered an area located along the "internal Marche ridge" of the Apennines, in the province of Ancona (Marche Region, Italy). The cartographic base for our historical analysis was prepared from the nineteenth-century Gregoriano Cadastre (Catasto Gregoriano) maps preserved in the State Archive of Rome, which have been reproduced in digital format, georeferenced and vectorized. With the creation of a database, it was possible to add to the maps the information gathered from the property registers concerning crop production and socioeconomic variables, in order to set up a Geographical Information System (GIS). The combination of the database with the digitized maps established a one-to-one relation between each parcel and the related historical data, yielding an information system that fully preserves the original cadastre data. It was also possible to create a three-dimensional model of the historical landscapes, which makes it possible to visualize the cultural diversification of that historical period. The integration into a Territorial Information System (SIT) of historical information from the Gregoriano Cadastre and of socio-economic analyses concerning business changes, in parallel with the study of the transformations of the territorial framework, proved to be a very important instrument for area planning, allowing specific planning approaches to be identified not only for urban settlement but also for the restoration of the variety and complexity of the agricultural landscape. The work opens further research in various directions, identifying pilot areas in which to test new management models and to simulate the impacts of management both on business profitability and on landscape configuration. A future development of the project is the upgrade and evolution of the database, followed by the acquisition of data for subsequent historical periods. This will also make it possible to improve the three-dimensional model (rendering) of the landscape described in the Gregoriano Cadastre.
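A minimal sketch of the parcel-to-register linkage, the one-to-one relation the abstract describes, with hypothetical column names and placeholder geometries:

```python
import pandas as pd

# hypothetical extracts: vectorized map parcels and property-register entries
parcels = pd.DataFrame({"parcel_id": [101, 102],
                        "geometry_wkt": ["POLYGON((...))", "POLYGON((...))"]})
register = pd.DataFrame({"parcel_id": [101, 102],
                         "crop": ["vineyard", "wheat"],
                         "owner_class": ["smallholder", "church"]})

# one row per parcel, joining cadastral geometry with historical attributes;
# validate="one_to_one" enforces the univocal parcel-register relation
gis_table = parcels.merge(register, on="parcel_id", validate="one_to_one")
print(gis_table)
```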
Evaluation of Marine Corps Manpower Computer Simulation Model
2016-12-01
merit-based promotion selection that is in conjunction with the "up or out" manpower system. To ensure mission accomplishment within M&RA, it is...historical data the MSM pulls from an online Oracle database. Two types of database pulls occur here: acquiring historical data of manpower pyramid...is based on the assumption that the historical manpower progression is constant, and therefore is controllable. This unfortunately does not marry
Data-based comparisons of moments estimators using historical and paleoflood data
England, J.F.; Jarrett, R.D.; Salas, J.D.
2003-01-01
This paper presents the first systematic comparison, using historical and paleoflood data, of moments-based flood frequency methods. Peak flow estimates were compiled from streamflow-gaging stations with historical and/or paleoflood data at 36 sites located in the United States, Argentina, United Kingdom and China, covering a diverse range of hydrologic conditions. The Expected Moments Algorithm (EMA) and the Bulletin 17B historical weighting procedure (B17H) were compared in terms of goodness of fit using 25 of the data sets. Results from this comparison indicate that EMA is a viable alternative to current B17H procedures from an operational perspective, and performed equal to or better than B17H for the data analyzed. We demonstrate satisfactory EMA performance for the remaining 11 sites with multiple thresholds and binomial censoring, which B17H cannot accommodate. It is shown that the EMA estimator readily incorporates these types of information and the LP-III distribution provided an adequate fit to the data in most cases. The results shown here are consistent with Monte Carlo simulation studies, and demonstrate that EMA is preferred overall to B17H. The Bulletin 17B document could be revised to include an option for EMA as an alternative to the existing historical weighting approach. These results are of practical relevance to hydrologists and water resources managers for applications in floodplain management, design of hydraulic structures, and risk analysis for dams. © 2003 Elsevier Science B.V. All rights reserved.
Data-based comparisons of moments estimators using historical and paleoflood data
NASA Astrophysics Data System (ADS)
England, John F.; Jarrett, Robert D.; Salas, José D.
2003-07-01
This paper presents the first systematic comparison, using historical and paleoflood data, of moments-based flood frequency methods. Peak flow estimates were compiled from streamflow-gaging stations with historical and/or paleoflood data at 36 sites located in the United States, Argentina, United Kingdom and China, covering a diverse range of hydrologic conditions. The Expected Moments Algorithm (EMA) and the Bulletin 17B historical weighting procedure (B17H) were compared in terms of goodness of fit using 25 of the data sets. Results from this comparison indicate that EMA is a viable alternative to current B17H procedures from an operational perspective, and performed equal to or better than B17H for the data analyzed. We demonstrate satisfactory EMA performance for the remaining 11 sites with multiple thresholds and binomial censoring, which B17H cannot accommodate. It is shown that the EMA estimator readily incorporates these types of information and the LP-III distribution provided an adequate fit to the data in most cases. The results shown here are consistent with Monte Carlo simulation studies, and demonstrate that EMA is preferred overall to B17H. The Bulletin 17B document could be revised to include an option for EMA as an alternative to the existing historical weighting approach. These results are of practical relevance to hydrologists and water resources managers for applications in floodplain management, design of hydraulic structures, and risk analysis for dams.
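For context, a minimal sketch of an at-site log-Pearson Type III quantile estimate by simple moments follows. EMA's contribution, not shown here, is to fold historical/paleoflood intervals and censoring thresholds into the moment estimates:

```python
import numpy as np
from scipy import stats

def lp3_quantile(peak_flows, aep):
    """Log-Pearson III flood quantile for an annual exceedance probability (AEP),
    fit to systematic peak flows only (no historical weighting; illustrative)."""
    logq = np.log10(np.asarray(peak_flows, dtype=float))
    mu, sigma = logq.mean(), logq.std(ddof=1)
    g = stats.skew(logq, bias=False)
    k = stats.pearson3.ppf(1.0 - aep, g)    # standardized frequency factor
    return 10 ** (mu + k * sigma)

# e.g. the "100-year flood" (AEP = 0.01) from a gauged annual-peak record
peaks = [850, 1200, 640, 2100, 980, 1500, 760, 1900, 1100, 880]  # hypothetical, m^3/s
print(lp3_quantile(peaks, 0.01))
```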
Geospatial Multi-Agency Coordination (GeoMAC) wildland fire perimeters, 2008
Walters, Sandra P.; Schneider, Norma J.; Guthrie, John D.
2011-01-01
The Geospatial Multi-Agency Coordination (GeoMAC) has been collecting and storing data on wildland fire perimeters since August 2000. The dataset presented via this U.S. Geological Survey Data Series product contains the GeoMAC wildland fire perimeter data for the calendar year 2008, which are based upon input from incident intelligence sources, Global Positioning System (GPS) data, and infrared (IR) imagery. Wildland fire perimeter data are obtained from the incidents, evaluated for completeness and accuracy, and processed to reflect consistent field names and attributes. After a quality check, the perimeters are loaded to GeoMAC databases, which support the GeoMAC Web application for access by wildland fire managers and the public. The wildland fire perimeters are viewed through the Web application. The data are subsequently archived according to year and state and are made available for downloading through the Internet in shapefile and Keyhole Markup Language (KML) format. These wildland fire perimeter data are also retained for historical, planning, and research purposes. The datasets that pertain to this report can be found on the Rocky Mountain Geographic Science Center HTTP site at http://rmgsc.cr.usgs.gov/outgoing/GeoMAC/historic_fire_data/. The links are also provided on the sidebar.
Study of Mobile GIS Application on the Field of GPR in the Road Disease Detection
NASA Astrophysics Data System (ADS)
Liao, Q.; Yang, F.
2013-12-01
Exploiting the reflection of pulsed electromagnetic waves, ground penetrating radar (GPR) can measure the depth of pavement layers and reveal different hidden hazards underground. GPR has been widely used in road engineering, with steadily improving ability to detect and diagnose road diseases. The road disease data of a region, a city, or an even wider area add up to a very informative database, so a more convenient way to query the data intuitively is needed. As the mobile internet develops, applications on mobile terminal devices play an increasingly important role in information platforms. Mobile GIS uses the smartphone as its terminal, is supported by the mobile internet, and uses GPS or cellular base stations for positioning. In this article, a location-based-service (LBS) application for road disease information, built on the Android platform in a client/server (C/S) pattern and integrating the Baidu Map API with database technology, is discussed. In testing, it can display and query real-time and historical road disease data, and the classification of those data, on a phone intuitively and easily. Because of the positioning capability and high portability of smartphones, spot investigations of road diseases become easier. The system still needs further improvement; in particular, as mobile phone performance improves, analysis functions for the disease data can be added, forming a more widely applicable service system.
Study of Automatic Image Rectification and Registration of Scanned Historical Aerial Photographs
NASA Astrophysics Data System (ADS)
Chen, H. R.; Tseng, Y. H.
2016-06-01
Historical aerial photographs directly provide good evidence of past times. The Research Center for Humanities and Social Sciences (RCHSS) of Taiwan's Academia Sinica has collected and scanned numerous historical maps and aerial images of Taiwan and China. Some maps or images have been geo-referenced manually, but most historical aerial images have not been registered, since no GPS or IMU data were available in the past to assist with orientation. In our research, we developed an automatic process for matching historical aerial images with SIFT (Scale Invariant Feature Transform), to handle the great quantity of images by computer vision. SIFT is one of the most popular methods of image feature extraction and matching. The algorithm turns extreme values in scale space into invariant image features, which are robust to changes in rotation, scale, noise, and illumination. We also use RANSAC (Random Sample Consensus) to remove outliers and obtain good conjugate points between photographs. Finally, we manually add control points for registration through a least-squares adjustment based on the collinearity equations. In the future, we can use the image feature points of more photographs to build a control image database. Every new image will be treated as a query image: if the feature points of the query image match features in the database, the query image probably overlaps the control images. As the database is updated, more and more query images can be matched and aligned automatically. Other research on environmental changes across time periods can then be investigated with these geo-referenced temporal-spatial data.
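A minimal OpenCV sketch of the SIFT-plus-RANSAC matching step described above (file names are hypothetical; the paper's full pipeline, including the manual control points and least-squares adjustment, is not shown):

```python
import cv2
import numpy as np

# load two scanned aerial photographs (hypothetical file names)
img1 = cv2.imread("query.tif", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("control.tif", cv2.IMREAD_GRAYSCALE)

# SIFT feature extraction
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# ratio-test matching, then a RANSAC homography to reject outlier matches
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print("inlier conjugate points:", int(inlier_mask.sum()))
```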
Wood, David B.
2018-03-14
Rock samples have been collected, analyzed, and interpreted from drilling and mining operations at the Nevada National Security Site for over one-half of a century. Records containing geologic and hydrologic analyses and interpretations have been compiled into a series of databases. Rock samples have been photographed and thin sections scanned. Records and images are preserved and available for public viewing and downloading at the U.S. Geological Survey ScienceBase, Mercury Core Library and Data Center Web site at https://www.sciencebase.gov/mercury/ and documented in U.S. Geological Survey Data Series 297. Example applications of these data and images are provided in this report.
Estimating economic losses from earthquakes using an empirical approach
Jaiswal, Kishor; Wald, David J.
2013-01-01
We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
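A minimal sketch of the empirical shape such country-specific loss models typically take: a loss ratio modelled as a lognormal CDF of shaking intensity, applied to GDP-scaled exposure per intensity bin. The parameter values below are hypothetical; PAGER's calibrated coefficients are published separately.

```python
import numpy as np
from scipy.stats import norm

def loss_ratio(mmi, theta, beta):
    """Empirical loss ratio as a lognormal CDF of shaking intensity (MMI).
    theta (median intensity) and beta (spread) are country-specific parameters
    calibrated against historical losses; the values used below are hypothetical."""
    return norm.cdf(np.log(np.asarray(mmi, dtype=float) / theta) / beta)

# total loss = sum over intensity bins of ratio(MMI) * GDP-scaled exposure
mmi_bins = np.array([6.5, 7.5, 8.5])
exposure = np.array([2.0e9, 8.0e8, 1.5e8])    # hypothetical exposure per bin, USD
print(float(np.sum(loss_ratio(mmi_bins, theta=9.0, beta=0.30) * exposure)))
```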
Historic BIM: a New Repository for Structural Health Monitoring
NASA Astrophysics Data System (ADS)
Banfi, F.; Barazzetti, L.; Previtali, M.; Roncoroni, F.
2017-05-01
Recent developments in Building Information Modelling (BIM) technologies are facilitating the management of historic complex structures using new applications. This paper proposes a generative method combining the morphological and typological aspects of historic buildings (H-BIM) with a set of monitoring information. This combination of 3D digital survey, parametric modelling and monitoring datasets allows for the development of a system for archiving and visualizing structural health monitoring (SHM) data. The availability of a BIM database allows one to integrate different kinds of data stored in different ways (e.g. reports, tables, graphs, etc.) with a representation directly connected to the 3D model of the structure with appropriate levels of detail (LoD). Data can be interactively accessed by selecting specific objects of the BIM, i.e. connecting the 3D position of the installed sensors with additional digital documentation. Such innovative BIM objects, which form a new BIM family for SHM, can then be reused in other projects, facilitating data archiving and the exploitation of the data acquired and processed. The application of advanced modelling techniques reduces the time and costs of the generation process and supports cooperation between different disciplines using a central workspace. However, it also reveals new challenges for parametric software and exchange formats. The case study presented is the medieval bridge Azzone Visconti in Lecco (Italy), in which multi-temporal vertical movements during load testing were integrated into H-BIM.
Multi-Media and Databases for Historical Enquiry: A Report from the Trenches
ERIC Educational Resources Information Center
Hillis, Peter
2003-01-01
The Victorian period produced a diverse and rich range of historical source materials including census returns, photographs, film, personal reminiscences, music, cartoons, and posters. Recent changes to the history curriculum emphasise the acquisition of enquiry skills alongside developing knowledge and understanding, which necessitates reference…
Harris, Eric S J; Erickson, Sean D; Tolopko, Andrew N; Cao, Shugeng; Craycroft, Jane A; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E; Eisenberg, David M
2011-05-17
Ethnobotanically driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically driven natural product collection and drug-discovery programs. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Harris, Eric S. J.; Erickson, Sean D.; Tolopko, Andrew N.; Cao, Shugeng; Craycroft, Jane A.; Scholten, Robert; Fu, Yanling; Wang, Wenquan; Liu, Yong; Zhao, Zhongzhen; Clardy, Jon; Shamu, Caroline E.; Eisenberg, David M.
2011-01-01
Aim of the study. Ethnobotanically-driven drug-discovery programs include data related to many aspects of the preparation of botanical medicines, from initial plant collection to chemical extraction and fractionation. The Traditional Medicine-Collection Tracking System (TM-CTS) was created to organize and store data of this type for an international collaborative project involving the systematic evaluation of commonly used Traditional Chinese Medicinal plants. Materials and Methods. The system was developed using domain-driven design techniques, and is implemented using Java, Hibernate, PostgreSQL, Business Intelligence and Reporting Tools (BIRT), and Apache Tomcat. Results. The TM-CTS relational database schema contains over 70 data types, comprising over 500 data fields. The system incorporates a number of unique features that are useful in the context of ethnobotanical projects such as support for information about botanical collection, method of processing, quality tests for plants with existing pharmacopoeia standards, chemical extraction and fractionation, and historical uses of the plants. The database also accommodates data provided in multiple languages and integration with a database system built to support high throughput screening based drug discovery efforts. It is accessed via a web-based application that provides extensive, multi-format reporting capabilities. Conclusions. This new database system was designed to support a project evaluating the bioactivity of Chinese medicinal plants. The software used to create the database is open source, freely available, and could potentially be applied to other ethnobotanically-driven natural product collection and drug-discovery programs. PMID:21420479
Basal metabolic rate studies in humans: measurement and development of new equations.
Henry, C J K
2005-10-01
To facilitate the Joint FAO/WHO/UNU (Food and Agriculture Organization/World Health Organization/United Nations University) Expert Consultation on Energy and Protein Requirements, which met in Rome in 1981, Schofield et al. reviewed the literature and produced predictive equations for both sexes for the following ages: 0-3, 3-10, 10-18, 18-30, 30-60 and >60 years. These formed the basis for the equations used in the 1985 FAO/WHO/UNU document, Energy and Protein Requirements. While Schofield's analysis has served a significant role in re-establishing the importance of using basal metabolic rate (BMR) to predict human energy requirements, recent workers have subsequently queried the universal validity and application of these equations. A survey of the most recent studies (1980-2000) in BMR suggests that in most cases the current FAO/WHO/UNU predictive equations overestimate BMR in many communities. The FAO/WHO/UNU equations to predict BMR were developed using a database that contained a disproportionate number of Italian subjects (3388 out of 7173, or 47%); the Schofield database contained relatively few subjects from the tropical region. The objective here is to review the historical development in the measurement and application of BMR and to critically review the Schofield et al. BMR database, presenting a series of new equations to predict BMR. This division, while arbitrary, will enable readers who wish to omit the historical review of BMR to concentrate on the evolution of the new BMR equations. BMR data were collected from published and measured values. A series of new equations (the Oxford equations) has been developed using a data set of 10,552 BMR values that (1) excluded all the Italian subjects and (2) included a much larger number (4018) of people from the tropics. In general, the Oxford equations tend to produce lower BMR values than the current FAO/WHO/UNU equations in males aged 18-30 and 30-60 years and in all females over 18 years of age. This is an opportune moment to re-examine the role and place of BMR measurements in estimating total energy requirements today. The Oxford equations' future use and application will surely depend on their ability to predict more accurately the BMR in contemporary populations.
THE ART OF DATA MINING THE MINEFIELDS OF TOXICITY DATABASES TO LINK CHEMISTRY TO BIOLOGY
Toxicity databases have a special role in predictive toxicology, providing ready access to historical information throughout the workflow of discovery, development, and product safety processes in drug development as well as in review by regulatory agencies. To provide accurate i...
El-Gharbaoui, Asmae; Benítez, Guillermo; González-Tejero, M Reyes; Molero-Mesa, Joaquín; Merzouki, Abderrahmane
2017-04-18
Transmission of traditional knowledge over time and across culturally and historically related territories is an important topic in ethnopharmacology. Here, we contribute to this knowledge by analysing data on medicinal uses in two neighbouring areas of the Western Mediterranean in relation to a historical text that has scarcely been mentioned in historical studies despite its interest. This paper discusses the sharing of popular knowledge on the medicinal uses of plants between eastern Morocco and eastern Andalusia (Spain), focusing on one of the most useful plant families in the Mediterranean area: Lamiaceae. Moreover, we used the classical work of Ibn al-Baytar (13th century CE), The Compendium of Simple Medicaments and Foods, as a basis to contrast the possible link of this information, analysing the influence of this historical text on the current popular tradition of medicinal plant use in both territories. For data collection, we performed ethnobotanical field research in the eastern part of Morocco, recording current medicinal uses of the Lamiaceae. In addition, we systematically reviewed the ethnobotanical literature from eastern Andalusia, developing a database. We investigated the possible historical link of the shared uses and included in this database the information from Ibn al-Baytar's Compendium. To compare the similarity and diversity of the data, we used Jaccard's similarity index. Our field work provided ethnobotanical information for 14 Lamiaceae species with 95 medicinal uses, serving to treat 13 different pathological groups. Of the total uses recorded in Morocco, 30.5% were shared by eastern Andalusia and found in Ibn al-Baytar's work. There was a higher similarity when comparing current uses of the geographically close territories of eastern Morocco and eastern Andalusia (64%) than for eastern Morocco and this historical text (43%). On the other hand, coincidences between current uses in eastern Andalusia and those recorded in the Compendium are lower (28%). The coincidence of the current ethnobotanical knowledge in the two territories is high for the Lamiaceae. The shared historical background, recent exchanges, information flow, and the influence of historical herbal texts have probably contributed to this coincidence. In this sense, there is a high plant-use overlap between Ibn al-Baytar's text and both territories: nearly half of the uses currently shared by eastern Morocco and eastern Andalusia were included in the Compendium and are related to this period of Islamic medicine, indicating a high level of preservation in the knowledge of plant usage. The study of 14 species of Lamiaceae suggests that this classical codex, which includes a high number of medicinal plants and uses, constitutes a valuable bibliographical source for comparing ancient and modern applications of plants. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
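For reference, Jaccard's similarity index on sets of recorded uses is simple to compute; a minimal sketch with hypothetical use sets (the study's actual use lists are not reproduced here):

```python
def jaccard(a, b):
    """Jaccard similarity: size of the intersection over size of the union."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# hypothetical medicinal-use sets for one Lamiaceae species in two territories
morocco   = {"digestive", "cold", "headache", "diabetes"}
andalusia = {"digestive", "cold", "wounds"}
print(jaccard(morocco, andalusia))  # 0.4
```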
Perryman, Sarah A. M.; Castells-Brooke, Nathalie I. D.; Glendining, Margaret J.; Goulding, Keith W. T.; Hawkesford, Malcolm J.; Macdonald, Andy J.; Ostler, Richard J.; Poulton, Paul R.; Rawlings, Christopher J.; Scott, Tony; Verrier, Paul J.
2018-01-01
The electronic Rothamsted Archive, e-RA (www.era.rothamsted.ac.uk) provides a permanent managed database to both securely store and disseminate data from Rothamsted Research’s long-term field experiments (since 1843) and meteorological stations (since 1853). Both historical and contemporary data are made available via this online database, which provides the scientific community with access to a unique continuous record of agricultural experiments and weather measured since the mid-19th century. Qualitative information, such as treatment and management practices, plans and soil information, accompanies the data and is made available on the e-RA website. e-RA was released externally to the wider scientific community in 2013 and this paper describes its development, content, curation and the access process for data users. Case studies illustrate the diverse applications of the data, including its original intended purposes and recent unforeseen applications. Usage monitoring demonstrates the data are of increasing interest. Future developments, including adopting FAIR data principles, are proposed as the resource is increasingly recognised as a unique archive of data relevant to sustainable agriculture, agroecology and the environment. PMID:29762552
Near-realtime Cosmic Ray measurements for space weather applications
NASA Astrophysics Data System (ADS)
Steigies, C. T.
2013-12-01
In its FP7 program the European Commission has funded the creation of scientific databases. One successful project is the Neutron Monitor database NMDB, which provides near-real-time access to ground-based Neutron Monitor measurements. In its beginning NMDB hosted only data from European and Asian participants, but it has recently grown to also include data from North American stations. We are currently working on also providing data from Australian stations. With the increased coverage of stations, the accuracy of the NMDB applications that issue alerts of ground level enhancements (GLE) or predict the arrival of coronal mass ejections (CME) is constantly improving. Besides the Cosmic Ray community and airlines, which want to calculate radiation doses on flight routes, NMDB has also attracted users from outside the core field, for example hydrologists who compare local neutron measurements with data from NMDB to determine soil humidity. By providing access to data from 50 stations, NMDB already includes data from the majority of the currently operating stations. However, in the future we want to include data from the few remaining stations, as well as historical data from stations that have been shut down.
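A toy sketch of the kind of multi-station alert logic such a real-time database enables follows; the station names, count rates, and thresholds are hypothetical, not NMDB's operational algorithm.

```python
def gle_alert(latest, baseline, threshold=0.05, min_stations=3):
    """Flag a possible ground level enhancement when enough stations
    simultaneously exceed their quiet-time baseline count rates."""
    elevated = [st for st in latest
                if latest[st] > (1.0 + threshold) * baseline[st]]
    return len(elevated) >= min_stations, elevated

# hypothetical one-minute count rates (counts/s) vs. quiet-time baselines
baseline = {"OULU": 105.0, "KIEL": 160.0, "ATHN": 20.0, "MXCO": 28.0}
latest   = {"OULU": 117.0, "KIEL": 171.0, "ATHN": 21.5, "MXCO": 28.2}
print(gle_alert(latest, baseline))   # (True, ['OULU', 'KIEL', 'ATHN'])
```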
Livingston, Kara A.; Chung, Mei; Sawicki, Caleigh M.; Lyle, Barbara J.; Wang, Ding Ding; Roberts, Susan B.; McKeown, Nicola M.
2016-01-01
Background Dietary fiber is a broad category of compounds historically defined as partially or completely indigestible plant-based carbohydrates and lignin with, more recently, the additional criteria that fibers incorporated into foods as additives should demonstrate functional human health outcomes to receive a fiber classification. Thousands of research studies have been published examining fibers and health outcomes. Objectives (1) Develop a database listing studies testing fiber and physiological health outcomes identified by experts at the Ninth Vahouny Conference; (2) Use evidence mapping methodology to summarize this body of literature. This paper summarizes the rationale, methodology, and resulting database. The database will help both scientists and policy-makers to evaluate evidence linking specific fibers with physiological health outcomes, and identify missing information. Methods To build this database, we conducted a systematic literature search for human intervention studies published in English from 1946 to May 2015. Our search strategy included a broad definition of fiber search terms, as well as search terms for nine physiological health outcomes identified at the Ninth Vahouny Fiber Symposium. Abstracts were screened using a priori defined eligibility criteria and a low threshold for inclusion to minimize the likelihood of rejecting articles of interest. Publications then were reviewed in full text, applying additional a priori defined exclusion criteria. The database was built and published on the Systematic Review Data Repository (SRDR™), a web-based, publicly available application. Conclusions A fiber database was created. This resource will reduce the unnecessary replication of effort in conducting systematic reviews by serving as both a central database archiving PICO (population, intervention, comparator, outcome) data on published studies and as a searchable tool through which this data can be extracted and updated. PMID:27348733
Thermal Performance Data Services (TPDS)
NASA Technical Reports Server (NTRS)
French, Richard T.; Wright, Michael J.
2013-01-01
Initiated as a NASA Engineering and Safety Center (NESC) assessment in 2009, the Thermal Performance Database (TPDB) was a response to the need for a centralized thermal performance data archive. The assessment was renamed Thermal Performance Data Services (TPDS) in 2012; the undertaking has had two fronts of activity: the development of a repository software application and the collection of historical thermal performance data sets from dispersed sources within the thermal performance community. This assessment has delivered a foundational tool on which additional features should be built to increase efficiency, expand the protection of critical Agency investments, and provide new discipline-advancing work opportunities. This report contains the information from the assessment.
Emissivity Results on High Temperature Coatings for Refractory Composite Materials
NASA Technical Reports Server (NTRS)
Ohlhorst, Craig W.; Vaughn, Wallace L.; Daryabeigi, Kamran; Lewis, Ronald K.; Rodriguez, Alvaro C.; Milhoan, James D.; Koenig, John R.
2007-01-01
The directional emissivity of various refractory composite materials considered for application on reentry and hypersonic vehicles was investigated. The directional emissivity was measured at elevated temperatures of up to 3400 °F using a directional spectral radiometric technique during arc-jet test runs. A laboratory-based relative total radiance method was also used to measure the total normal emissivity of some of the refractory composite materials. The data from the two techniques are compared. The paper will also compare the historical database of Reinforced Carbon-Carbon emissivity measurements with emissivity values generated recently on the material using the two techniques described in the paper.
Integrated Historical Tsunami Event and Deposit Database
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; McCullough, H. L.
2010-12-01
The National Geophysical Data Center (NGDC) provides integrated access to historical tsunami event, deposit, and proxy data. The NGDC tsunami archive initially listed tsunami sources and locations with observed tsunami effects. Tsunami frequency and intensity are important for understanding tsunami hazards. Unfortunately, tsunami recurrence intervals often exceed the historic record. As a result, NGDC expanded the archive to include the Global Tsunami Deposits Database (GTD_DB). Tsunami deposits are the physical evidence left behind when a tsunami impacts a shoreline or affects submarine sediments. Proxies include co-seismic subsidence, turbidite deposits, changes in biota following an influx of marine water in a freshwater environment, etc. By adding past tsunami data inferred from the geologic record, the GTD_DB extends the record of tsunamis backward in time. Although the best methods for identifying tsunami deposits and proxies in the geologic record remain under discussion, developing an overall picture of where tsunamis have affected coasts, calculating recurrence intervals, and approximating runup height and inundation distance provides a better estimate of a region’s true tsunami hazard. Tsunami deposit and proxy descriptions in the GTD_DB were compiled from published data found in journal articles, conference proceedings, theses, books, conference abstracts, posters, web sites, etc. The database now includes over 1,200 descriptions compiled from over 1,100 citations. Each record in the GTD_DB is linked to its bibliographic citation where more information on the deposit can be found. The GTD_DB includes data for over 50 variables such as: event description (e.g., 2010 Chile Tsunami), geologic time period, year, deposit location name, latitude, longitude, country, associated body of water, setting during the event (e.g., beach, lake, river, deep sea), upper and lower contacts, underlying and overlying material, etc. If known, the tsunami source mechanism (e.g., earthquake, landslide, volcanic eruption, asteroid impact) is also specified. Observations (grain size, sedimentary structure, bed thickness, number of layers, etc.) are stored along with the conclusions drawn from the evidence by the author (wave height, flow depth, flow velocity, number of waves, etc.). Geologic time periods in the GTD_DB range from Precambrian to Quaternary, but the majority (70%) are from the Quaternary period. This period includes events such as: the 2004 Indian Ocean tsunami, the Cascadia subduction zone earthquakes and tsunamis, the 1755 Lisbon tsunami, the A.D. 79 Vesuvius tsunami, the 3500 BP Santorini caldera collapse and tsunami, and the 7000 BP Storegga landslide-generated tsunami. Prior to the Quaternary period, the majority of the paleotsunamis are due to impact events such as: the Tertiary Chesapeake Bay Bolide, Cretaceous-Tertiary (K/T) Boundary, Cretaceous Manson, and Devonian Alamo. The tsunami deposits are integrated with the historical tsunami event database where applicable. For example, users can search for articles describing deposits related to the 1755 Lisbon tsunami and view those records, as well as link to the related historic event record. The data and information may be viewed using tools designed to extract and display data (selection forms, Web Map Services, and Web Feature Services).
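To make the linked structure concrete, here is a minimal sketch of how a deposit record tied to a historic event could be stored and queried. The table and column names are invented for illustration, since the abstract does not publish the actual GTD_DB schema:

```python
import sqlite3

# Hypothetical, simplified schema: deposits reference the historic event
# they are attributed to, mirroring the event/deposit linkage described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tsunami_event (event_id INTEGER PRIMARY KEY, year INTEGER, name TEXT);
CREATE TABLE deposit (
    deposit_id INTEGER PRIMARY KEY,
    event_id   INTEGER REFERENCES tsunami_event(event_id),
    location   TEXT, latitude REAL, longitude REAL,
    setting    TEXT, citation TEXT
);
INSERT INTO tsunami_event VALUES (1, 1755, '1755 Lisbon Tsunami');
INSERT INTO deposit VALUES (1, 1, 'Algarve coast', 37.1, -8.0, 'beach', 'Example et al.');
""")

# Cross-database style query: find deposits linked to a named historic event.
rows = conn.execute("""
    SELECT d.location, d.setting, d.citation
    FROM deposit d JOIN tsunami_event e ON d.event_id = e.event_id
    WHERE e.name LIKE '%Lisbon%'
""").fetchall()
print(rows)
```

Keying each deposit to its source event is what enables the cross-database searches described above, such as retrieving every deposit record attributed to the 1755 Lisbon tsunami along with its bibliographic citation.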
Published toxicity results are reviewed for oils, dispersants and dispersed oils and aquatic plants. The historical phytotoxicity database consists largely of results from a patchwork of research conducted after oil spills to marine waters. Toxicity information is available for ...
NASA Astrophysics Data System (ADS)
Baik, A.; Yaagoubi, R.; Boehm, J.
2015-08-01
This work outlines a new approach for the integration of 3D Building Information Modelling (BIM) and the 3D Geographic Information System (GIS) to provide semantically rich models, and to draw on the benefits of both systems to help document and analyse cultural heritage sites. Our proposed framework is based on the Jeddah Historical Building Information Modelling process (JHBIM). This JHBIM consists of a Hijazi Architectural Objects Library (HAOL) that supports a higher level of detail (LoD) while decreasing modelling time. The Hijazi Architectural Objects Library has been modelled based on Islamic historical manuscripts and Hijazi architectural pattern books. Moreover, the HAOL is implemented using the BIM software Autodesk Revit; however, this BIM environment still has some limitations with non-standard architectural objects. Hence, we propose to integrate the developed 3D JHBIM with 3D GIS for more advanced analysis. To do so, the JHBIM database is exported and semantically enriched with the non-architectural information that is necessary for the restoration and preservation of historical monuments. After that, this database is integrated with the 3D model in the 3D GIS solution. At the end of this paper, we illustrate our proposed framework by applying it to Nasif Historical House, a historical building in Old Jeddah chosen as a test case for the project. First, the building was scanned using a Terrestrial Laser Scanner (TLS) and close-range photogrammetry. Then, the 3D JHBIM based on the HAOL was designed on the Revit platform. Finally, this model was integrated into a 3D GIS solution through Autodesk InfraWorks. The analysis presented in this research highlights the importance of such integration, especially for operational decisions and for sharing historical knowledge about Jeddah Historical City.
NASA Astrophysics Data System's New Data
NASA Astrophysics Data System (ADS)
Eichhorn, G.; Accomazzi, A.; Demleitner, M.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.
2000-05-01
The NASA Astrophysics Data System has greatly increased its data holdings. The Physics database now contains almost 900,000 references, the Astronomy database almost 550,000 references, and the Instrumentation database almost 600,000 references. The number of scanned articles in the ADS Article Service is increasing continuously; almost 1 million pages have been scanned so far. Recently the abstract books from the Lunar and Planetary Science Conference were scanned and put on-line. The Monthly Notices of the Royal Astronomical Society is currently being scanned back to Volume 1; this is the last major journal to be completely scanned and on-line. In cooperation with a conservation project of the Harvard libraries, microfilms of historical observatory literature are currently being scanned. This will provide access to an important part of the historical literature. The ADS can be accessed at http://adswww.harvard.edu. This project is funded by NASA under grant NCC5-189.
GEOGRAPHIC NAMES INFORMATION SYSTEM (GNIS) ...
The Geographic Names Information System (GNIS), developed by the U.S. Geological Survey in cooperation with the U.S. Board on Geographic Names (BGN), contains information about physical and cultural geographic features in the United States and associated areas, both current and historical, but not including roads and highways. The database also contains geographic names in Antarctica. The database holds the Federally recognized name of each feature and defines the location of the feature by state, county, USGS topographic map, and geographic coordinates. Other feature attributes include names or spellings other than the official name, feature designations, feature class, historical and descriptive information, and, for some categories of features, the geometric boundaries. The database assigns each feature a unique identifier, a random number that serves as the key for accessing, integrating, or reconciling GNIS data with other data sets. The GNIS is our Nation's official repository of domestic geographic feature names information.
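A hedged sketch of the integration role that feature identifier plays: any external table carrying the same GNIS ID can be reconciled with a simple key join. The IDs and the gauge table below are made up for illustration:

```python
import pandas as pd

# Illustrative only: GNIS-style records keyed by the permanent feature identifier.
gnis = pd.DataFrame({
    "feature_id": [561501, 1802710],            # made-up identifiers
    "feature_name": ["Example River", "Example Summit"],
    "feature_class": ["Stream", "Summit"],
})

# A hypothetical external data set that also carries the GNIS feature ID.
gauge_sites = pd.DataFrame({
    "feature_id": [561501],
    "gauge_code": ["GAUGE-01"],
})

# The unique identifier is the key for integrating GNIS with other data sets.
merged = gnis.merge(gauge_sites, on="feature_id", how="left")
print(merged)
```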
NASA Astrophysics Data System (ADS)
Julius, Musa, Admiral; Pribadi, Sugeng; Muzli, Muzli
2018-03-01
Sulawesi, one of the biggest islands in Indonesia, is located at the convergence of two macro-plates, Eurasia and the Pacific. NOAA and the Novosibirsk Tsunami Laboratory record more than 20 tsunamis in Sulawesi since 1820. Based on these data, the correlation between tsunami and earthquake parameters needs to be determined in order to verify all past events. The magnitude, fault-size and tsunami-height data in this study are sourced from the NOAA and Novosibirsk tsunami databases, supplemented with the Pacific Tsunami Warning Center (PTWC) catalog. This study aims to find the correlation between moment magnitude, fault size and tsunami height by simple regression. The steps of this research are data collection, processing, and regression analysis. Results show that moment magnitude, fault size and tsunami height are strongly correlated. This analysis is sufficient to confirm the accuracy of the historical tsunami database for Sulawesi held by NOAA, the Novosibirsk Tsunami Laboratory and the PTWC.
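As an illustration of the method, a simple linear regression of tsunami height on moment magnitude takes only a few lines; the values below are invented stand-ins, not the NOAA/Novosibirsk catalog data:

```python
import numpy as np
from scipy.stats import linregress

# Illustrative values only, not the actual Sulawesi catalog entries.
mw     = np.array([6.3, 6.9, 7.2, 7.5, 7.9, 8.1])  # moment magnitude
height = np.array([0.4, 1.1, 1.8, 2.5, 4.0, 5.2])  # observed tsunami height (m)

# Simple regression of tsunami height on moment magnitude, the same kind of
# fit the study applies between catalog parameters.
fit = linregress(mw, height)
print(f"slope={fit.slope:.2f} m per magnitude unit, r={fit.rvalue:.2f}")
```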
NASA Astrophysics Data System (ADS)
Magi, B. I.; Marlon, J. R.; Mouillot, F.; Daniau, A. L.; Bartlein, P. J.; Schaefer, A.
2017-12-01
Fire is intertwined with climate variability and human activities in terms of both its causes and consequences, and the most complete understanding will require a multidisciplinary approach. The focus of this study is to compare data-based records of variability in climate and human activities with fire and land-cover change records over the past 250 years in North America and Europe. The past 250 years is a critical period for contextualizing the present-day impact of human activities on climate. Data are from the Global Charcoal Database (GCD) and from historical reconstructions of past burning. The GCD comprises sediment records of charcoal accumulation rates collected around the world by dozens of researchers, facilitated by the PAGES Global Paleofire Working Group. The historical reconstruction, which extends back to 1750 CE, is based on literature and government records when available, completed with non-charcoal proxies, including tree-ring scars or storylines, when data are missing. The key data sets are independent records, and the methods and results are independent of any climate- or fire-model simulations. Results are presented for Europe and for subsets of North America. Analysis of fire trends from the GCD and the historical reconstruction shows broad agreement, with some regional variations as expected. The western USA, and North America in general, show the best agreement, with departures between the GCD and historical-reconstruction fire trends in the present day that may reflect limits in the data themselves. Eastern North America shows agreement, with an increase in fire from 1750 to 1900 and a strong decreasing trend thereafter. We present ideas for why the trends agree and disagree relative to historical events and to the sequence of land-cover change in the regions of interest. Together with careful consideration of uncertainties in the data, these results can be used to constrain Earth System Model simulations of both past fire, which explicitly incorporate historical fire emissions, and the pathways of future fire on a warmer planet.
Reliability of the Defense Commissary Agency Personnel Property Database.
2000-02-18
Departments' personal property databases. The tests were designed to validate the personal property databases. This report is the second in a series of... However, some of the Military Departments had problems with the completeness of their data, and key data elements were not reliable for estimating the historical costs of real property for the Military... values of greater than $100,000.
WOVOdat - An online, growing library of worldwide volcanic unrest
NASA Astrophysics Data System (ADS)
Newhall, C. G.; Costa, F.; Ratdomopurbo, A.; Venezky, D. Y.; Widiwijayanti, C.; Win, Nang Thin Zar; Tan, K.; Fajiculay, E.
2017-10-01
The World Organization of Volcano Observatories (WOVO), with major support from the Earth Observatory of Singapore, is developing a web-accessible database of seismic, geodetic, gas, hydrologic, and other unrest from volcanoes around the world. This database, WOVOdat, is intended for reference during volcanic crises, comparative studies, basic research on pre-eruption processes, teaching, and outreach. Data are already processed to have physical meaning, e.g. earthquake hypocenters rather than voltages or arrival times, and are historical rather than real-time, ranging in age from a few days to several decades. Data from > 900 episodes of unrest covering > 75 volcanoes are already accessible. Users can visualize and compare changes from one episode of unrest or from one volcano to the next. As the database grows more complete, users will be able to analyze patterns of unrest in the same way that epidemiologists study the spatial and temporal patterns and associations among diseases. WOVOdat was opened for station and data visualization in August 2013, and now includes utilities for data downloads and Boolean searches. Many more data sets are being added, as well as utilities interfacing to new applications, e.g., the construction of event trees. For more details, please see www.wovodat.org.
Web application and database modeling of traffic impact analysis using Google Maps
NASA Astrophysics Data System (ADS)
Yulianto, Budi; Setiono
2017-06-01
Traffic impact analysis (TIA) is a traffic study that aims at identifying the impact of traffic generated by development or a change in land use. In addition to identifying the traffic impact, a TIA includes mitigation measures to minimize the resulting traffic impact. TIA has become increasingly important since it was defined in law as one of the requirements in the application for a Building Permit. The law has encouraged a number of TIA studies in various cities in Indonesia, including Surakarta. For that reason, it is necessary to study the development of TIA by adopting the concept of Transportation Impact Control (TIC) in the implementation of the TIA standard document and multimodal modeling. This includes TIA standardization for technical guidelines, a database, and inspection, by providing TIA checklists, monitoring and evaluation. The research was undertaken by collecting historical data on junctions, modeling the data in the form of a relational database, and building a user interface to create, read, update and delete (CRUD) the TIA data as a web application with the Google Maps libraries. The result of this research is a system that provides information supporting the improvement and revision of existing TIA documents, making them more transparent, reliable and credible.
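A minimal sketch of such a CRUD layer, assuming Flask and SQLite; the endpoint paths, field names and schema are hypothetical rather than taken from the Surakarta system, and the marker rendering via the Google Maps libraries would happen client-side against the Read endpoint:

```python
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB = "tia.db"

def db():
    conn = sqlite3.connect(DB)
    conn.row_factory = sqlite3.Row
    conn.execute("CREATE TABLE IF NOT EXISTS junction (name TEXT, lat REAL, lng REAL)")
    return conn

@app.route("/junctions", methods=["POST"])   # Create
def create_junction():
    j = request.get_json()
    with db() as conn:
        conn.execute("INSERT INTO junction (name, lat, lng) VALUES (?, ?, ?)",
                     (j["name"], j["lat"], j["lng"]))
    return jsonify(status="created"), 201

@app.route("/junctions", methods=["GET"])    # Read: rows a map client renders as markers
def list_junctions():
    rows = db().execute("SELECT name, lat, lng FROM junction").fetchall()
    return jsonify([dict(r) for r in rows])

# Update and Delete follow the same pattern with PUT and DELETE methods.
```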
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Health, Education, and Human Services Div.
This report to Congress analyzes student loan default rates at historically black colleges and universities (HBCUs), focusing on student characteristics which may predict the likelihood of default. The study examined available student databases for characteristics identified by previous studies as related to level of student loan defaults. Among…
ERIC Educational Resources Information Center
Thompson, Heather A.
This study is concerned with the importance of historical method in library and information science research. The research conducted in this study specifically examined library and information science doctoral dissertations written between 1984-1999. The study of the "Digital Dissertations" database found that only eight to seventeen percent of…
Duda, Jeffrey J.; Wieferich, Daniel J.; Bristol, R. Sky; Bellmore, J. Ryan; Hutchison, Vivian B.; Vittum, Katherine M.; Craig, Laura; Warrick, Jonathan A.
2016-08-18
The removal of dams has recently increased over historical levels due to aging infrastructure, changing societal needs, and modern safety standards rendering some dams obsolete. Where the possibilities for river restoration, or improved safety, exceed the benefits of retaining a dam, removal is more often being considered as a viable option. Yet, as this is a relatively new development in the history of river management, science is just beginning to guide our understanding of the physical and ecological implications of dam removal. Ultimately, the "lessons learned" from previous scientific studies on the outcomes of dam removal could inform future scientific understanding of ecosystem outcomes, as well as aid in decision-making by stakeholders. We created a database visualization tool, the Dam Removal Information Portal (DRIP), to display map-based, interactive information about the scientific studies associated with dam removals. Serving both as a bibliographic source and as a link to other existing databases like the National Hydrography Dataset, the derived National Dam Removal Science Database serves as the foundation for a Web-based application that synthesizes the existing scientific studies associated with dam removals. Thus, using the DRIP application, users can explore information about completed dam removal projects (for example, their location, height, and date removed), as well as discover the sources and details of associated scientific studies. As such, DRIP is intended to be a dynamic collection of scientific information related to dams that have been removed in the United States and elsewhere. This report describes the architecture and concepts of this "metaknowledge" database and the DRIP visualization tool.
Archive and Database as Metaphor: Theorizing the Historical Record
ERIC Educational Resources Information Center
Manoff, Marlene
2010-01-01
Digital media increase the visibility and presence of the past while also reshaping our sense of history. We have extraordinary access to digital versions of books, journals, film, television, music, art and popular culture from earlier eras. New theoretical formulations of database and archive provide ways to think creatively about these changes…
Listing of Education in Archaeological Programs: The LEAP Clearinghouse, 1987-1989 Summary Report.
ERIC Educational Resources Information Center
Knoll, Patricia C., Ed.
This catalog incorporates information gathered between 1987 and 1989 for inclusion into the National Park Service's Listing of Education in Archaeological Programs (LEAP) computerized database. This database is a listing of federal, state, local and private projects promoting positive public awareness of U.S. archaeology--prehistoric and historic,…
Unified Database Development Program. Final Report.
ERIC Educational Resources Information Center
Thomas, Everett L., Jr.; Deem, Robert N.
The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the power plant model validation process using disturbance recordings. The tool uses PMU and SCADA measurements as input information. It automatically adjusts all required EPCL scripts and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.
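The description above does not reproduce the GE PSLF playback format, so the sketch below only illustrates the generic first step such a converter needs: aligning high-rate PMU samples with slower SCADA samples on a common time base before export. All names and values are hypothetical:

```python
import pandas as pd

# PMU data arrive much faster (e.g., 30 samples/s) than SCADA (e.g., every 2 s);
# merge_asof pairs each PMU sample with the most recent SCADA value.
pmu = pd.DataFrame({
    "t": pd.to_datetime(["2020-01-01 00:00:00.000", "2020-01-01 00:00:00.033"]),
    "voltage_pu": [1.02, 1.01],
})
scada = pd.DataFrame({
    "t": pd.to_datetime(["2020-01-01 00:00:00"]),
    "mw_output": [250.0],
})

aligned = pd.merge_asof(pmu, scada, on="t")
aligned.to_csv("playback_input.csv", index=False)  # placeholder for the actual PSLF export
```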
The Use of Intensity Scales In Exploiting Tsunami Historical Databases
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Scheele, F.
2015-12-01
Post-disaster assessments for historical tsunami events (>15 years old) are either scarce or contain limited information. In this study, we assess ways to examine tsunami impacts by utilizing data from old events but, more importantly, we examine how to best utilize the information contained in historical tsunami databases in order to provide meaningful products that describe the impact of an event. To this end, a tsunami intensity scale was applied to two historical events observed in New Zealand (one local and one distant), in order to utilize the largest possible number of observations in our dataset. This is especially important for countries like New Zealand, where the historical tsunami record is short, going back only to the 19th century, and where instrument recordings are available only for the most recent events. We found that, despite a number of challenges in using intensities (uncertainties partly due to limitations of historical event data), these data, with the help of GIS tools, can be used to produce hazard maps and offer an alternative way to exploit historical tsunami records. Most importantly, the assignment of intensities at each point of observation allows for the utilization of many more observations than if one depends on physical information alone, such as water heights. We hope these results may be used towards developing a well-defined methodology for hazard assessments, and to refine our knowledge of past tsunami events for which the tsunami sources are largely unknown and for which the physical quantities describing the tsunami (e.g., water height, flood depth, run-up) are scarce.
[Implementation of Oncomelania hupensis monitoring system based on Baidu Map].
Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang
2017-10-25
To construct an Oncomelania hupensis snail monitoring system based on Baidu Map, basic environmental information about historical and existing snail environments was collected together with the monitoring data on different kinds of O. hupensis snails, and the snail monitoring system was then built. Geographic Information System (GIS) and electronic-fence technology, through the Baidu Map Application Program Interface (API), were applied to set up electronic fences around the snail surveillance environments, and the electronic fences were connected to the snail surveillance database. The O. hupensis snail monitoring system based on Baidu Map was built, comprising three modules: the O. hupensis Snail Monitoring Environmental Database, the Dynamic Monitoring Platform and the Electronic Map. The information from monitoring O. hupensis snails can be obtained through both computers and smartphones. The O. hupensis snail monitoring system based on Baidu Map is a visible platform for following the process of snail searching and molluscaciding.
WWW.NMDB.EU: The real-time Neutron Monitor database
NASA Astrophysics Data System (ADS)
Klein, Karl-Ludwig; Steigies, Christian T.; Wimmer-Schweingruber, Robert F.; Kudela, Karel; Strharsky, Igor; Langer, Ronald; Usoskin, Ilya; Ibragimov, Askar; Flückiger, Erwin O.; Bütikofer, Rolf; Eroshenko, Eugenia; Belov, Anatoly; Yanke, Victor; Fuller, Nicolas; Mavromichalaki, Helen; Papaioannou, Athanasios; Sarlanis, Christos; Souvatzoglou, George; Plainaki, Christina; Gerontidou, Maria; Papailiou, Maria-Christina; Mariatos, George; Chilingaryan, Ashot; Hovsepyan, G.; Reymers, Artur; Parisi, Mario; Kryakunova, Olga; Tsepakina, Irina; Nikolayevskiy, Nikolay; Dorman, Lev; Pustil'nik, Lev; García-Población, Oscar
The real-time database for high-resolution neutron monitor measurements (NMDB), which was supported by the 7th Framework Programme of the European Commission, hosts data on cosmic rays in the GeV range from European and some non-European neutron monitor stations. Besides real-time data and historical data spanning several decades in a unified format, it offers data products such as galactic cosmic ray spectra and applications including solar energetic particle alerts and the calculation of ionisation rates in the atmosphere and effective radiation dose rates at aircraft altitudes. Furthermore, the web site comprises public outreach pages in several languages and offers training material on cosmic rays for university students and for researchers and engineers who want to become familiar with cosmic rays and neutron monitor measurements. This contribution presents an overview of the provided services and indications on how to access the database. Operators of other neutron monitor stations are welcome to submit their data to NMDB.
Dynamic taxonomies applied to a web-based relational database for geo-hydrological risk mitigation
NASA Astrophysics Data System (ADS)
Sacco, G. M.; Nigrelli, G.; Bosio, A.; Chiarle, M.; Luino, F.
2012-02-01
In its 40 years of activity, the Research Institute for Geo-hydrological Protection of the Italian National Research Council has amassed a vast and varied collection of historical documentation on landslides, muddy-debris flows, and floods in northern Italy from 1600 to the present. Since 2008, the archive resources have been maintained through a relational database management system. The database is used for routine study and research purposes as well as for providing support during geo-hydrological emergencies, when data need to be quickly and accurately retrieved. Retrieval speed and accuracy are the main objectives of an implementation based on a dynamic taxonomies model. Dynamic taxonomies are a general knowledge management model for configuring complex, heterogeneous information bases that support exploratory searching. At each stage of the process, the user can explore or browse the database in a guided yet unconstrained way by selecting the alternatives suggested for further refining the search. Dynamic taxonomies have been successfully applied to such diverse and apparently unrelated domains as e-commerce and medical diagnosis. Here, we describe the application of dynamic taxonomies to our database and compare it to traditional relational database query methods. The dynamic taxonomy interface, essentially a point-and-click interface, is considerably faster and less error-prone than traditional form-based query interfaces that require the user to remember and type in the "right" search keywords. Finally, dynamic taxonomy users have confirmed that one of the principal benefits of this approach is the confidence of having considered all the relevant information. Dynamic taxonomies and relational databases work in synergy to provide fast and precise searching: one of the most important factors in timely response to emergencies.
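A minimal sketch of the dynamic-taxonomy idea in Python: after every user selection, the system recounts the remaining records under each facet value, so the suggested refinements can never lead to an empty result. The record fields are invented for illustration, not taken from the Institute's archive schema:

```python
from collections import Counter

# Toy archive records with invented facets (event type, river, century).
records = [
    {"type": "flood",     "river": "Po",     "century": 19},
    {"type": "landslide", "river": "Tanaro", "century": 20},
    {"type": "flood",     "river": "Tanaro", "century": 20},
]

def refine(records, **selection):
    """Filter by the current selection, then recount every remaining facet."""
    hits = [r for r in records if all(r[k] == v for k, v in selection.items())]
    facets = {k: Counter(r[k] for r in hits)
              for k in records[0] if k not in selection}
    return hits, facets

# Selecting type=flood immediately shows which rivers/centuries remain viable.
hits, facets = refine(records, type="flood")
print(len(hits), "matches; remaining facets:", facets)
```

This recount-on-every-selection loop is what gives users the guided, exploratory refinement, and the confidence of having considered all relevant information, that the abstract contrasts with form-based keyword queries.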
False positives complicate ancient pathogen identifications using high-throughput shotgun sequencing
2014-01-01
Background: Identification of historic pathogens is challenging, since false positives and negatives are a serious risk. Environmental non-pathogenic contaminants are ubiquitous. Furthermore, public genetic databases contain limited information regarding these species. High-throughput sequencing may help reliably detect and identify historic pathogens.
Results: We shotgun-sequenced eight 16th-century Mixtec individuals from the site of Teposcolula Yucundaa (Oaxaca, Mexico) who are reported to have died from the huey cocoliztli ('Great Pestilence' in Nahuatl), an unknown disease that decimated native Mexican populations during the Spanish colonial period, in order to identify the pathogen. Comparison of these sequences with those deriving from the surrounding soil and from four precontact individuals from the site revealed a wide variety of contaminant organisms that confounded analyses. Without the comparative sequence data from the precontact individuals and soil, false positives for Yersinia pestis and rickettsiosis could have been reported.
Conclusions: False positives and negatives remain problematic in ancient DNA analyses despite the application of high-throughput sequencing. Our results suggest that several studies claiming the discovery of ancient pathogens may need further verification. Additionally, true single-molecule sequencing's short read lengths, inability to sequence through DNA lesions, and limited ancient-DNA-specific technical development hinder its application to palaeopathology. PMID:24568097
Jackson, Latifa; Cross, Christopher; Clarke, Cameron
2016-01-01
Objectives: How important is it to be able to reconstruct the lives of a highly diverse, historically recent macroethnic group over the course of 400 years? How many insights into human evolutionary biology and disease susceptibilities could be gained, even with this relatively recent window into the past? In this article, we explore the potential ramifications of a newly constructed dataset of Four Centuries of African American Biological Variation (4Cs).
Methods: This article provides initial lists of digitized variables formatted as SQL tables for the 17th- and 18th-century samples and for the 19th- and 20th-century samples.
Results: This database is dynamic, and new information is added yearly. The database provides novel opportunities for significant insights into the past biological history of this group, and three case-study applications are detailed for comparative computational systems biology studies of (1) hypertension, (2) the oral microbiome, and (3) mental health disorders.
Conclusions: The 4Cs dataset is ideal for interdisciplinary "next generation" science research, and these data represent a unique step toward the accumulation of historically contextualized Big Data on an underrepresented group known to have experienced differential survival over time. Am. J. Hum. Biol. 28:510–513, 2016. © 2016 The Authors. American Journal of Human Biology Published by Wiley Periodicals, Inc. PMID:26749025
Agroclimate.Org: Tools and Information for a Climate Resilient Agriculture in the Southeast USA
NASA Astrophysics Data System (ADS)
Fraisse, C.
2014-12-01
AgroClimate (http://agroclimate.org) is a web-based system developed to help the agricultural industry in the southeastern USA reduce risks associated with climate variability and change. It includes climate-related information and dynamic application tools that interact with a climate and crop database system. Information available includes climate monitoring and forecasts, combined with information about crop management practices that help increase the resiliency of the agricultural industry in the region. Recently we have added smartphone apps to the AgroClimate suite of tools, including irrigation management and crop disease alert systems. Decision support tools available in AgroClimate include: (a) Climate risk: expected (probabilistic) and historical climate information and freeze risk; (b) Crop yield risk: expected yield based on soil type, planting date, and basic management practices for selected commodities, plus historical county yield databases; (c) Crop diseases: disease risk monitoring and forecasting for strawberry and citrus; (d) Crop development: monitoring and forecasting of growing degree-days and chill accumulation; (e) Drought: monitoring and forecasting of selected drought indices; (f) Footprints: carbon and water footprint calculators. The system also provides background information about the main drivers of climate variability and basic information about climate change in the Southeast USA. AgroClimate has been widely used as an educational tool by the Cooperative Extension Services in the region and also by producers. It is now being replicated internationally, with versions implemented in Mozambique and Paraguay.
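As one concrete example of the crop-development tools, growing degree-days are commonly computed as the mean daily temperature above a crop-specific base, floored at zero. This is the standard textbook formula, not necessarily the exact variant AgroClimate implements:

```python
# Standard growing degree-day formula: mean daily temperature above a
# crop-specific base temperature, never negative.
def gdd(t_max: float, t_min: float, t_base: float = 10.0) -> float:
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

print(gdd(28.0, 16.0))  # 12.0 degree-days accumulated for one day
```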
[Historical trauma. Systematic review of a different approach to armed conflict].
Borda Bohigas, Juan Pablo; Carrillo, Juan O; Garzón, Daniel F; Ramírez, María P; Rodríguez, Nicolás
2015-01-01
Historical trauma (HT) is a collective trauma inflicted on a group of people who share an identity or affiliation, and is often characterized by the transgenerational legacy of traumatic experiences and expressed through various psychological and social responses. This construct is proposed in contrast to post-traumatic stress disorder (PTSD), owing to limitations identified in the latter diagnostic category when addressing collective trauma, especially in situations of political and social violence. The purpose of this article is to review the literature published so far on HT. A search was performed using the terms "historical trauma" and "mental health" or "trauma histórico" and "salud mental" in the scientific databases EMBASE, Ebscohost, JSTOR, ProQuest, LILACS, SciELO, PsycARTICLES, ISI Web of Science and PubMed. The authors reviewed the definition of HT, the paramount characteristics of its traumatic experience, and several theories on the transgenerational transmission of these experiences, as well as the possible consequences of traumatic events at the individual, family and social levels. Common characteristics of different therapeutic models are highlighted, in addition to some recommendations for their application. PTSD has clear limitations in addressing community and cumulative traumatic experiences related to specific social and historical contexts. The authors discuss the potential utility of HT in this task. Finally, several gaps in current knowledge regarding this construct are mentioned, and some recommendations for future research are indicated. Copyright © 2014 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
Sherba, Jason T.; Sleeter, Benjamin M.; Davis, Adam W.; Parker, Owen P.
2015-01-01
Global land-use/land-cover (LULC) change projections and historical datasets are typically available at coarse grid resolutions and are often incompatible with modeling applications at local to regional scales. The difficulty of downscaling and reapportioning global gridded LULC change projections to regional boundaries is a barrier to the use of these datasets in a state-and-transition simulation model (STSM) framework. Here we compare three downscaling techniques to transform gridded LULC transitions into spatial scales and thematic LULC classes appropriate for use in a regional STSM. For each downscaling approach, Intergovernmental Panel on Climate Change (IPCC) Representative Concentration Pathway (RCP) LULC projections, at 0.5° × 0.5° cell resolution, were downscaled to seven Level III ecoregions in the Pacific Northwest, United States. RCP transition values at each cell were downscaled based on the proportional distribution between ecoregions of (1) cell area, (2) land-cover composition derived from remotely sensed imagery, and (3) historical LULC transition values from a LULC history database. The resulting downscaled LULC transition values were aggregated according to their bounding ecoregion and "cross-walked" to the relevant LULC classes. Ecoregion-level LULC transition values were applied in a STSM projecting LULC change between 2005 and 2100. While each downscaling method had advantages and disadvantages, downscaling using the historical land-use history dataset consistently apportioned RCP LULC transitions in agreement with historical observations. Regardless of the downscaling method, some LULC projections remain improbable and require further investigation.
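The proportional reapportionment described above reduces, per coarse cell, to splitting each transition value among overlapping ecoregions by a weight. A minimal sketch with invented numbers, where the weight could come from any of the three sources compared in the paper (cell-area share, land-cover share, or historical-transition share):

```python
# One coarse cell's transition area (km^2), e.g., forest-to-agriculture,
# split among the ecoregions that overlap the cell. Values are illustrative.
cell_transition_area = 120.0
weights = {"EcoregionA": 0.25, "EcoregionB": 0.75}  # must sum to 1 within the cell

downscaled = {eco: w * cell_transition_area for eco, w in weights.items()}
print(downscaled)  # {'EcoregionA': 30.0, 'EcoregionB': 90.0}
```

Summing these per-cell shares over all cells touching an ecoregion gives the ecoregion-level transition totals that feed the STSM.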
NASA Astrophysics Data System (ADS)
Castagnetti, C.; Dubbini, M.; Ricci, P. C.; Rivola, R.; Giannini, M.; Capra, A.
2017-05-01
The new era of design in architecture and civil engineering applications lies in the Building Information Modeling (BIM) approach, based on a 3D geometric model that includes a 3D database. This is easier for new constructions, whereas, when dealing with existing buildings, the creation of the BIM depends on accurate knowledge of the as-built construction. Such knowledge is provided by a 3D survey, often carried out with laser scanning technology or modern photogrammetry, which can guarantee an adequate point cloud in terms of resolution and completeness while balancing time consumption and cost against the required final accuracy. The BIM approach for existing buildings, and even more so for historical buildings, is not yet a well-known and deeply discussed process. There are still several choices to be addressed in the process from the survey to the model, and critical issues to be discussed in the modeling step, particularly when dealing with unconventional elements such as deformed geometries or historical elements. The paper describes a comprehensive workflow that goes through the survey and the modeling, focusing on the critical issues and key points for obtaining a reliable BIM of an existing monument. The case study employed to illustrate the workflow is the Basilica of St. Stefano in Bologna (Italy), a large monumental complex of great religious, historical and architectural value.
Historical reconstructions of California wildfires vary by data source
Syphard, Alexandra D.; Keeley, Jon E.
2016-01-01
Historical data are essential for understanding how fire activity responds to different drivers. It is important that the source of data is commensurate with the spatial and temporal scale of the question addressed, but fire history databases are derived from different sources with different restrictions. In California, a frequently used fire history dataset is the State of California Fire and Resource Assessment Program (FRAP) fire history database, which circumscribes fire perimeters at a relatively fine scale. It includes large fires on both state and federal lands but only covers fires that were mapped or had other spatially explicit data. A different database is the state and federal governments’ annual reports of all fires. They are more complete than the FRAP database but are only spatially explicit to the level of county (California Department of Forestry and Fire Protection – Cal Fire) or forest (United States Forest Service – USFS). We found substantial differences between the FRAP database and the annual summaries, with the largest and most consistent discrepancy being in fire frequency. The FRAP database missed the majority of fires and is thus a poor indicator of fire frequency or indicators of ignition sources. The FRAP database is also deficient in area burned, especially before 1950. Even in contemporary records, the huge number of smaller fires not included in the FRAP database account for substantial cumulative differences in area burned. Wildfires in California account for nearly half of the western United States fire suppression budget. Therefore, the conclusions about data discrepancies and the implications for fire research are of broad importance.
Digitized Database of Old Seismograms Recorded in Romania
NASA Astrophysics Data System (ADS)
Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea
2016-08-01
The aim of this paper is to describe a management system for a unique Romanian database of historical seismograms and complementary documentation (metadata), together with its dissemination and analysis procedure. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to recover the historical instrumental data, the analog seismograms are converted to digital images and digital waveforms (digitization/vectorization). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In the next step, the high-resolution scanned seismograms will be processed to obtain the digital waveforms. We used a Colortrac SmartLF Cx40 scanner, which provides images in TIFF or JPG format. For digitization, the Teseo2 algorithm, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS Project, will be used.
History-Enriched Spaces for Shared Encounters
NASA Astrophysics Data System (ADS)
Konomi, Shin'ichi; Sezaki, Kaoru; Kitsuregawa, Masaru
We discuss "history-enriched spaces" that use historical data to support shared encounters. We first examine our experiences with DeaiExplorer, a social network display that uses RFID and a historical database to support social interactions at academic conferences. This leads to our discussions on three complementary approaches to addressing the issues of supporting social encounters: (1) embedding historical data in embodied interactions, (2) designing for weakly involved interactions such as social navigation, and (3) designing for privacy. Finally, we briefly describe a preliminary prototype of a proxemics-based awareness tool that considers these approaches.
Giffen, Sarah E.
2002-01-01
An environmental database was developed to store water-quality data collected during the 1999 U.S. Geological Survey investigation of the occurrence and distribution of dioxins, furans, and PCBs in the riverbed sediment and fish tissue in the Penobscot River in Maine. The database can be used to store a wide range of detailed information and to perform complex queries on the data it contains. The database also could be used to store data from other historical and any future environmental studies conducted on the Penobscot River and surrounding regions.
THE ART OF DATA MINING THE MINEFIELDS OF TOXICITY ...
Toxicity databases have a special role in predictive toxicology, providing ready access to historical information throughout the workflow of discovery, development, and product safety processes in drug development, as well as in review by regulatory agencies. To provide accurate information within a hypothesis-building environment, the content of the databases needs to be rigorously modeled using standards and controlled vocabulary. The utilitarian purposes of databases vary widely, ranging from a source of (Q)SAR datasets for modelers to a basis for
Databases Are Not Toasters: A Framework for Comparing Data Warehouse Appliances
NASA Astrophysics Data System (ADS)
Trajman, Omer; Crolotte, Alain; Steinhoff, David; Nambiar, Raghunath Othayoth; Poess, Meikel
The success of Business Intelligence (BI) applications depends on two factors, the ability to analyze data ever more quickly and the ability to handle ever increasing volumes of data. Data Warehouse (DW) and Data Mart (DM) installations that support BI applications have historically been built using traditional architectures either designed from the ground up or based on customized reference system designs. The advent of Data Warehouse Appliances (DA) brings packaged software and hardware solutions that address performance and scalability requirements for certain market segments. The differences between DAs and custom installations make direct comparisons between them impractical and suggest the need for a targeted DA benchmark. In this paper we review data warehouse appliances by surveying thirteen products offered today. We assess the common characteristics among them and propose a classification for DA offerings. We hope our results will help define a useful benchmark for DAs.
Generation of Fine Scale Wind and Wave Climatologies
NASA Astrophysics Data System (ADS)
Vandenberghe, F. C.; Filipot, J.; Mouche, A.
2013-12-01
A tool to generate 'on demand' large databases of atmospheric parameters at high resolution has been developed for defense applications. The approach takes advantage of the zooming and relocation capabilities of the embedded domains found in regional models such as the community Weather Research and Forecasting (WRF) model. The WRF model is applied to dynamically downscale NNRP, CFSR and ERA40 global analyses and to generate long records, up to 30 years, of hourly gridded data over 200-km² domains at 3-km grid increment. To ensure accuracy, observational data from the NCAR ADP historical database are used in combination with Four-Dimensional Data Assimilation (FDDA) techniques to constantly nudge the model analysis toward observations. The atmospheric model is coupled to secondary applications such as NOAA's WAVEWATCH III model and the Navy's APM electromagnetic propagation model, allowing the creation of high-resolution climatologies of surface winds, waves and electromagnetic propagation parameters. The system was applied at several coastal locations of the Mediterranean Sea where SAR wind and wave observations were available during the entire year of 2008. Statistical comparisons between the model output and SAR observations are presented. Issues related to the global input data and the model drift, as well as the impact of wind biases on wave simulations, are discussed.
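For context, FDDA nudging of this kind is usually written as a Newtonian relaxation term added to the model tendencies; this is the generic textbook form, not the specific coefficients or weighting used in these runs:

```latex
\frac{\partial \alpha}{\partial t}
  = F(\alpha, \mathbf{x}, t)
  + G_{\alpha}\, W(\mathbf{x}, t)\,\bigl(\hat{\alpha}_{\mathrm{obs}} - \alpha\bigr)
```

where \(\alpha\) is the nudged model variable, \(F\) its physical tendencies, \(G_{\alpha}\) the nudging coefficient, \(W\) a space-time weighting function, and \(\hat{\alpha}_{\mathrm{obs}}\) the observation-based value toward which the model is relaxed.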
Integrating GIS, Archeology, and the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sera White; Brenda Ringe Pace; Randy Lee
2004-08-01
At the Idaho National Engineering and Environmental Laboratory's (INEEL) Cultural Resource Management Office, a newly developed Data Management Tool (DMT) is improving management and long-term stewardship of cultural resources. The fully integrated system links an archaeological database, a historical database, and a research database to spatial data through a customized user interface using ArcIMS and Active Server Pages. Components of the new DMT are tailored specifically to the INEEL and include automated data entry forms for historic and prehistoric archaeological sites, specialized queries and reports that address both yearly and project-specific documentation requirements, and unique field recording forms. The predictive modeling component increases the DMT's value for land use planning and long-term stewardship. The DMT enhances the efficiency of archive searches, improving customer service, oversight, and management of the large INEEL cultural resource inventory. In the future, the DMT will facilitate data sharing with regulatory agencies, tribal organizations, and the general public.
77 FR 52757 - Proposed Information Collection; Historic Preservation Certification Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
... an applicant to receive substantial Federal tax incentives authorized by Section 47 of the Internal Revenue Code. These incentives include 20% Federal income tax credit for the rehabilitation of historic... owners of historic properties for Federal tax benefits: (a) The historic character of the property, and...
A Semantic Sensor Web for Environmental Decision Support Applications
Gray, Alasdair J. G.; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A. A.; Paton, Norman W.; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción
2011-01-01
Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England. PMID:22164110
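A toy sketch of such a semantic layer in Python with rdflib: sensor readings described in RDF and retrieved with a SPARQL query. The vocabulary URIs are made up for illustration; the paper's actual ontologies and service endpoints are not reproduced here:

```python
from rdflib import Graph

# Describe one sensor's reading in RDF (Turtle); the ex: vocabulary is invented.
g = Graph()
g.parse(data="""
@prefix ex: <http://example.org/sensors#> .
ex:buoy42 ex:observes ex:WaveHeight ;
          ex:hasReading 2.7 ;
          ex:locatedNear "South coast of England" .
""", format="turtle")

# Integrated views over heterogeneous sources reduce to queries like this one.
results = g.query("""
    PREFIX ex: <http://example.org/sensors#>
    SELECT ?sensor ?value WHERE {
        ?sensor ex:observes ex:WaveHeight ; ex:hasReading ?value .
    }
""")
for sensor, value in results:
    print(sensor, value)
```

The point of the semantic encoding is that readings from different networks, databases, and map layers can all be discovered and joined through shared vocabulary terms rather than bespoke per-source code.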
NASA Technical Reports Server (NTRS)
Mortlock, Alan; VanAlstyne, Richard
1998-01-01
The report describes the development of databases estimating aircraft engine exhaust emissions for the years 1976 and 1984 from global operations of military, charter, historic Soviet and Chinese, unreported domestic, and General Aviation (GA) traffic. These databases were developed under the National Aeronautics and Space Administration's (NASA) Advanced Subsonic Assessment (AST). McDonnell Douglas Corporation (MDC), now part of The Boeing Company, had previously estimated engine exhaust emission databases for the baseline year of 1992 and a 2015 forecast-year scenario. Since their original creation (Ward, 1994 and Metwally, 1995), revised technology algorithms have been developed. Additionally, GA databases have been created, and all past MDC emission inventories have been updated to reflect the new technology algorithms. Revised data (Baughcum, 1996 and Baughcum, 1997) for the scheduled inventories have been used in this report to provide a comparison of the total aviation emission forecasts from the various components. Global results for two historic years (1976 and 1984), a baseline year (1992) and a forecast year (2015) are presented. Since engine emissions are directly related to fuel usage, an overview of individual aviation annual global fuel use for each inventory component is also given in this report.
Lythrum salicaria L.-Underestimated medicinal plant from European traditional medicine. A review.
Piwowarski, Jakub P; Granica, Sebastian; Kiss, Anna K
2015-07-21
Purple loosestrife (Lythrum salicaria L.) is a herbaceous perennial plant belonging to the Lythraceae family. It has been used for centuries in European traditional medicine. Although Lythri herba is a pharmacopoeial plant material (Ph. Eur.), the popularity of L. salicaria as a medicinal plant has recently declined. The aim of the paper is to recall the traditional and historical use of L. salicaria and juxtapose it with a comprehensive view of the current knowledge about its chemical composition and documented biological activities, in order to bring back interest in this valuable plant and indicate reasonable directions for future research and possible applications. A systematic survey of the historical and ethnopharmacological literature was carried out using the holdings of European and American libraries. Pharmacological and phytochemical literature research was performed using the Scopus, PubMed, Web of Science and Reaxys databases. The review of historical sources from ancient times until the 20th century revealed an outstanding position of L. salicaria in traditional medicine. The main applications indicated were gastrointestinal tract ailments (mainly dysentery and diarrhea) as well as various skin and mucosa afflictions. Current phytochemical studies have shown that polyphenols (C-glucosidic ellagitannins and C-glucosidic flavonoids) as well as heteropolysaccharides are the dominant constituents, which probably determine the observed pharmacological effects. The extracts and some isolated compounds were shown to possess antidiarrheal, antimicrobial, antioxidant, anti-inflammatory and antidiabetic activities. The literature overview conclusively demonstrates that L. salicaria used to be considered an exceptionally effective remedy in European traditional medicine. Despite its unquestionably important position, for unknown reasons its popularity has waned during the past few decades, and contemporary pharmacological research is still insufficient to support its thoroughly described traditional uses. The necessity of complex studies on modes of action, referring directly to L. salicaria's main traditional application, gastrointestinal tract ailments, is strongly underlined. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Global Ocean Currents Database
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.
2016-02-01
NOAA's National Centers for Environmental Information has released an ocean currents database portal that aims (1) to integrate global ocean current observations from a variety of instruments, with different resolution, accuracy and response to spatial and temporal variability, into a uniform Network Common Data Form (NetCDF) format and (2) to provide dedicated online data discovery and access to NCEI-hosted and distributed data sources for ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and by spatial/temporal range of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), so Global Ocean Currents Database (GOCD) users can work with the data files in their favorite analysis and visualization client software without downloading them to a local machine. The potential users of the ocean currents database include, but are not limited to: (1) ocean modelers, for model skill assessments; (2) scientists and researchers studying the impact of ocean circulation on climate variability; (3) the ocean shipping industry, for safe navigation and finding optimal routes for ship fuel efficiency; (4) ocean resource managers, when planning optimal sites for waste and sewage dumping and for renewable hydrokinetic energy; and (5) state and federal governments, to provide historical (analyzed) ocean circulation as an aid for search and rescue.
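Because the holdings are NetCDF served through OPeNDAP/THREDDS, a client can open a remote dataset lazily and subset it before any bulk transfer. A sketch using xarray, with a placeholder URL and invented variable names rather than a real GOCD endpoint:

```python
import xarray as xr

# Placeholder OPeNDAP URL and variable names, for illustration only.
url = "https://example.org/thredds/dodsC/gocd/sample_currents.nc"
ds = xr.open_dataset(url)  # lazy open: no data transferred yet

# Pull only the near-surface current speed for a small region of interest;
# only this subset is fetched from the server.
subset = (ds["current_speed"]
          .sel(depth=0, method="nearest")
          .sel(lat=slice(20, 30), lon=slice(-80, -70)))
print(subset.mean().values)
```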
Development and Validation of National Phenology Data Products
NASA Astrophysics Data System (ADS)
Weltzin, J. F.; Rosemartin, A.; Crimmins, T. M.; Gerst, K.
2015-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database (NPDb) maintained by USA-NPN contains almost 6 million in-situ observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation and natural resource management applications, including national assessments of historical and potential future trends in phenology and regional assessments of spatio-temporal variation in organismal activity. Customizable downloads of raw or summarized data, freely available from www.usanpn.org, are accompanied by metadata, data-use and data-attribution policies, published protocols, version/change control, documentation of QA/QC, and links to publications that use historical or contemporary data held in the NPDb. The National Coordinating Office of USA-NPN is developing a suite of standard data products (e.g., quality-controlled raw or summarized status data) and tools (e.g., a new visualization tool released in 2015) to facilitate use and application by a diverse set of data users. This presentation outlines a workflow for the development and validation of spatially gridded phenology products, drawing on recent work related to the Spring Indices now included in two national Indicator systems. In addition, we discuss how we engage observers to collect in-situ data to validate model predictions. Preliminary analyses indicate high fidelity between historical in-situ and modeled observations on a national scale, but with considerable variability at the regional scale. Regions with strong differences between expected and observed data are identified and will be the focus of in-situ data collection campaigns using USA-NPN's Nature's Notebook on-line user interface (www.nn.usanpn.org).
A descriptive and historical review of bibliometrics with applications to medical sciences.
Thompson, Dennis F; Walker, Cheri K
2015-06-01
The discipline of bibliometrics involves the application of mathematical and statistical methods to scholarly publications. The first attempts at systematic data collection were provided by Alfred Lotka and Samuel Bradford, who subsequently established the foundational laws of bibliometrics. Eugene Garfield ushered in the modern era of bibliometrics with the routine use of citation analysis and systematized processing. Key elements of bibliometric analysis include database coverage, consistency and accuracy of the data, data fields, search options, and analysis and use of metrics. A number of bibliometric applications are currently being used in medical science and health care. Bibliometric parameters and indexes may be increasingly used by grant funding sources as measures of research success. Universities may build benchmarking standards from bibliometric data to determine academic achievement through promotion and tenure guidelines in the future. This article reviews the history, definition, laws, and elements of bibliometric principles and provides examples of bibliometric applications to the broader health care community. To accomplish this, the Medline (1966-2014) and Web of Science (1945-2014) databases were searched to identify relevant articles; select articles were also cross-referenced. Articles selected were those that provided background, history, descriptive analysis, and application of bibliometric principles and metrics to medical science and health care. No attempt was made to cover all areas exhaustively; rather, key articles were chosen that illustrate bibliometric concepts and enhance the reader's knowledge. It is important that faculty and researchers understand the limitations and appropriate uses of bibliometric data. Bibliometrics has considerable potential as a research area for health care scientists and practitioners that can be used to discover new information about academic trends, pharmacotherapy, disease, and broader health sciences trends. © 2015 Pharmacotherapy Publications, Inc.
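As a reminder of what those foundational laws state, Lotka's law in its common form says the number of authors making \(n\) contributions to a field falls off roughly as an inverse square:

```latex
y(n) = \frac{C}{n^{a}}, \qquad a \approx 2
```

so authors with two publications are about one quarter as numerous as single-paper authors; Bradford's law plays the analogous role for the scatter of articles across journals.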
Statistical Downscaling in Multi-dimensional Wave Climate Forecast
NASA Astrophysics Data System (ADS)
Camus, P.; Méndez, F. J.; Medina, R.; Losada, I. J.; Cofiño, A. S.; Gutiérrez, J. M.
2009-04-01
Wave climate at a particular site is defined by the statistical distribution of sea state parameters, such as significant wave height, mean wave period, mean wave direction, wind velocity, wind direction and storm surge. Nowadays, long-term time series of these parameters are available from reanalysis databases obtained by numerical models. The Self-Organizing Map (SOM) technique is applied to characterize the multi-dimensional wave climate, obtaining the relevant "wave types" spanning the historical variability. This technique summarizes the multiple dimensions of wave climate as a set of clusters projected onto a spatially organized low-dimensional lattice, providing Probability Density Functions (PDFs) on the lattice. On the other hand, wind and storm surge depend on the instantaneous local large-scale sea level pressure (SLP) fields, while waves depend on the recent history of these fields (say, 1 to 5 days). Thus, these variables are associated with large-scale atmospheric circulation patterns. In this work, a nearest-neighbors analog method is used to predict monthly multi-dimensional wave climate. This method establishes relationships between large-scale atmospheric circulation patterns from numerical models (SLP fields as predictors) and local wave databases of observations (monthly wave climate SOM PDFs as the predictand) to set up statistical models. A wave reanalysis database, developed by Puertos del Estado (Ministerio de Fomento), is used as the historical time series of local variables. The simultaneous SLP fields calculated by the NCEP atmospheric reanalysis are used as predictors. Several configurations with different sea level pressure grid sizes and different temporal resolutions are compared to obtain the statistical model that best represents the monthly wave climate at a particular site. In this work we examine the potential skill of this downscaling approach under perfect-model conditions, but we also analyze the suitability of this methodology for seasonal forecasting and for long-term climate change scenario projections of wave climate.
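A minimal sketch of the nearest-neighbors analog step, under the simplifying assumptions of a Euclidean distance on flattened SLP fields and a k-analog mean; this is our illustration, not the authors' implementation, and all array shapes are invented.

```python
# Sketch of an analog forecast: find the historical SLP fields closest to a
# new field and average the wave-climate descriptors attached to them.
import numpy as np

def analog_forecast(slp_new, slp_history, wave_history, k=5):
    """Predict wave-climate descriptors as the mean over the k closest analogs."""
    dist = np.linalg.norm(slp_history - slp_new, axis=1)
    nearest = np.argsort(dist)[:k]
    return wave_history[nearest].mean(axis=0)

rng = np.random.default_rng(0)
slp_hist = rng.normal(size=(240, 100))   # 20 years of monthly SLP fields, flattened
wave_hist = rng.random(size=(240, 36))   # e.g., PDFs over a 6x6 SOM lattice
prediction = analog_forecast(slp_hist[0] + 0.1 * rng.normal(size=100),
                             slp_hist, wave_hist)
```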
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska's Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska's statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska's strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies' cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 128 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Tonsina area in the Chugach Mountains, Valdez quadrangle, Alaska (fig. 1). The USGS was responsible for the work from sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado, through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Updated Palaeotsunami Database for Aotearoa/New Zealand
NASA Astrophysics Data System (ADS)
Gadsby, M. R.; Goff, J. R.; King, D. N.; Robbins, J.; Duesing, U.; Franz, T.; Borrero, J. C.; Watkins, A.
2016-12-01
The updated configuration, design, and implementation of a national palaeotsunami (prehistoric tsunami) database for Aotearoa/New Zealand (A/NZ) is near completion. This tool enables correlation of events along different stretches of the NZ coastline, provides information on the frequency and extent of local, regional and distant-source tsunamis, and delivers detailed information on the science and proxies used to identify the deposits. In A/NZ a plethora of data, scientific research and experience surrounds palaeotsunami deposits, but much of this information has been difficult to locate, has had variable reporting standards, and has lacked quality assurance. The original database was created by Professor James Goff while working at the National Institute of Water & Atmospheric Research in A/NZ, and has subsequently been updated during his tenure at the University of New South Wales. The updating and establishment of the national database was funded by the Ministry of Civil Defence and Emergency Management (MCDEM), led by Environment Canterbury Regional Council, and supported by all 16 regions of A/NZ's local government. Creation of a single database has consolidated a wide range of published and unpublished research contributions from many science providers on palaeotsunamis in A/NZ. The information is now easily accessible and quality assured, and allows examination of the frequency, extent and correlation of events. This provides authoritative scientific support for coastal-marine planning and risk management. The database will complement the GNS New Zealand Historical Database, and contributes to a heightened public awareness of tsunami by being a "one-stop shop" for information on past tsunami impacts. There is scope for this to become an international database, enabling the Pacific-wide correlation of large events as well as the identification of smaller regional ones. The Australian research community has already expressed an interest, and the database is also compatible with a similar one currently under development in Japan. Expressions of interest in collaborating with the A/NZ team to expand the database are invited from other Pacific nations.
Detection and measurement of total ozone from stellar spectra: Paper 2. Historic data from 1935-1942
NASA Astrophysics Data System (ADS)
Griffin, R. E. M.
2005-10-01
Atmospheric ozone columns are derived from historic stellar spectra observed between 1935 and 1942 at Mount Wilson Observatory, California. Comparisons with contemporary measurements in the Arosa database show a generally close correspondence. The results of the analysis indicate that astronomy's archives hold considerable potential for investigating the natural levels of ozone and its variability during the decades prior to anthropogenic interference.
Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App
NASA Astrophysics Data System (ADS)
Nurnawati, E. K.; Ermawati, E.
2018-02-01
An integration database is a database that acts as the data store for multiple applications and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all of its client applications into account. The benefit of such a schema is that sharing data among applications does not require an extra layer of integration services on the applications: any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based platform built around a smart city system. The database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes that the various applications to be built can share. The method used in this study is to choose an appropriate logical database structure (patterns of data) and to build a relational database model. The resulting design was tested with prototype apps, and system performance was analyzed with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help admins, managers, and operators manage the application easily and efficiently. The Android-based app is built on a dynamic client-server architecture in which data are extracted from an external MySQL database, so if data change in the database, the data in the Android applications also change. The app assists users in searching for information related to Yogyakarta (as a smart city), especially on culture, government, hotels, and transportation.
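A minimal sketch of the integration-database idea described above: one shared relational schema that several client applications read and write, so a committed change is immediately visible to all of them. The table and column names are illustrative assumptions, not the paper's schema.

```python
# Sketch: a single schema shared by multiple client apps via one database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE place (
    id       INTEGER PRIMARY KEY,
    name     TEXT NOT NULL,
    category TEXT NOT NULL,          -- 'culture', 'government', 'hotel', ...
    lat REAL, lon REAL
);
CREATE TABLE review (
    id       INTEGER PRIMARY KEY,
    place_id INTEGER NOT NULL REFERENCES place(id),
    rating   INTEGER CHECK (rating BETWEEN 1 AND 5)
);
""")

# A write committed by one application...
conn.execute("INSERT INTO place (name, category, lat, lon) VALUES (?,?,?,?)",
             ("Kraton Yogyakarta", "culture", -7.805, 110.364))
conn.commit()

# ...is immediately visible to any other application using the same database.
print(conn.execute("SELECT name FROM place WHERE category='culture'").fetchall())
```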
Fuller, Pamela L.; Neilson, Matthew E.
2015-01-01
The U.S. Geological Survey’s Nonindigenous Aquatic Species (NAS) Database has tracked introductions of freshwater aquatic organisms in the United States for the past four decades. A website provides access to occurrence reports, distribution maps, and fact sheets for more than 1,000 species. The site also includes an on-line reporting system and an alert system for new occurrences. We provide an historical overview of the database, a description of its current capabilities and functionality, and a basic characterization of the data contained within the database.
Conjunctive patches subspace learning with side information for collaborative image retrieval.
Zhang, Lining; Wang, Lipo; Lin, Weisi
2012-08-01
Content-Based Image Retrieval (CBIR) has attracted substantial attention during the past few years for its potential practical applications to image management. A variety of Relevance Feedback (RF) schemes have been designed to bridge the semantic gap between the low-level visual features and the high-level semantic concepts in an image retrieval task. Various Collaborative Image Retrieval (CIR) schemes aim to utilize the user historical feedback log data, with similar and dissimilar pairwise constraints, to improve the performance of a CBIR system. However, existing subspace learning approaches with explicit label information cannot be applied to a CIR task, although subspace learning techniques play a key role in various computer vision tasks, e.g., face recognition and image classification. In this paper, we propose a novel subspace learning framework, i.e., Conjunctive Patches Subspace Learning (CPSL) with side information, for learning an effective semantic subspace by exploiting the user historical feedback log data for a CIR task. The CPSL can effectively integrate the discriminative information of labeled log images, the geometrical information of labeled log images and the weakly similar information of unlabeled images to learn a reliable subspace. We formally formulate this problem as a constrained optimization problem and then present a new subspace learning technique to exploit the user historical feedback log data. Extensive experiments on both synthetic data sets and a real-world image database demonstrate the effectiveness of the proposed scheme in improving the performance of a CBIR system by exploiting the user historical feedback log data.
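The abstract does not give CPSL's equations, so the following is only a generic sketch of subspace learning from similar/dissimilar pairwise side information, via scatter matrices of constrained pairs and a generalized eigenproblem. It is in the spirit of, but not identical to, the paper's method; all names and parameters are illustrative.

```python
# Sketch: learn a projection that spreads dissimilar pairs while keeping
# similar pairs close, as a generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

def pair_scatter(X, pairs):
    """Scatter matrix of difference vectors over index pairs."""
    D = np.array([X[i] - X[j] for i, j in pairs])
    return D.T @ D / len(pairs)

def learn_subspace(X, similar, dissimilar, dim=10, reg=1e-3):
    S_sim = pair_scatter(X, similar) + reg * np.eye(X.shape[1])
    S_dis = pair_scatter(X, dissimilar)
    # Maximize dissimilar-pair scatter relative to similar-pair scatter.
    vals, vecs = eigh(S_dis, S_sim)
    return vecs[:, np.argsort(vals)[::-1][:dim]]   # top generalized eigenvectors

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 64))                     # low-level image features
W = learn_subspace(X, similar=[(0, 1), (2, 3)],
                   dissimilar=[(0, 2), (1, 3)], dim=10)
X_proj = X @ W                                     # features in the learned subspace
```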
Code of Federal Regulations, 2010 CFR
2010-01-01
... Enrichment Services System, which is the database that tracks uranium enrichment services transactions of the... invoicing and historical tracking of SWU deliveries. Use and burnup charges mean lease charges for the...
[Historical, social and cultural aspects of the deaf population].
Duarte, Soraya Bianca Reis; Chaveiro, Neuma; Freitas, Adriana Ribeiro de; Barbosa, Maria Alves; Porto, Celmo Celeno; Fleck, Marcelo Pio de Almeida
2013-10-01
This work recovers, contextualizes and characterizes the social, historical and cultural aspects of the deaf community that uses the Brazilian Sign Language, focusing on the social and anthropological model. The scope of this study was to conduct a bibliographical review of scientific textbooks and articles available in the Virtual Health Library, irrespective of the date of publication. A total of 102 articles and 53 books were located; 33 textbooks and 26 articles (four from the Lilacs database and 22 from the Medline database) constituted the sample. Today, in contrast with the past, there are laws that guarantee the right to communication and attendance by means of the Brazilian Sign Language. The repercussion, acceptance and inclusion in health policies of the decrees enshrined in Brazilian laws are a major priority.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tynan, Mark C.; Russell, Glenn P.; Perry, Frank V.
These associated tables, references, notes, and report present a synthesis of some notable geotechnical and engineering information used to create four interactive layer maps for selected: 1) deep mines and shafts; 2) existing, considered or planned radioactive waste management deep underground studies or disposal facilities; 3) deep large diameter boreholes, and 4) physics underground laboratories and facilities from around the world. These data are intended to facilitate user access to basic information and references regarding "deep underground" facilities, history, activities, and plans. In general, the interactive maps and database provide each facility's approximate site location, geology, and engineered features (e.g.: access, geometry, depth, diameter, year of operations, groundwater, lithology, host unit name and age, basin; operator, management organization, geographic data, nearby cultural features, other). Although the survey is not comprehensive, it is representative of many of the significant existing and historical underground facilities discussed in the literature addressing radioactive waste management and deep mined geologic disposal safety systems. The global survey is intended to support and to inform: 1) interested parties and decision makers; 2) radioactive waste disposal and siting option evaluations, and 3) safety case development applicable to any mined geologic disposal facility as a demonstration of historical and current engineering and geotechnical capabilities available for use in deep underground facility siting, planning, construction, operations and monitoring.
Jackson, Fatimah; Jackson, Latifa; Cross, Christopher; Clarke, Cameron
2016-07-01
How important is it to be able to reconstruct the lives of a highly diverse, historically recent macroethnic group over the course of 400 years? How many insights into human evolutionary biology and disease susceptibilities could be gained, even with this relatively recent window into the past? In this article, we explore the potential ramifications of a newly constructed dataset of Four Centuries of African American Biological Variation (4Cs). This article provides initial lists of digitized variables formatted as SQL tables for the 17th and 18th century samples and for the 19th and 20th century samples. This database is dynamic and new information is added yearly. The database provides novel opportunities for significant insights into the past biological history of this group and three case study applications are detailed for comparative computational systems biology studies of (1) hypertension, (2) the oral microbiome, and (3) mental health disorders. The 4Cs dataset is ideal for interdisciplinary "next generation" science research and these data represent a unique step toward the accumulation of historically contextualized Big Data on an underrepresented group known to have experienced differential survival over time. Am. J. Hum. Biol. 28:510-513, 2016. © 2016 The Authors. American Journal of Human Biology published by Wiley Periodicals, Inc.
An archival examination of environment and disease in eastern Africa in recent history
NASA Astrophysics Data System (ADS)
Larsen, L.
2012-04-01
In order to better understand present interactions between climate and infectious disease incidence it is important to examine the history of disease outbreaks and burdens, and their likely links with the environment. This paper will present research that is currently being undertaken on the identification and mapping of historic incidences of malaria, schistosomiasis and Rift Valley fever (RVF) in eastern Africa in relation to possible environmental, social, economic and political contributing factors. The research covers the past one hundred years or so and primarily draws on a range of archival documentary sources located in the region and the former imperial centres. The paper will discuss the methodologies employed in the building of a comprehensive historical database. The research is part of a larger EU FP7-funded project which aims to map, examine and anticipate the future risks of the three diseases in eastern Africa in response to environmental change. The paper will outline how the construction of such a historic database allows the contextualization of current climate-disease relationships and can thus contribute to discussions on the effects of changing climate on future disease trends.
36 CFR 801.3 - Applicant responsibilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.3 Applicant... expeditiously meeting its historic preservation requirements and facilitate the development of the Council's...) Consulting the National Register of Historic Places to determine whether the project's impact area includes...
36 CFR 801.3 - Applicant responsibilities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.3 Applicant... expeditiously meeting its historic preservation requirements and facilitate the development of the Council's...) Consulting the National Register of Historic Places to determine whether the project's impact area includes...
36 CFR 801.3 - Applicant responsibilities.
Code of Federal Regulations, 2011 CFR
2011-07-01
... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.3 Applicant... expeditiously meeting its historic preservation requirements and facilitate the development of the Council's...) Consulting the National Register of Historic Places to determine whether the project's impact area includes...
36 CFR 801.3 - Applicant responsibilities.
Code of Federal Regulations, 2012 CFR
2012-07-01
... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.3 Applicant... expeditiously meeting its historic preservation requirements and facilitate the development of the Council's...) Consulting the National Register of Historic Places to determine whether the project's impact area includes...
Kittel, T.G.F.; Rosenbloom, N.A.; Royle, J. Andrew; Daly, Christopher; Gibson, W.P.; Fisher, H.H.; Thornton, P.; Yates, D.N.; Aulenbach, S.; Kaufman, C.; McKeown, R.; Bachelet, D.; Schimel, D.S.; Neilson, R.; Lenihan, J.; Drapek, R.; Ojima, D.S.; Parton, W.J.; Melillo, J.M.; Kicklighter, D.W.; Tian, H.; McGuire, A.D.; Sykes, M.T.; Smith, B.; Cowling, S.; Hickler, T.; Prentice, I.C.; Running, S.; Hibbard, K.A.; Post, W.M.; King, A.W.; Smith, T.; Rizzo, B.; Woodward, F.I.
2004-01-01
Analysis and simulation of biospheric responses to historical forcing require surface climate data that capture those aspects of climate that control ecological processes, including key spatial gradients and modes of temporal variability. We developed a multivariate, gridded historical climate dataset for the conterminous USA as a common input database for the Vegetation/Ecosystem Modeling and Analysis Project (VEMAP), a biogeochemical and dynamic vegetation model intercomparison. The dataset covers the period 1895-1993 on a 0.5° latitude/longitude grid. Climate is represented at both monthly and daily timesteps. Variables are: precipitation, minimum and maximum temperature, total incident solar radiation, daylight-period irradiance, vapor pressure, and daylight-period relative humidity. The dataset was derived from US Historical Climate Network (HCN), cooperative network, and snowpack telemetry (SNOTEL) monthly precipitation and mean minimum and maximum temperature station data. We employed techniques that rely on geostatistical and physical relationships to create the temporally and spatially complete dataset. We developed a local kriging prediction model to infill discontinuous and limited-length station records based on the spatial autocorrelation structure of climate anomalies. A spatial interpolation model (PRISM) that accounts for physiographic controls was used to grid the infilled monthly station data. We implemented a stochastic weather generator (modified WGEN) to disaggregate the gridded monthly series to dailies. Radiation and humidity variables were estimated from the dailies using a physically-based empirical surface climate model (MTCLIM3). Derived datasets include a 100 yr model spin-up climate and a historical Palmer Drought Severity Index (PDSI) dataset. The VEMAP dataset exhibits statistically significant trends in temperature, precipitation, solar radiation, vapor pressure, and PDSI for US National Assessment regions. The historical climate and companion datasets are available online at data archive centers. © Inter-Research 2004.
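A minimal sketch of the WGEN-style disaggregation idea mentioned above: a first-order Markov chain for wet/dry occurrence with gamma-distributed wet-day amounts, rescaled to match the monthly total. The parameters and the monthly total are illustrative assumptions, not the VEMAP calibration.

```python
# Sketch of a WGEN-like stochastic weather generator for daily precipitation.
import numpy as np

def generate_daily_precip(n_days, p_wd, p_ww, shape, scale, rng):
    """p_wd: P(wet | dry yesterday); p_ww: P(wet | wet yesterday)."""
    precip = np.zeros(n_days)
    wet = False
    for day in range(n_days):
        wet = rng.random() < (p_ww if wet else p_wd)   # Markov occurrence
        if wet:
            precip[day] = rng.gamma(shape, scale)      # wet-day amount (mm)
    return precip

rng = np.random.default_rng(42)
daily = generate_daily_precip(31, p_wd=0.2, p_ww=0.6, shape=0.8, scale=6.0, rng=rng)
# Rescale so the daily series reproduces the (here invented) gridded monthly
# total of 85 mm exactly, as a monthly-to-daily disaggregation must.
if daily.sum() > 0:
    daily *= 85.0 / daily.sum()
```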
36 CFR § 801.3 - Applicant responsibilities.
Code of Federal Regulations, 2013 CFR
2013-07-01
... HISTORIC PRESERVATION REQUIREMENTS OF THE URBAN DEVELOPMENT ACTION GRANT PROGRAM § 801.3 Applicant... expeditiously meeting its historic preservation requirements and facilitate the development of the Council's...) Consulting the National Register of Historic Places to determine whether the project's impact area includes...
Historical earthquakes studies in Eastern Siberia: State-of-the-art and plans for future
NASA Astrophysics Data System (ADS)
Radziminovich, Ya. B.; Shchetnikov, A. A.
2013-01-01
Many problems in investigating the historical seismicity of East Siberia remain unsolved. These problems relate particularly to the quality and reliability of data sources, the completeness of parametric earthquake catalogues, and the precision and transparency of estimates of the main parameters of historical earthquakes. The main purpose of this paper is to highlight the current status of studies of historical seismicity in Eastern Siberia, together with an analysis of existing macroseismic and parametric earthquake catalogues. We also attempt to identify the main shortcomings of existing catalogues and to clarify the reasons for their appearance in the light of the history of seismic observations in Eastern Siberia. Contentious issues in the catalogues are considered through the example of three strong historical earthquakes that are important for assessing seismic hazard in the region. In particular, it was found that, due to a technical error, the parameters of the large M = 7.7 earthquake of 1742 were transferred from the regional catalogue to the worldwide database with incorrect epicenter coordinates. How stereotypes concerning active tectonics influence the localization of an epicenter is shown by the example of the strong M = 6.4 earthquake of 1814. The effect of insufficient use of primary data sources on the completeness of earthquake catalogues is illustrated by the example of the strong M = 7.0 event of 1859. Analysis of the state of the art of historical earthquake studies in Eastern Siberia allows us to propose the following activities for the near future: (1) compilation of a database including the initial descriptions of macroseismic effects, with reference to their place and time of occurrence; (2) parameterization of the maximum possible (magnitude-unlimited) number of historical earthquakes on the basis of all the data available; and (3) compilation of an improved version of the parametric historical earthquake catalogue for East Siberia, with detailed consideration of each event and distinct logic schemes for data interpretation. We therefore conclude that a large-scale revision of the historical earthquake catalogues for the study area is necessary.
The plant phenological online database (PPODB): an online database for long-term phenological data.
Dierenbach, Jonas; Badeck, Franz-W; Schaber, Jörg
2013-09-01
We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951, ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants, combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly, graphical, geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via www.ppodb.de.
Ackerman, Katherine V.; Mixon, David M.; Sundquist, Eric T.; Stallard, Robert F.; Schwarz, Gregory E.; Stewart, David W.
2009-01-01
The Reservoir Sedimentation Survey Information System (RESIS) database, originally compiled by the Soil Conservation Service (now the Natural Resources Conservation Service) in collaboration with the Texas Agricultural Experiment Station, is the most comprehensive compilation of data from reservoir sedimentation surveys throughout the conterminous United States (U.S.). The database is a cumulative historical archive that includes data from as early as 1755 and as late as 1993. The 1,823 reservoirs included in the database range in size from farm ponds to the largest U.S. reservoirs (such as Lake Mead). Results from 6,617 bathymetric surveys are available in the database. This Data Series provides an improved version of the original RESIS database, termed RESIS-II, and a report describing RESIS-II. The RESIS-II relational database is stored in Microsoft Access and includes more precise location coordinates for most of the reservoirs than the original database but excludes information on reservoir ownership. RESIS-II is anticipated to be a template for further improvements in the database.
Quantifying Data Quality for Clinical Trials Using Electronic Data Capture
Nahm, Meredith L.; Pieper, Carl F.; Cunningham, Maureen M.
2008-01-01
Background Historically, only partial assessments of data quality have been performed in clinical trials, for which the most common method of measuring database error rates has been to compare the case report form (CRF) to database entries and count discrepancies. Importantly, errors arising from medical record abstraction and transcription are rarely evaluated as part of such quality assessments. Electronic Data Capture (EDC) technology has had a further impact, as paper CRFs typically leveraged for quality measurement are not used in EDC processes. Methods and Principal Findings The National Institute on Drug Abuse Treatment Clinical Trials Network has developed, implemented, and evaluated methodology for holistically assessing data quality on EDC trials. We characterize the average source-to-database error rate (14.3 errors per 10,000 fields) for the first year of use of the new evaluation method. This error rate was significantly lower than the average of published error rates for source-to-database audits, and was similar to CRF-to-database error rates reported in the published literature. We attribute this largely to an absence of medical record abstraction on the trials we examined, and to an outpatient setting characterized by less acute patient conditions. Conclusions Historically, medical record abstraction is the most significant source of error by an order of magnitude, and should be measured and managed during the course of clinical trials. Source-to-database error rates are highly dependent on the amount of structured data collection in the clinical setting and on the complexity of the medical record, dependencies that should be considered when developing data quality benchmarks. PMID:18725958
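The arithmetic behind a figure such as 14.3 errors per 10,000 fields is straightforward; the following sketch uses invented audit numbers and a simple normal-approximation confidence interval, which is our illustration rather than the study's statistical method.

```python
# Sketch: source-to-database error rate per 10,000 fields with a 95% CI.
import math

def error_rate_per_10k(errors, fields_audited, z=1.96):
    p = errors / fields_audited
    half = z * math.sqrt(p * (1 - p) / fields_audited)
    return 10_000 * p, (10_000 * max(p - half, 0), 10_000 * (p + half))

rate, (lo, hi) = error_rate_per_10k(errors=43, fields_audited=30_000)
print(f"{rate:.1f} errors per 10,000 fields (95% CI {lo:.1f} to {hi:.1f})")
```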
Stochastic Model for the Vocabulary Growth in Natural Languages
NASA Astrophysics Data System (ADS)
Gerlach, Martin; Altmann, Eduardo G.
2013-04-01
We propose a stochastic model for the number of different words in a given database which incorporates the dependence on the database size and historical changes. The main feature of our model is the existence of two different classes of words: (i) a finite number of core words, which have higher frequency and do not affect the probability of a new word being used, and (ii) the remaining, virtually infinite, number of noncore words, which have lower frequency and, once used, reduce the probability of a new word being used in the future. Our model relies on a careful analysis of the Google Ngram database of books published in the last centuries, and its main consequence is the generalization of Zipf's and Heaps' laws to two scaling regimes. We confirm that these generalizations yield the best simple description of the data among generic descriptive models and that the two free parameters depend only on the language, not on the database. From the point of view of our model, the main change on historical time scales is the composition of the specific words included in the finite list of core words, which we observe to decay exponentially in time at a rate of approximately 30 words per year for English.
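A loose simulation of the two-class model as we read it from the abstract, not the authors' exact formulation: core words never change the innovation probability, while each new noncore word lowers it, bending the vocabulary-growth (Heaps) curve into two regimes. All parameter values are invented.

```python
# Sketch: two-class vocabulary growth (finite core + innovation-damping noncore).
import random

def simulate(n_tokens, n_core=1000, p_core=0.6, alpha=500.0, seed=0):
    rng = random.Random(seed)
    core_seen = set()         # distinct core words encountered so far
    n_noncore = 0             # distinct noncore words used so far
    growth = []               # vocabulary size after each token
    for _ in range(n_tokens):
        if rng.random() < p_core:
            core_seen.add(rng.randrange(n_core))    # core use: no effect on innovation
        elif rng.random() < alpha / (alpha + n_noncore):
            n_noncore += 1                          # innovation: a brand-new word
        # else: reuse of an existing noncore word (vocabulary unchanged)
        growth.append(len(core_seen) + n_noncore)
    return growth

growth = simulate(100_000)
# Early growth is dominated by the finite core (fast saturation); later
# growth by ever-rarer innovations: two regimes in the Heaps curve.
```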
Data dictionary and discussion for the midnite mine GIS database. Report of investigations/1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, D.C.; Smith, M.A.; Ferderer, D.A.
1996-01-18
A geographic information system (GIS) database has been developed by the U.S. Bureau of Mines (USBM) for the Midnite Mine and surroundings in northeastern Washington State (Stevens County) on the Spokane Indian Reservation. The GIS database was compiled to serve as a repository and source of historical and research information on the mine site. The database also will be used by the Bureau of Land Management and the Bureau of Indian Affairs (as well as others) for environmental assessment and reclamation planning for future remediation and reclamation of the site. This report describes the data in the GIS database and their characteristics. The report also discusses known backgrounds on the data sets and any special considerations encountered by the USBM in developing the database.
NASA Astrophysics Data System (ADS)
Boichard, Jean-Luc; Brissebrat, Guillaume; Cloche, Sophie; Eymard, Laurence; Fleury, Laurence; Mastrorillo, Laurence; Moulaye, Oumarou; Ramage, Karim
2010-05-01
The AMMA project includes aircraft, ground-based and ocean measurements, an intensive use of satellite data and diverse modelling studies. The AMMA database therefore aims to store a great amount and a large variety of data, and to provide the data as rapidly and safely as possible to the AMMA research community. In order to stimulate the exchange of information and collaboration between researchers from different disciplines or using different tools, the database provides a detailed description of the products and uses standardized formats. The AMMA database contains: - AMMA field campaign datasets; - historical data in West Africa from 1850 (operational networks and previous scientific programs); - satellite products from past and future satellites, (re-)mapped on a regular latitude/longitude grid and stored in NetCDF format (CF Convention); - model outputs from atmosphere or ocean operational (re-)analyses and forecasts, and from research simulations, processed in the same way as the satellite products. Before accessing the data, every user has to sign the AMMA data and publication policy. This chart only covers the use of data in the framework of scientific objectives and categorically excludes the redistribution of data to third parties and usage for commercial applications. Some collaboration between data producers and users, and mention of the AMMA project in any publication, is also required. The AMMA database and the associated on-line tools have been fully developed and are managed by two teams in France (IPSL Database Centre, Paris, and OMP, Toulouse). Users can access data from both data centres through a single web portal. This website is composed of different modules: - Registration: forms to register and to read and sign the data use chart when a user visits for the first time; - Data access interface: a user-friendly tool for building a data extraction request by selecting various criteria such as location, time and parameters; the request can concern local, satellite and model data; - Documentation: a catalogue of all the available data and their metadata. These tools have been developed using standard and free languages and software: - a Linux system with an Apache web server and a Tomcat application server; - J2EE tools: the JSF and Struts frameworks, Hibernate; - relational database management systems: PostgreSQL and MySQL; - an OpenLDAP directory. In order to facilitate access to the data by African scientists, the complete system has been mirrored at the AGRHYMET Regional Centre in Niamey and has been operational there since January 2009. Users can now access metadata and request data through either of two equivalent portals: http://database.amma-international.org or http://amma.agrhymet.ne/amma-data.
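Since the satellite products are stored as CF-convention NetCDF on a regular latitude/longitude grid, a typical client-side read might look like the following sketch; the file name and variable names are hypothetical, not actual AMMA product identifiers.

```python
# Sketch: reading a CF-convention NetCDF product on a regular lat/lon grid.
from netCDF4 import Dataset
import numpy as np

with Dataset("amma_satellite_product.nc") as ds:
    lat = ds.variables["latitude"][:]
    lon = ds.variables["longitude"][:]
    field = ds.variables["precipitation"][:]       # e.g., (time, lat, lon)
    units = ds.variables["precipitation"].units    # CF metadata travels with the data

# Average over the time axis to get a simple climatological map.
clim = np.ma.mean(field, axis=0)
```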
NASA Astrophysics Data System (ADS)
Wild, Martin; Ohmura, Atsumu; Schär, Christoph; Müller, Guido; Folini, Doris; Schwarz, Matthias; Zyta Hakuba, Maria; Sanchez-Lorenzo, Arturo
2017-08-01
The Global Energy Balance Archive (GEBA) is a database for the central storage of the worldwide measured energy fluxes at the Earth's surface, maintained at ETH Zurich (Switzerland). This paper documents the status of the GEBA version 2017 dataset, presents the new web interface and user access, and reviews the scientific impact that GEBA data have had in various applications. GEBA has been continuously expanded and updated and contains in its 2017 version around 500 000 monthly mean entries of various surface energy balance components measured at 2500 locations. The database contains observations of 15 surface energy flux components, the most widely measured quantity in GEBA being the shortwave radiation incident at the Earth's surface (global radiation). Many of the historic records extend over several decades. GEBA contains monthly data from a variety of sources, namely from the World Radiation Data Centre (WRDC) in St. Petersburg, from national weather services, from different research networks (BSRN, ARM, SURFRAD), from peer-reviewed publications, project and data reports, and from personal communications. Quality checks are applied to test for gross errors in the dataset. GEBA has played a key role in various research applications, such as in the quantification of the global energy balance, in the discussion of the anomalous atmospheric shortwave absorption, and in the detection of multi-decadal variations in global radiation, known as global dimming and brightening. GEBA is further extensively used for the evaluation of climate models and satellite-derived surface flux products. On a more applied level, GEBA provides the basis for engineering applications in the context of solar power generation, water management, agricultural production and tourism. GEBA is publicly accessible through the internet via http://www.geba.ethz.ch. Supplementary data are available at https://doi.org/10.1594/PANGAEA.873078.
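A sketch of the kind of gross-error screen mentioned above, applied to monthly mean global radiation; the bounds are illustrative assumptions, not GEBA's documented thresholds.

```python
# Sketch: flag monthly mean global-radiation values outside a plausible range.
def flag_gross_errors(series, lo=0.0, hi=450.0):
    """Return indices of monthly mean global-radiation values (W m-2)
    outside the [lo, hi] physical-plausibility bounds."""
    return [i for i, v in enumerate(series) if not (lo <= v <= hi)]

monthly_swdown = [112.0, 145.5, 198.2, -3.0, 260.1, 999.9]
print(flag_gross_errors(monthly_swdown))   # -> [3, 5]
```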
ACHP | Federal Emergency Management Agency Historic Preservation Program
Environmental and Historic Preservation Review: https://www.fema.gov/fema-activities-may-trigger-environmental-historic-review
FEMA's Environmental & Historic Preservation Guidance for Grant Applicants: http
Historic Preservation Review: http://www.fema.gov/environmental-planning-and-historic-preservation-program
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Glufosinate-ammoni
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Fomesafen; CASRN 72
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pirimiphos-methyl
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bromoxynil; CASRN 1
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Prometryn; CASRN 72
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Linuron; CASRN 330
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dimethoate; CASRN 6
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Methidathion; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Phenmedipham; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pendimethalin; CASR
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorsulfuron; CASR
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Thiophanate-methyl
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Thiram; CASRN 137-
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Benefin; CASRN 1861
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Cyhalothrin/Karate
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Triallate; CASRN 23
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Propiconazole; CASR
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorimuron-ethyl
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Cypermethrin; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Napropamide; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Difenzoquat; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Diphenylamine; CASR
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Harmony; CASRN 7927
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bidrin; CASRN 141-
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Pursuit; CASRN 8133
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Imazalil; CASRN 355
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Vinclozolin; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Chlorpropham; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bayleton; CASRN 431
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Propargite; CASRN 2
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Oxyfluorfen; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Amdro; CASRN 67485
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Sethoxydim; CASRN 7
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Asulam; CASRN 3337
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Norflurazon; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dodine; CASRN 2439
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Dimethipin; CASRN 5
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Bromoxynil octanoate
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Folpet; CASRN 133-
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Flurprimidol; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Oryzalin; CASRN 190
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Lactofen; CASRN 775
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Flutolanil; CASRN 6
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Acephate; CASRN 305
NASA Astrophysics Data System (ADS)
Seker, D. Z.; Alkan, M.; Kutoglu, S. S.; Akcin, H.
2010-12-01
Documentation of cultural heritage sites is extremely important for monitoring them and preserving them from natural disasters and human activities. Owing to a very rich history reaching from the first human settlements at Catalhoyuk and Alacahoyuk through the Byzantine, Seljuk and Ottoman civilizations, Turkey has many cultural heritage sites. 3D modeling and recording of historical buildings using modern tools and techniques has been conducted, and is still continuing, in several locations in Turkey. Nine cultural sites in Turkey are included in the UNESCO cultural heritage protection list; one of them is the township of Safranbolu, one of the most outstanding examples of traditional Turkish architecture and unique in having conserved its human settlement in its authentic environmental setting up to the present. In this study, the outcomes and further work of a research project on this study area, supported by the Turkish National Research Center (TUBITAK) under project number 106Y157, are presented in detail. The basic aim of the study is the development of a GIS-based information and management system for the city of Safranbolu. All registered historical buildings are assigned to the database. 3D models of selected registered historical monuments, built from data from different sources so as to resemble their original construction, were produced and will be distributed via the internet through a web-based information system designed during the project. Some of the buildings were also surveyed using close-range photogrammetric techniques to obtain their façade reliefs, which are likewise assigned to the database. The designed database consists of 3D models, locations, historical information, and cadastral and land register data of the selected buildings, together with other building-related data collected during the project. Using this system, all kinds of spatial and non-spatial analyses were carried out and different thematic maps of the historical city were produced. When the project is finalized, all the historical buildings in Safranbolu (houses, mosques, fountains and a caravansary) will be recorded permanently and their architectural features will be integrated into the designed spatial information system. In addition, via the internet, many people can reach the data easily, which should help increase the number of visitors to the town. The project will also provide guidance for future related studies.
Web-Based Satellite Products Database for Meteorological and Climate Applications
NASA Technical Reports Server (NTRS)
Phan, Dung; Spangenberg, Douglas A.; Palikonda, Rabindra; Khaiyer, Mandana M.; Nordeen, Michele L.; Nguyen, Louis; Minnis, Patrick
2004-01-01
The need for ready access to satellite data and associated physical parameters such as cloud properties has been steadily growing. Air traffic management, weather forecasters, energy producers, and weather and climate researchers, among others, can utilize more satellite information than in the past. Thus, it is essential that such data are made available in near real-time and as archival products in an easy-access and user-friendly environment. A host of Internet web sites currently provide a variety of satellite products for various applications. Each site has a unique contribution with appeal to a particular segment of the public and scientific community. This is no less true for NASA Langley's Clouds and Radiation (NLCR) website (http://www-pm.larc.nasa.gov), which has been evolving over the past 10 years to support a variety of research projects. This website was originally developed to display cloud products derived from the Geostationary Operational Environmental Satellite (GOES) over the Southern Great Plains for the Atmospheric Radiation Measurement (ARM) Program. It has evolved into a site providing a comprehensive database of near real-time and historical satellite products used for meteorological, aviation, and climate studies. To encourage the user community to take advantage of the site, this paper summarizes the various products and projects supported by the website and discusses future options for new datasets.
A flood geodatabase and its climatological applications: the case of Catalonia for the last century
NASA Astrophysics Data System (ADS)
Barnolas, M.; Llasat, M. C.
2007-04-01
Floods are the natural hazard that produces the highest number of casualties and material damage in the Western Mediterranean. An improvement in flood risk assessment and a study of a possible increase in flooding occurrence are therefore needed. To carry out these tasks it is important to have extensive knowledge of historical floods at our disposal and to find an efficient way to manage this geographical data. In this paper we present a complete flood database spanning the 20th century for the whole of Catalonia (NE Spain), which includes documentary information (affected areas and damage) and instrumental information (meteorological and hydrological records). This geodatabase, named Inungama, has been implemented on a GIS (Geographical Information System) in order to display all the information within a given geographical scenario, as well as to carry out analyses using queries, overlays and calculations. Following a description of the type and amount of information stored in the database and the structure of the information system, the first applications of Inungama are presented. The geographical distribution of floods shows the localities that are more likely to be flooded, confirming that the most affected municipalities are the most densely populated ones in coastal areas. Regarding the existence of an increase in flooding occurrence, a temporal analysis has been carried out, showing a steady increase over the last 30 years.
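A sketch of the kind of temporal analysis described above: counting flood events per decade from a table of dated events. The column names and dates are illustrative, not Inungama's actual schema or data.

```python
# Sketch: flood events per decade from a dated event table.
import pandas as pd

events = pd.DataFrame({
    "event_id": [1, 2, 3, 4, 5],
    "date": pd.to_datetime(["1907-10-11", "1940-10-17", "1962-09-25",
                            "1982-11-07", "1994-10-10"]),
})
events["decade"] = (events["date"].dt.year // 10) * 10
per_decade = events.groupby("decade").size()
print(per_decade)   # a rising count in recent decades would mirror the paper's finding
```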
NASA Astrophysics Data System (ADS)
Pennington, Catherine; Freeborough, Katy; Dashwood, Claire; Dijkstra, Tom; Lawrie, Kenneth
2015-11-01
The British Geological Survey (BGS) is the national geological agency for Great Britain that provides geoscientific information to government, other institutions and the public. The National Landslide Database has been developed by the BGS and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain, with over 17,000 records of landslide events to date, each documented as fully as possible for inland, coastal and artificial slopes. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture; and citizen science through social media and other online resources. This information is invaluable for directing the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domains map currently under development, as well as regional mapping campaigns, rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors, their spatial distribution and likely impacts, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) and Hazard Impact Model contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP partnership, and data collected for the National Landslide Database are used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.
NASA Astrophysics Data System (ADS)
Cavaleri, Tiziana; Buscaglia, Paola; Migliorini, Simonetta; Nervo, Marco; Piccablotto, Gabriele; Piccirillo, Anna; Pisani, Marco; Puglisi, Davide; Vaudan, Dario; Zucco, Massimo
2017-06-01
The conservation of artworks requires profound knowledge of pictorial materials, their chemical and physical properties, and their interaction and/or degradation processes. For this reason, pictorial materials databases are widely used to study and investigate cultural heritage. At the Centre for Conservation and Restoration La Venaria Reale, we prepared a set of about 1200 mock-ups with 173 different pigments and/or dyes, used across history or as products for conservation, four binders, two varnishes and four different materials for underdrawings. In collaboration with the Laboratorio Analisi Scientifiche of Regione Autonoma Valle d'Aosta, the National Institute of Metrological Research and the Department of Architecture and Design of the Polytechnic of Turin, we created a scientific database, now available online (http://www.centrorestaurovenaria.it/en/areas/diagnostic/pictorial-materials-database), designed as a tool for heritage science and conservation. Here, we present a focus on materials for pictorial retouching, where hyperspectral imaging, conducted with a prototype of a new technology, allowed us to propose a list of pigments that could be most suitable for conservation treatments and pictorial retouching. We then present the case study of the industrial painting Notte Barbara (1962) by Pinot Gallizio, in which the database's coverage of modern and contemporary art materials proved very useful and the fibre optics reflectance spectroscopy technique was decisive for pigment identification. Later in this research, the mock-ups will be exploited to study degradation processes, e.g., lightfastness, or the possible formation of interaction products, e.g., metal carboxylates.
Development of a Data Citations Database for an Interdisciplinary Data Center
NASA Astrophysics Data System (ADS)
Chen, R. S.; Downs, R. R.; Schumacher, J.; Gerard, A.
2017-12-01
The scientific community has long depended on consistent citation of the scientific literature to enable traceability, support replication, and facilitate analysis and debate about scientific hypotheses, theories, assumptions, and conclusions. However, only in the past few years has the community focused on consistent citation of scientific data, e.g., through the application of Digital Object Identifiers (DOIs) to data, the development of peer-reviewed data publications, community principles and guidelines, and other mechanisms. This means that, moving ahead, it should be easier to identify and track data citations and conduct systematic bibliometric studies. However, this still leaves the problem that many legacy datasets and past citations lack DOIs, making it difficult to develop a historical baseline or assess trends. With this in mind, the NASA Socioeconomic Data and Applications Center (SEDAC) has developed a searchable citations database, containing more than 3,400 citations of SEDAC data and information products over the past 20 years. These citations were collected through various indices and search tools and in some cases through direct contacts with authors. The citations come from a range of natural, social, health, and engineering science journals, books, reports, and other media. The database can be used to find and extract citations filtered by a range of criteria, enabling quantitative analysis of trends, intercomparisons between data collections, and categorization of citations by type. We present a preliminary analysis of citations for selected SEDAC data collections, in order to establish a baseline and assess options for ongoing metrics to track the impact of SEDAC data on interdisciplinary science. We also present an analysis of the uptake of DOIs within data citations reported in published studies that used SEDAC data.
Case Histories of Landslide Impact: A Database-driven Approach
NASA Astrophysics Data System (ADS)
Klose, Martin; Damm, Bodo
2015-04-01
Fundamental understanding of landslide risk requires in-depth knowledge of how landslides have impacted society in the past (e.g., Corominas et al., 2014). A key to gaining insight into the evolution of landslide risk at single facilities of critical infrastructure is the case history of landslide impact. The purpose of such historical analyses is to document the site-specific interactions between landslides and land-use activity. Case histories support correlating landslide events and associated damages with multiple control variables of landslide risk, including (i) previous construction works, (ii) hazard awareness, (iii) the type of structure or its material properties, and (iv) measures of post-disaster mitigation. A key advantage of case histories is that they provide an overview of changes in the exposure and vulnerability of infrastructure over time. Their application helps in learning more about changing patterns in risk culture and the effectiveness of repair or prevention measures (e.g., Klose et al., 2014). Case histories of landslide impact are developed on the basis of information extracted from landslide databases. The use of path diagrams and illustrated flowcharts as data modeling techniques aims at structuring, condensing, and visualizing complex historical data sets on landslide activity and land use. Much of the scientific potential of case histories depends on the quality of the available database information. Landslide databases relying on a bottom-up approach characterized by targeted local data specification are optimally suited for historical impact analyses. Combined with systematic retrieval, extraction, and integration of data from multiple sources, landslide databases constitute a valuable tool for developing case histories that open a whole new window on the study of landslide impacts (e.g., Damm and Klose, 2014). The present contribution introduces such a case history for a well-known landslide site at a heavily frequented highway in NW Germany. Landslide problems at this site started with road construction in the early 1880s and were related to multiple event clusters, especially those in the years 1936-1937 (n = 4), 1961 (n = 2), 1970-1974 (n = 5), and 1999-2001 (n = 7). The most frequently applied mitigation measures were rudimentary and inexpensive, including (i) removal of loose rock and vegetation (1924, 1936, 1961-1962, 1994), (ii) rock blasting (1936), (iii) catch barriers (1937, 1994), and (iv) temporary or permanent closure of traffic lanes (1982, 1994). A series of destructive landslides forced decision-makers to launch an expensive slope stabilization project in 2001 that resulted in costs of USD 7.1 million. After finalization of the project, no further landslide problems have been reported for this site. References: Corominas, J., van Westen, C., Frattini, P., Cascini, L., Malet, J.-P., Fotopoulou, S., Catani, F., Van Den Eeckhaut, M., Mavrouli, O., Agliardi, F., Pitilakis, K., Winter, M.G., Pastor, M., Ferlisi, S., Tofani, V., Hervás, J., Smith, J.T., 2014. Recommendations for the quantitative analysis of landslide risk. Bulletin of Engineering Geology and the Environment 73, 209-263. Damm, B., Klose, M., 2014. Landslide database for the Federal Republic of Germany: a tool for analysis of mass movement processes and impacts. In: Sassa, K., Canuti, P., Yin, Y. (Eds.), Landslide Science for a Safer Geoenvironment. Volume 2: Methods of Landslide Studies. Springer, Berlin, pp. 787-792. Klose, M., Damm, B., Terhorst, B., 2014. Landslide cost modeling for transportation infrastructures: a methodological approach. Landslides, DOI 10.1007/s10346-014-0481-1.
Kenow, Kevin P.; Garrison, Paul J.; Fox, Timothy J.; Meyer, Michael W.
2013-01-01
A study was conducted to evaluate changes in water quality and land use associated with lakes that are south of the current breeding range of Common Loons in Wisconsin but that historically supported breeding loons. Museum collection records and published accounts were examined to identify lakes in southern Wisconsin with a history of loon nesting activity. Historical and recent water quality data were obtained from state and USEPA databases for the former loon nesting lakes that were identified, and paleolimnological data were acquired for these lakes from sediment cores used to infer historical total phosphorus concentrations from diatom assemblages. U.S. General Land Office notes and maps from the original land survey conducted in Wisconsin during 1832-1866 and the National Land Cover Database 2006 were utilized to assess land use changes that occurred within the drainage basins of former loon nesting lakes. Our results indicate that the landscape of southern Wisconsin has changed dramatically since Common Loons last nested in the region. A number of factors have likely contributed to the decreased appeal of southern Wisconsin lakes to breeding Common Loons, including changes to water quality, altered trophic status resulting from nutrient enrichment, and reductions in suitable nesting habitat stemming from shoreline development and altered water levels. Increased nutrient and sediment inputs from agricultural and developed areas likely contributed to a reduction in habitat quality.
NASA Astrophysics Data System (ADS)
Wang, Zhihua; Yang, Xiaomei; Lu, Chen; Yang, Fengshuo
2018-07-01
Automatic updating of land use/cover change (LUCC) databases using high spatial resolution images (HSRI) is important for environmental monitoring and policy making, especially in coastal areas, which connect land and sea and tend to change frequently. Many object-based change detection methods have been proposed, especially methods combining historical LUCC data with HSRI. However, the scale parameter(s) used to segment the serial temporal images, which directly determine the average object size, are hard to choose without expert intervention. The samples transferred from historical LUCC data likewise need expert intervention to avoid insufficient or incorrect samples. For scale selection, a Scale Self-Adapting Segmentation (SSAS) approach, based on exponential sampling of the scale parameter and location of the local maximum of a weighted local variance, is proposed to resolve the scale selection problem when segmenting images constrained by LUCC for change detection. For sample transfer, Knowledge Transfer (KT), in which a classifier trained on historical images with LUCC data is applied to classify the updated images, is also proposed. Comparison experiments were conducted in a coastal area of Zhujiang, China, using SPOT 5 images acquired in 2005 and 2010. The results reveal that (1) SSAS can segment images effectively without expert intervention, and (2) KT can reach the maximum sample-transfer accuracy without expert intervention. The SSAS + KT strategy is a good choice when the historical image matches the LUCC data and the historical and updated images are obtained from the same source.
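The scale-selection idea in SSAS, sampling the scale parameter exponentially and stopping at a local maximum of a weighted local variance, can be sketched compactly. A minimal sketch, assuming a generic `segment_fn(image, scale)` placeholder for any object-based segmenter whose scale parameter controls average object size; the metric and stopping rule follow the abstract, not the authors' exact code:

```python
import numpy as np

def weighted_local_variance(image, labels):
    """Size-weighted mean within-object variance of a segmentation."""
    total_var, total_px = 0.0, 0
    for obj_id in np.unique(labels):
        px = image[labels == obj_id]
        total_var += px.size * px.var()
        total_px += px.size
    return total_var / total_px

def select_scale(image, segment_fn, s_min=10.0, s_max=1000.0, base=1.25):
    """Exponentially sample candidate scales and return the first local
    maximum of the weighted-local-variance curve (SSAS-style choice)."""
    scales, s = [], s_min
    while s <= s_max:
        scales.append(s)
        s *= base
    wlv = [weighted_local_variance(image, segment_fn(image, s)) for s in scales]
    for i in range(1, len(wlv) - 1):
        if wlv[i] >= wlv[i - 1] and wlv[i] > wlv[i + 1]:
            return scales[i]
    return scales[int(np.argmax(wlv))]  # fall back to the global maximum
```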
Whelan, Brendan; Moros, Eduardo G; Fahrig, Rebecca; Deye, James; Yi, Thomas; Woodward, Michael; Keall, Paul; Siewerdsen, Jeff H
2017-04-01
To produce and maintain a database of National Institutes of Health (NIH) funding of American Association of Physicists in Medicine (AAPM) members, to perform a top-level analysis of these data, and to make these data (hereafter the AAPM research database) available for the use of the AAPM and its members. NIH-funded research dating back to 1985 is available for public download through the NIH ExPORTER website, and AAPM membership information dating back to 2002 was supplied by the AAPM. To link these two sources of data, a data mining algorithm was developed in Matlab. The false-positive rate was manually estimated from a random sample of 100 records, and the false-negative rate was assessed by comparison against 99 member-supplied PI_ID numbers. The AAPM research database was queried to produce an analysis of trends and demographics in research funding from 2002 to 2015. A total of 566 PI_ID numbers were matched to AAPM members. False-positive and false-negative rates were, respectively, 4% (95% CI: 1-10%, N = 100) and 10% (95% CI: 5-18%, N = 99). Based on analysis of the AAPM research database, in 2015 the NIH awarded USD 110M to members of the AAPM. The four NIH institutes that have historically awarded the most funding to AAPM members were the National Cancer Institute, National Institute of Biomedical Imaging and Bioengineering, National Heart Lung and Blood Institute, and National Institute of Neurological Disorders and Stroke. In 2015, over 85% of the total NIH research funding awarded to AAPM members came through these institutes, representing 1.1% of their combined budget. In the same year, 2.0% of AAPM members received NIH funding for a total of $116M, which is lower than the historic mean of $120M (in 2015 USD). A database of NIH-funded research awarded to AAPM members has been developed and tested using a data mining approach, and a top-level analysis of funding trends has been performed. Current funding of AAPM members is lower than the historic mean. The database will be maintained annually by members of the Working Group for the Development of a Research Database (WGDRD) and is available to the AAPM, its committees, working groups, and members for download through the AAPM electronic content website. A wide range of questions regarding financial and demographic funding trends can be addressed with these data. This report has been approved for publication by the AAPM Science Council. © 2017 American Association of Physicists in Medicine.
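The reported confidence intervals are consistent with exact (Clopper-Pearson) binomial intervals on the manually checked samples. A small sketch reproducing them; the use of SciPy here is illustrative, since the paper's analysis was done in Matlab:

```python
from scipy.stats import beta

def exact_ci(k, n, conf=0.95):
    """Clopper-Pearson exact binomial CI for k successes in n trials."""
    a = 1.0 - conf
    lo = beta.ppf(a / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - a / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

print(exact_ci(4, 100))   # ~(0.011, 0.099): the reported 4% (1-10%)
print(exact_ci(10, 99))   # ~(0.049, 0.176): the reported 10% (5-18%)
```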
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Fosetyl-al; CASRN
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) NuStar; CASRN 85509 -
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) Merphos oxide; CASR
Design, Development and Utilization Perspectives on Database Management Systems
ERIC Educational Resources Information Center
Shneiderman, Ben
1977-01-01
This paper reviews the historical development of integrated data base management systems and examines competing approaches. Topics include management and utilization, implementation and design, query languages, security, integrity, privacy and concurrency. (Author/KP)
DEFENSE MEDICAL SURVEILLANCE SYSTEM (DMSS)
AMSA operates the Defense Medical Surveillance System (DMSS), an executive information system whose database contains up-to-date and historical data on diseases and medical events (e.g., hospitalizations, ambulatory visits, reportable diseases, HIV tests, acute respiratory diseas...
36 CFR § 1256.24 - How long may access to some records be denied?
Code of Federal Regulations, 2013 CFR
2013-07-01
... RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE ACCESS TO RECORDS AND DONATED HISTORICAL MATERIALS... of the records in which you are interested available. In the case of electronic structured databases...
76 FR 4072 - Registration of Claims of Copyright
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-24
... registration of automated databases that predominantly consist of photographs, and applications for group... to submit electronic applications to register copyrights of such photographic databases or of groups... automated databases, an electronic application for group registration of an automated database that consists...
Just-in-time Database-Driven Web Applications
2003-01-01
"Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109
Research on the spatial analysis method of seismic hazard for island
NASA Astrophysics Data System (ADS)
Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying
2017-05-01
Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design, while macroscopically it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS's ModelBuilder platform.
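The final aggregation step of a fuzzy comprehensive evaluation like SAMSHI's can be illustrated briefly. A minimal sketch with three illustrative indices and grades; the actual model uses 11 indices derived from the spatial database, and the weights and membership functions below are placeholders, not values from the paper:

```python
import numpy as np

weights = np.array([0.4, 0.35, 0.25])   # hypothetical index weights
R = np.array([                          # membership degrees: index x grade
    [0.1, 0.3, 0.6],                    # e.g. proximity to faults
    [0.2, 0.5, 0.3],                    # e.g. historical seismicity
    [0.5, 0.4, 0.1],                    # e.g. site geology
])

B = weights @ R                         # weighted-average fuzzy composition
grades = ["low", "moderate", "high"]
print(B, "->", grades[int(np.argmax(B))])
```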
Aerosol Robotic Network (AERONET) Version 3 Aerosol Optical Depth and Inversion Products
NASA Astrophysics Data System (ADS)
Giles, D. M.; Holben, B. N.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Sorokin, M. G.; Slutsker, I.
2017-12-01
The Aerosol Robotic Network (AERONET) surface-based aerosol optical depth (AOD) database has been a principal component of many Earth science remote sensing applications and modelling efforts for more than two decades. During this time, the AERONET AOD database utilized a semiautomatic quality assurance approach (Smirnov et al., 2000). The data quality automation developed for AERONET Version 3 (V3) was achieved by augmenting and improving the combination of Version 2 (V2) automatic and manual procedures to provide a more refined near real-time (NRT) and historical worldwide AOD database. The combined effect of these changes provides a historical V3 Level 2.0 AOD data set comparable to V2 Level 2.0 AOD. The recently released V3 Level 2.0 AOD product uses Level 1.5 data with automated cloud screening and quality controls, and applies pre-field and post-field calibrations and wavelength-dependent temperature characterizations. For V3, the AERONET aerosol retrieval code inverts AOD and almucantar sky radiances using a full vector radiative transfer code called Successive ORDers of scattering (SORD; Korkin et al., 2017). The full vector code allows for potentially improving the real part of the complex index of refraction and the sphericity parameter, and for computing the radiation field in the UV (e.g., 380 nm) and the degree of linear depolarization. Effective lidar ratio and depolarization ratio products are also available with the V3 inversion release. Inputs to the inversion code were updated to accommodate H2O, O3 and NO2 absorption, consistent with the computation of V3 AOD. All of the inversion products carry estimated uncertainties that include the random error plus biases due to the uncertainty in measured AOD, absolute sky radiance calibration, and retrieved MODIS BRDF for snow-free and snow-covered surfaces. The V3 inversion products use the same data quality assurance criteria as V2 inversions (Holben et al., 2006). The entire AERONET V3 almucantar inversion database was computed using NASA High End Computing resources at NASA Ames Research Center and NASA Goddard Space Flight Center. In addition to a description of the data products, this presentation will provide a comparison of the V3 Level 2.0 and V2 Level 2.0 AOD and inversion climatologies for sites with varying aerosol types.
Multiscale Interactive Communication: Inside and Outside Thun Castle
NASA Astrophysics Data System (ADS)
Massari, G. A.; Luce, F.; Pellegatta, C.
2011-09-01
The applications of informatics to architecture have become, for professionals, a great tool for managing analytical phases and project activities, and, for the general public, new ways of communication that can directly relate present, past and future facts. Museums in historic buildings, their installations and the recent experiences of eco-museums located throughout the territory provide a privileged field of experimentation for technical and digital representation. On the one hand, the safeguarding and functional adaptation of buildings use 3D computer graphics models that are in effect spatially related databases: in them the results of archival, artistic-historical, diagnostic and technological-structural studies, together with the assumptions and feasibility of interventions, are ordered, viewed and interpreted. On the other hand, the disclosure of things and knowledge linked to collective memory relies on interactive maps and hypertext systems that provide access to authentic virtual museums. At the architectural scale this produces a sort of multimedia extension of the exhibition hall, but at the landscape scale the result is an instrument of cultural development so far unseen: works that are separate in direct perception find in the zenith view of the map a synthetic relation, tied both to spatial parameters and to temporal interpretations.
Health and Wellness Technology Use by Historically Underserved Health Consumers: Systematic Review
Perchonok, Jennifer
2012-01-01
Background The implementation of health technology is a national priority in the United States and widely discussed in the literature. However, literature about the use of this technology by historically underserved populations is limited. Information on culturally informed health and wellness technology and the use of these technologies to reduce health disparities facing historically underserved populations in the United States is sparse in the literature. Objective To examine ways in which technology is being used by historically underserved populations to decrease health disparities through facilitating or improving health care access and health and wellness outcomes. Methods We conducted a systematic review in four library databases (PubMed, PsycINFO, Web of Science, and Engineering Village) to investigate the use of technology by historically underserved populations. Search strings consisted of three topics (eg, technology, historically underserved populations, and health). Results A total of 424 search phrases applied in the four databases returned 16,108 papers. After review, 125 papers met the selection criteria. Within the selected papers, 30 types of technology, 19 historically underserved groups, and 23 health issues were discussed. Further, almost half of the papers (62 papers) examined the use of technology to create effective and culturally informed interventions or educational tools. Finally, 12 evaluation techniques were used to assess the technology. Conclusions While the reviewed studies show how technology can be used to positively affect the health of historically underserved populations, the technology must be tailored toward the intended population, as personally relevant and contextually situated health technology is more likely than broader technology to create behavior changes. Social media, cell phones, and videotapes are types of technology that should be used more often in the future. Further, culturally informed health information technology should be used more for chronic diseases and disease management, as it is an innovative way to provide holistic care and reminders to otherwise underserved populations. Additionally, design processes should be stated regularly so that best practices can be created. Finally, the evaluation process should be standardized to create a benchmark for culturally informed health information technology. PMID:22652979
Introducing GFWED: The Global Fire Weather Database
NASA Technical Reports Server (NTRS)
Field, R. D.; Spessa, A. C.; Aziz, N. A.; Camia, A.; Cantin, A.; Carr, R.; de Groot, W. J.; Dowdy, A. J.; Flannigan, M. D.; Manomaiphiboon, K.;
2015-01-01
The Canadian Forest Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations, beginning in 1980, called the Global Fire WEather Database (GFWED), gridded at a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA), along with two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code calculations from the gridded data sets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Gridded and station-based calculations tended to differ most at low latitudes for the strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA-based DC over the Mato Grosso in Brazil reached unrealistically high values exceeding 1500 during the dry season, but was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation, and they reinforce the need to consider alternative sources of precipitation data. GFWED can be used for analyzing historical relationships between fire weather and fire activity at continental and global scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibrating FWI-based fire prediction models.
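For reference, the Drought Code that GFWED grids globally has a compact dry-day update. The sketch below follows the standard published form of the drying-phase equations (Van Wagner, 1987); the rain-phase reduction and the other FWI System components are omitted:

```python
# Day-length factors (Lf) for the Drought Code, northern hemisphere,
# January through December, from the published FWI System equations.
LF = [-1.6, -1.6, -1.6, 0.9, 3.8, 5.8, 6.4, 5.0, 2.4, 0.4, -1.6, -1.6]

def dc_dry_update(dc_prev, temp_c, month):
    """One dry-day Drought Code update: potential evapotranspiration from
    noon temperature (clamped at -2.8 C) plus the month's day-length
    factor, floored at zero."""
    pe = 0.36 * (max(temp_c, -2.8) + 2.8) + LF[month - 1]
    return dc_prev + 0.5 * max(pe, 0.0)

print(dc_dry_update(dc_prev=250.0, temp_c=22.5, month=7))  # a mid-July dry day
```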
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, A.; Lopez, A.; Sengupta, M.
Typical Meteorological Year (TMY) data sets provide industry-standard resource information for building designers and are commonly used by the solar industry to estimate photovoltaic and concentrating solar power system performance. Historically, TMY data sets were only available for certain station locations, but current TMY data sets are available on the same grid as the National Solar Radiation Database data and are referred to as the gridded TMY. In this report, a comparison of TMY, typical direct (normal irradiance) year (TDY), and typical global (horizontal irradiance) year (TGY) data sets was performed to better understand the impact of ancillary weather variables upon them. These analyses identified geographical areas of high and low temporal and spatial variability, thereby providing insight into the representativeness of a particular TMY data set for use in renewable energy as well as other applications.
Software Application for Supporting the Education of Database Systems
ERIC Educational Resources Information Center
Vágner, Anikó
2015-01-01
The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…
Environmental applications based on GIS and GRID technologies
NASA Astrophysics Data System (ADS)
Demontis, R.; Lorrai, E.; Marrone, V. A.; Muscas, L.; Spanu, V.; Vacca, A.; Valera, P.
2009-04-01
In the last decades, the collection and use of environmental data have increased enormously in a wide range of applications. Simultaneously, the explosive development of information technology and ever wider data accessibility have made it possible to store and manipulate huge quantities of data. In this context, the GRID approach is emerging worldwide as a tool for provisioning a computational task with administratively-distant resources. The aim of this paper is to present three environmental applications (Land Suitability, Desertification Risk Assessment, Georesources and Environmental Geochemistry) foreseen within the AGISGRID (Access and query of a distributed GIS/Database within the GRID infrastructure, http://grida3.crs4.it/enginframe/agisgrid/index.xml) activities of the GRIDA3 (Administrator of sharing resources for data analysis and environmental applications, http://grida3.crs4.it) project. This project, co-funded by the Italian Ministry of Research, is based on the use of shared environmental data through GRID technologies, accessible via a web interface and aimed at public and private users in the field of environmental management and land use planning. The technologies used for AGISGRID include: the client-server middleware iRODS (Integrated Rule-Oriented Data System) (https://irods.org); the EnginFrame system (http://www.nice-italy.com/main/index.php?id=32), the grid portal that supplies a frame to make the developed GRID applications available via Intranet/Internet; the GIS software GRASS (Geographic Resources Analysis Support System) (http://grass.itc.it); the relational database PostgreSQL (http://www.posgresql.org) and the spatial database extension PostGIS; and the open source, multiplatform MapServer (http://mapserver.gis.umn.edu), used to represent the geospatial data through typical web GIS functionalities. Three GRID nodes are directly involved in the applications: the application workflow is implemented at CRS4 (Pula, southern Sardinia, Italy), the soil database is managed at the DISTER node (Cagliari, southern Sardinia, Italy), and the geochemical database is managed at the DIGITA node (Cagliari, southern Sardinia, Italy). The input data are files (ASCII raster format) and database tables. The raster files have been zipped and stored in iRODS; the tables are imported into a PostgreSQL database and accessed by the Rule-oriented Database Access (RDA) system available for PostgreSQL in iRODS 1.1. From the EnginFrame portal it is possible to view and use the applications through three services: "Upload Data", "View Data and Metadata", and "Execute Application". The Land Suitability application, based on the FAO framework for land evaluation, produces suitability maps (at the 1:10,000 scale) for 11 different possible alternative uses. The maps, in ASCII raster format, can be downloaded by the user and viewed with MapServer. This application has been implemented in an area of southern Sardinia (Monastir) and may be useful to direct municipal urban planning towards rational land use. The Desertification Risk Assessment application produces, by means of biophysical and socioeconomic key indicators, a final combined map showing critical, fragile, and potential Environmentally Sensitive Areas to desertification. This application has been implemented in an area of south-west Sardinia (Muravera).
The final sensitivity index is obtained as the geometric mean of four component indices: SQI (Soil Quality Index), CQI (Climate Quality Index), VQI (Vegetation Quality Index) and MQI (Management Quality Index). The final result, ESAs = (SQI * CQI * VQI * MQI)^(1/4), is a map at the 1:50,000 scale, in ASCII raster format, that can be downloaded by the user and viewed with MapServer. This type of map may be useful for directing land planning at the catchment basin level. The Georesources and Environmental Geochemistry application, whose test is in progress in the area of Muravera (south-west Sardinia) through stream sediment sampling, aims at producing maps that define, with high precision, areas (hydrographic basins) where the values of a given element exceed the lithological background (i.e., are geochemically anomalous). Such a product has a double purpose. First, it identifies releasing sources and may be useful for the necessary remediation actions if they lie in areas historically subject to more or less intense anthropic activity. On the other hand, if these sources are of natural origin, they could also be interpreted as ore mineral occurrences. In the latter case, the study of these occurrences could lead to the discovery of economic ore bodies of small-to-medium size (at least in the present target area) and consequently to the revival of a local mining industry.
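The ESA index itself is a one-line computation once the four quality-index rasters exist. A minimal sketch, assuming co-registered arrays with the usual MEDALUS-style index values (roughly 1 to 2):

```python
import numpy as np

def esas(sqi, cqi, vqi, mqi):
    """Environmentally Sensitive Area index: the geometric mean
    ESAs = (SQI * CQI * VQI * MQI) ** (1/4)."""
    return (sqi * cqi * vqi * mqi) ** 0.25

# Works on scalars or on co-registered raster arrays alike:
sqi = np.array([[1.2, 1.4], [1.6, 1.3]])
cqi = np.array([[1.1, 1.3], [1.5, 1.2]])
vqi = np.array([[1.0, 1.5], [1.4, 1.1]])
mqi = np.array([[1.3, 1.2], [1.6, 1.0]])
print(esas(sqi, cqi, vqi, mqi))
```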
Improved Information Retrieval Performance on SQL Database Using Data Adapter
NASA Astrophysics Data System (ADS)
Husni, M.; Djanali, S.; Ciptaningtyas, H. T.; Wicaksana, I. G. N. A.
2018-02-01
NoSQL databases, short for Not Only SQL, are increasingly being used as the number of big data applications grows. Most systems still use relational databases (RDBs), but as data volumes increase each year, systems increasingly handle big data with NoSQL databases to analyze and access data more quickly. NoSQL emerged as a result of the exponential growth of the internet and the development of web applications. Query syntax in a NoSQL database differs from that of an SQL database, normally requiring code changes in the application. A data adapter allows applications to keep their SQL query syntax unchanged: it provides methods that synchronize the SQL database with the NoSQL database, as well as an interface through which applications can run SQL queries. Hence, this research applied a data adapter system to synchronize data between a MySQL database and Apache HBase using a direct access query approach, in which the system allows the application to issue queries while the synchronization process is in progress. The tests performed show that the data adapter can synchronize between the SQL database, MySQL, and the NoSQL database, Apache HBase. The system's memory usage stayed in the range of 40% to 60%, with processor utilization moving between 10% and 90%. The tests also showed the NoSQL database outperforming the SQL database.
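The adapter pattern described here can be sketched at a high level: the application keeps issuing SQL, reads go straight to the relational database, and writes are queued for mirroring into the NoSQL store so queries remain serviceable during synchronization. Everything below is a hypothetical skeleton; the class, method names, and the commented HBase call are illustrative, not the paper's implementation:

```python
import queue
import threading

class DataAdapter:
    """Hypothetical skeleton: unchanged SQL for the application, with
    writes mirrored asynchronously to a NoSQL store."""

    def __init__(self, mysql_conn, hbase_table):
        self.sql = mysql_conn        # e.g. a mysql.connector connection
        self.hbase = hbase_table     # e.g. a happybase Table
        self.pending = queue.Queue() # writes awaiting NoSQL mirroring
        threading.Thread(target=self._sync_worker, daemon=True).start()

    def execute(self, query, params=()):
        """Application-facing entry point: plain SQL in, cursor out."""
        cur = self.sql.cursor()
        cur.execute(query, params)
        if not query.lstrip().upper().startswith("SELECT"):
            self.sql.commit()
            self.pending.put((query, params))  # schedule the mirror write
        return cur

    def _sync_worker(self):
        while True:
            query, params = self.pending.get()
            # Translate the SQL write into the NoSQL equivalent here,
            # e.g. self.hbase.put(row_key, {b"cf:col": value})  (sketch only)
            self.pending.task_done()
```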
Newspaper archives + text mining = rich sources of historical geo-spatial data
NASA Astrophysics Data System (ADS)
Yzaguirre, A.; Smit, M.; Warren, R.
2016-04-01
Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
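A first-pass extraction step of the kind the pilot study implies can be illustrated with simple patterns: flag sentences that pair a flooding term with a date-like expression, then review manually. A minimal sketch; the patterns are illustrative and far cruder than a production text-mining pipeline:

```python
import re

FLOOD = re.compile(r"\b(flood(?:ed|ing)?|inundat\w+|high water)\b", re.I)
DATE = re.compile(
    r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]*\.?"
    r"\s+\d{1,2},?\s+\d{4}\b")

def candidate_flood_events(article_text):
    """Yield (sentence, dates) pairs worth a human look."""
    for sentence in re.split(r"(?<=[.!?])\s+", article_text):
        if FLOOD.search(sentence):
            yield sentence, DATE.findall(sentence)

text = "The Saint John River flooded on April 30, 1973, cresting downtown."
print(list(candidate_flood_events(text)))
```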
Shared Web Information Systems for Heritage in Scotland and Wales - Flexibility in Partnership
NASA Astrophysics Data System (ADS)
Thomas, D.; McKeague, P.
2013-07-01
The Royal Commissions on the Ancient and Historical Monuments of Scotland and Wales were established in 1908 to investigate and record the archaeological and built heritage of their respective countries. The organisations have grown organically over the succeeding century, steadily developing their inventories and collections as card and paper indexes. Computerisation followed in the late 1980s and early 1990s, with RCAHMS releasing Canmore, an online searchable database, in 1998. Following a review of service provision in Wales, RCAHMW entered into partnership with RCAHMS in 2003 to deliver a database for their national inventories and collections. The resultant partnership enables both organisations to develop at their own pace whilst delivering efficiencies through a common experience and a shared IT infrastructure. Through innovative solutions the partnership has also delivered benefits to the wider historic environment community, providing online portals to a range of datasets, ultimately raising public awareness and appreciation of the heritage around them. Now celebrating its 10th year, Shared Web Information Systems for Heritage, or more simply SWISH, continues to underpin the work of both organisations in presenting information about the historic environment to the public.
77 FR 66617 - HIT Policy and Standards Committees; Workgroup Application Database
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... Database AGENCY: Office of the National Coordinator for Health Information Technology, HHS. ACTION: Notice of New ONC HIT FACA Workgroup Application Database. The Office of the National Coordinator (ONC) has launched a new Health Information Technology Federal Advisory Committee Workgroup Application Database...
Geolocation of man-made reservoirs across terrains of varying complexity using GIS
NASA Astrophysics Data System (ADS)
Mixon, David M.; Kinner, David A.; Stallard, Robert F.; Syvitski, James P. M.
2008-10-01
The Reservoir Sedimentation Survey Information System (RESIS) is one of the world's most comprehensive databases of reservoir sedimentation rates, comprising nearly 6000 surveys for 1819 reservoirs across the continental United States. Sediment surveys in the database date from 1904 to 1999, though more than 95% of surveys were entered prior to 1980, making RESIS largely a historical database. The use of this database for large-scale studies has been limited by the lack of precise coordinates for the reservoirs. Many of the reservoirs are relatively small structures and do not appear on current USGS topographic maps. Others have been renamed or have only approximate (i.e. township and range) coordinates. This paper presents a method scripted in ESRI's ARC Macro Language (AML) to locate the reservoirs on digital elevation models using information available in RESIS. The script also delineates the contributing watersheds and compiles several hydrologically important parameters for each reservoir. Evaluation of the method indicates that, for watersheds larger than 5 km², the correct outlet is identified over 80% of the time. The importance of identifying the watershed outlet correctly depends on the application. Our intent is to collect spatial data for watersheds across the continental United States and describe the land use, soils, and topography for each reservoir's watershed. Because of local landscape similarity in these properties, we show that choosing the incorrect watershed does not necessarily mean that the watershed characteristics will be misrepresented. We present a measure termed terrain complexity and examine its relationship to geolocation success rate and its influence on the similarity of nearby watersheds.
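One plausible core step of such geolocation, shown as a hedged sketch: given an approximate coordinate from RESIS, snap to the cell of highest flow accumulation nearby, a common proxy for the dam outlet on a DEM-derived grid. The original work scripted its logic in AML; the function and variable names here are illustrative:

```python
import numpy as np

def snap_to_outlet(flow_acc, row, col, radius=5):
    """Move an approximate reservoir location to the cell with the highest
    flow accumulation within `radius` cells of (row, col)."""
    r0, r1 = max(0, row - radius), min(flow_acc.shape[0], row + radius + 1)
    c0, c1 = max(0, col - radius), min(flow_acc.shape[1], col + radius + 1)
    window = flow_acc[r0:r1, c0:c1]
    dr, dc = np.unravel_index(int(np.argmax(window)), window.shape)
    return r0 + dr, c0 + dc

acc = np.random.randint(0, 1000, (100, 100))  # stand-in flow-accumulation grid
print(snap_to_outlet(acc, 50, 50))
```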
NASA Astrophysics Data System (ADS)
Dunbar, P. K.; Weaver, C.
2007-12-01
In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment; its goal is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used. The first involves a careful examination of the NGDC historical tsunami database, which resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the qualitative assessments based on frequency and amplitude. The second method involved using the USGS earthquake databases to search for possible earthquake sources near American coastlines, extending the NOAA/NGDC tsunami databases backward in time. The qualitative tsunami hazard assessment based on the results of the NGDC and USGS database searches will be presented.
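The first method, ranking coastlines by runup frequency and amplitude, reduces to a simple aggregation over the historical database. A minimal sketch with made-up rows and illustrative thresholds; the report's actual classification criteria are not reproduced here:

```python
import pandas as pd

runups = pd.DataFrame({                 # stand-in for NGDC runup records
    "region":  ["Alaska", "Alaska", "East Coast", "Hawaii", "Hawaii"],
    "runup_m": [9.0, 2.5, 0.3, 4.0, 1.2],
})

def hazard_class(n_events, max_runup):
    """Toy qualitative classes from event count and maximum runup."""
    if n_events >= 2 and max_runup >= 5.0:
        return "high"
    if n_events >= 2 or max_runup >= 1.0:
        return "moderate"
    return "low"

summary = runups.groupby("region")["runup_m"].agg(n="count", max_runup="max")
summary["hazard"] = [hazard_class(r.n, r.max_runup) for r in summary.itertuples()]
print(summary)
```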
Long term volcanic hazard analysis in the Canary Islands
NASA Astrophysics Data System (ADS)
Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.
2009-04-01
Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large numbers of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies have mainly focused on developing hazard maps for Lanzarote and Tenerife, especially for land use planning. The main handicap for such studies in the Canary Islands is the lack of well-reported historical eruptions, but also the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with a large amount of data that sometimes, as in the case of the Canary Islands, is not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000-scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, and (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information for long-term volcanic hazard analysis. HADA will provide enough quality information to map volcanic hazards and to run more reliable volcanic hazard models; in addition, it aims to become a sharing system, improving communication between researchers, reducing redundant work, and serving as the reference for geological research in the Canary Islands.
NASA Astrophysics Data System (ADS)
Brady, M.
2016-12-01
During his historic trip to Alaska in 2015, U.S. President Barack Obama announced a collaborative effort to update maps of the Arctic region in anticipation of increased maritime access and resource development and to support climate resilience. Included in this effort is the development of an Arctic-wide satellite-based digital elevation model (DEM) to provide a baseline for monitoring landscape change such as coastal erosion. Focusing on Alaska's North Slope, an objective of this study is to transform emerging Arctic environment spatial data products, including the new DEM, into information that can support local-level planning and decision-making in the face of extreme coastal erosion and related environmental threats. In pursuit of this, four workshops were held in 2016 in three North Slope villages highly exposed to coastal erosion. The first workshop, with approximately 10 managers in Barrow, solicited feedback on an erosion risk database developed in a previous research stage and installed on the North Slope's planning Web portal. The database includes a physical risk indicator based on factors such as historical erosion and the effects of sea ice loss, summarized at asset locations. After a demonstration of the database, participants discussed usability aspects such as data reliability. The focus of the mapping workshops in Barrow and the two smaller villages of Wainwright and Kaktovik was to verify and expand the risk database by interactively mapping erosion observations and community asset impacts. Using coded stickers and paper maps of the shoreline showing USGS erosion rates, a total of 50 participants provided feedback on erosion data accuracy. Approximately 25 of the 50 participants were elders and hunters, who also provided in-depth community risk information. The workshop with managers confirmed the physical risk factors used in the risk database and revealed that the information may be relied upon to support some development decisions and to better engage developers about erosion risks. Results from the three mapping workshops revealed that most participants agree that the USGS data are consistent with their observations. In-depth contributions from elders and hunters also confirmed the need to monitor the loss of specific assets, including hunting grounds and historic places, and the associated community impacts.
S-Ethyl dipropylthiocarbamate (EPTC)
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) S-Ethyl dipropylth
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential.The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska.For this report, DGGS funded reanalysis of 105 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Zane Hills area in the Hughes and Shungnak quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Werdon, Melanie B.; Azain, Jaime S.; Granitto, Matthew
2014-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. For the geochemical part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 1,682 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from an area covering the western half of the Wrangellia Terrane in the Anchorage, Gulkana, Healy, Mt. Hayes, Nabesna, and Talkeetna Mountains quadrangles of south-central Alaska (fig. 1). USGS was responsible for sample retrieval from the Denver warehouse through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 302 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Kougarok River drainage as well as smaller adjacent drainages in the Bendeleben and Teller quadrangles, Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally, some samples were chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 670 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the northeastern Alaska Range, in the Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
Sandoval, Hugo; Pérez-Neri, Iván; Martínez-Flores, Francisco; Valle-Cabrera, Martha Griselda Del; Pineda, Carlos
2017-01-01
Some interpretations frequently argue that three Disability Models (DM) (Charity, Medical/Rehabilitation, and Social) correspond to historical periods in terms of chronological succession. These views are taken for granted, a priori, in major official documents on the subject in Mexico. This paper tests whether this association is plausible by applying a timeline method. A document search was made, with inclusion and exclusion criteria, in databases to select representative studies with which to depict milestones in the timelines for each period. The following is demonstrated: 1) the models should be considered categories of analysis rather than historical periods, in that elements of all three models remain prevalent to date, and 2) associating disability models with historical periods results in teleological interpretations of the history of disability in Mexico.
NASA Astrophysics Data System (ADS)
Yan, Zheng; Mingzhong, Tian; Hengli, Wang
2010-05-01
Chinese hand-written local records originated in the first century. Generally, these local records cover the geography, evolution, customs, education, products, people, historical sites, and writings of an area. Thanks to such endeavors, information on China's natural environment has had nearly no "dark ages" over the 5000-year evolution of its civilization. A compilation of all meaningful historical data on natural disasters that took place in Alxa, Inner Mongolia, the second largest desert in China, is used here to construct a 500-year, high-resolution database. The database is divided into subsets according to the type of natural disaster, such as sand-dust storms, drought events, and cold waves. By applying trend, correlation, wavelet, and spectral analysis to these data, we can estimate the statistical periodicity of different natural disasters, detect and quantify similarities and patterns among the periodicities of these records, and finally take these results in aggregate to find a strong and coherent cyclicity through the last 500 years that serves as the driving mechanism of these geological hazards. Based on the periodicity obtained from this analysis, the paper discusses the possibility of forecasting natural disasters and suitable measures to reduce disaster losses using historical records. Keywords: Chinese local records; Alxa; natural disasters; database; periodicity analysis
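The periodicity analysis summarized above lends itself to a compact illustration. Below is a minimal Python sketch, assuming a synthetic annual event-count series in place of the actual Alxa data, of how dominant cycles can be read off a simple FFT periodogram; the variable names and placeholder data are ours, not the authors'.

# Hypothetical sketch: estimating dominant periodicities of an annual
# natural-disaster count series with a simple FFT periodogram.
# 'counts' stands in for the 500-year Alxa event counts, not reproduced here.
import numpy as np

years = np.arange(1500, 2000)                      # 500 annual bins
rng = np.random.default_rng(0)
counts = rng.poisson(2, years.size).astype(float)  # placeholder data

x = counts - counts.mean()                         # remove the mean (detrending step omitted)
power = np.abs(np.fft.rfft(x)) ** 2                # periodogram
freqs = np.fft.rfftfreq(x.size, d=1.0)             # cycles per year

# Report the strongest cycles, skipping the zero-frequency term.
top = np.argsort(power[1:])[::-1][:3] + 1
for k in top:
    print(f"period ~ {1.0 / freqs[k]:.1f} years, power = {power[k]:.1f}")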
The plant phenological online database (PPODB): an online database for long-term phenological data
NASA Astrophysics Data System (ADS)
Dierenbach, Jonas; Badeck, Franz-W.; Schaber, Jörg
2013-09-01
We present an online database that provides unrestricted and free access to over 16 million plant phenological observations from over 8,000 stations in Central Europe between the years 1880 and 2009. Unique features are (1) a flexible and unrestricted access to a full-fledged database, allowing for a wide range of individual queries and data retrieval, (2) historical data for Germany before 1951 ranging back to 1880, and (3) more than 480 curated long-term time series covering more than 100 years for individual phenological phases and plants combined over Natural Regions in Germany. Time series for single stations or Natural Regions can be accessed through a user-friendly graphical geo-referenced interface. The joint databases made available with the plant phenological database PPODB render accessible an important data source for further analyses of long-term changes in phenology. The database can be accessed via
An expanded mammal mitogenome dataset from Southeast Asia
Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Sinding, Mikkel-Holger S.; Patel, Riddhi P.; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Lim, Burton K.; Rossiter, Stephen J.; Wilting, Andreas
2017-01-01
Abstract Southeast (SE) Asia is one of the most biodiverse regions in the world, holding approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created an urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From these data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 and ×55.2 for modern and historical samples, respectively. This dataset represents 52 species, of which 30 had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia, as it greatly increases the likelihood that metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in western and SE Asian museums should serve as additional valuable material to further enrich this reference database. PMID:28873965
An expanded mammal mitogenome dataset from Southeast Asia.
Mohd Salleh, Faezah; Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Sinding, Mikkel-Holger S.; Patel, Riddhi P.; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Lim, Burton K.; Rossiter, Stephen J.; Wilting, Andreas; Gilbert, M. Thomas P.
2017-08-01
Southeast (SE) Asia is one of the most biodiverse regions in the world, holding approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created an urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From these data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 and ×55.2 for modern and historical samples, respectively. This dataset represents 52 species, of which 30 had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia, as it greatly increases the likelihood that metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in western and SE Asian museums should serve as additional valuable material to further enrich this reference database. © The Author 2017. Published by Oxford University Press.
Encouraging Historical Thinking at Historic Sites
ERIC Educational Resources Information Center
Baron, Christine
2010-01-01
This study seeks to contribute to our understanding of the problem of effectively encouraging historical thinking by (a) evaluating and modifying Wineburg's heuristics for historical thinking for applicability to the problem-solving activities historians use at historic sites; (b) establishing the efficacy of a hypermedia-based education program…
An improved database of coastal flooding in the United Kingdom from 1915 to 2016
Haigh, Ivan D.; Ozsoy, Ozgun; Wadey, Matthew P.; Nicholls, Robert J.; Gallop, Shari L.; Wahl, Thomas; Brown, Jennifer M.
2017-01-01
Coastal flooding caused by extreme sea levels can produce devastating and wide-ranging consequences. The ‘SurgeWatch’ v1.0 database systematically documents and assesses the consequences of historical coastal flood events around the UK. The original database was inevitably biased due to the inconsistent spatial and temporal coverage of sea-level observations utilised. Therefore, we present an improved version integrating a variety of ‘soft’ data such as journal papers, newspapers, weather reports, and social media. SurgeWatch v2.0 identifies 329 coastal flooding events from 1915 to 2016, a more than fivefold increase compared to the 59 events in v1.0. Moreover, each flood event is now ranked using a multi-level categorisation based on inundation, transport disruption, costs, and fatalities: from 1 (Nuisance) to 6 (Disaster). For the 53 most severe events ranked Category 3 and above, an accompanying event description based upon the Source-Pathway-Receptor-Consequence framework was produced. Thus, SurgeWatch v2.0 provides the most comprehensive and coherent historical record of UK coastal flooding. It is designed to be a resource for research, planning, management and education. PMID:28763054
Database of historically documented springs and spring flow measurements in Texas
Heitmuller, Franklin T.; Reece, Brian D.
2003-01-01
Springs are naturally occurring features that convey excess ground water to the land surface; they represent a transition from ground water to surface water. Water issues through one opening, multiple openings, or numerous seeps in the rock or soil. The database of this report provides information about springs and spring flow in Texas including spring names, identification numbers, location, and, if available, water source and use. This database does not include every spring in Texas, but is limited to an aggregation of selected digital and hard-copy data of the U.S. Geological Survey (USGS), the Texas Water Development Board (TWDB), and Capitol Environmental Services.
The Admissions Office Goes Scientific.
ERIC Educational Resources Information Center
Bryant, Peter; Crockett, Kevin
1993-01-01
Data-based planning and management is revolutionizing college student recruitment. Data analysis focuses on historical trends, marketing and recruiting strategies, cost-effectiveness strategy, and markets. Data sources include primary market demographics, geo-demographics, secondary sources, student price response information, and institutional…
Classification of Chemicals Based On Structured Toxicity Information
Thirty years and millions of dollars' worth of pesticide registration toxicity studies, historically stored as hardcopy and scanned documents, have been digitized into highly standardized and structured toxicity data within the Toxicity Reference Database (ToxRefDB). Toxicity-bas...
A Brief Historical Introduction to Determinants with Applications
ERIC Educational Resources Information Center
Debnath, L.
2013-01-01
This article deals with a short historical introduction to determinants with applications to the theory of equations, geometry, multiple integrals, differential equations and linear algebra. Included are some properties of determinants with proofs, eigenvalues, eigenvectors and characteristic equations with examples of applications to simple…
A web-based approach for electrocardiogram monitoring in the home.
Magrabi, F; Lovell, N H; Celler, B G
1999-05-01
A Web-based electrocardiogram (ECG) monitoring service, in which a longitudinal clinical record is used for the management of patients, is described. The Web application is used to collect clinical data from the patient's home. A database on the server acts as a central repository where this clinical information is stored. A Web browser provides access to the patient's records and ECG data. We discuss the technologies used to automate the retrieval and storage of clinical data from a patient database, and the recording and reviewing of clinical measurement data. On the client's Web browser, ActiveX controls embedded in the Web pages provide a link between the various components, including the Web server, the Web page, the specialised client-side ECG review and acquisition software, and the local file system. The ActiveX controls also implement FTP functions to retrieve and submit clinical data to and from the server. An intelligent software agent on the server is activated whenever new ECG data are sent from the home. The agent compares historical data with newly acquired data. Using this method, an optimum patient care strategy can be evaluated, and a summarised report, along with reminders and suggestions for action, is sent to the doctor and patient by email.
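The intelligent-agent step described above (comparing newly acquired ECG data with historical data before emailing a summary) can be sketched compactly. The following Python fragment is a hypothetical illustration only, with invented field names and a simple z-score rule; the original system's agent logic, ActiveX components, and email delivery are not reproduced here.

# Minimal sketch (not the authors' implementation) of a server-side agent
# comparing a newly uploaded ECG summary against the patient's stored history.
# All field names and thresholds are illustrative assumptions.
from statistics import mean, stdev

def review_new_record(history, new, z_limit=2.0):
    """history: list of dicts of past measurements; new: dict for today."""
    alerts = []
    for key, value in new.items():
        past = [h[key] for h in history if key in h]
        if len(past) < 3:          # not enough history to judge
            continue
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and abs(value - mu) / sigma > z_limit:
            alerts.append(f"{key}: {value} deviates from historical mean {mu:.1f}")
    return alerts

history = [{"heart_rate": 72}, {"heart_rate": 75}, {"heart_rate": 70}, {"heart_rate": 74}]
print(review_new_record(history, {"heart_rate": 95}) or "no action needed")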
Implementation of a data management software system for SSME test history data
NASA Technical Reports Server (NTRS)
Abernethy, Kenneth
1986-01-01
The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.
Generalized Database Management System Support for Numeric Database Environments.
ERIC Educational Resources Information Center
Dominick, Wayne D.; Weathers, Peggy G.
1982-01-01
This overview of potential for utilizing database management systems (DBMS) within numeric database environments highlights: (1) major features, functions, and characteristics of DBMS; (2) applicability to numeric database environment needs and user needs; (3) current applications of DBMS technology; and (4) research-oriented and…
Summary of recovered historical ground-water-level data for Michigan, 1934-2005
Cornett, Cassaundra L.; Crowley, Suzanne L.; McGowan, Rose M.; Blumer, Stephen P.; Reeves, Howard W.
2006-01-01
This report documents ground-water-level data-recovery efforts performed by the USGS Michigan Water Science Center and provides nearly three hundred hydrographs generated from these recovered data. Data recovery is the process of verifying and transcribing data from paper files into the USGS National Water Information System (NWIS) electronic databases appropriate for ground-water-level data. Entering these data into the NWIS databases makes them more useful for USGS analysis and also makes them available to the public through the internet.
A Tactical Framework for Cyberspace Situational Awareness
2010-06-01
[Flattened table residue; recoverable content: ranked lists of cyber assets by mission area. Command & Control: 1. VOIP telephone, 2. Internet chat, 3. Web app (TBMCS), 4. Email, 5. Web app (PEX), 6. Database (CAMS), 7. Database (ARMS), 8. Database (LogMod), 9. Resource (WWW), 10. Application (PFPS). Mission Planning: 1. Application (PFPS), 2. Email, 3. Web app (TBMCS), 4. Internet chat, … A further list (truncated): 1. Web app (PEX), 2. Database (ARMS), 3. Web app (TBMCS), 4. Email, 5. Database (CAMS), 6. VOIP telephone, 7. Application (PFPS), 8. Internet chat, …]
The future application of GML database in GIS
NASA Astrophysics Data System (ADS)
Deng, Yuejin; Cheng, Yushu; Jing, Lianwen
2006-10-01
In 2004, the Geography Markup Language (GML) Implementation Specification (version 3.1.1) was published by Open Geospatial Consortium, Inc. Now more and more applications in geospatial data sharing and interoperability depend on GML. The primary purpose of designing GML is for exchange and transportation of geo-information by standard modeling and encoding of geography phenomena. However, the problems of how to organize and access lots of GML data effectively arise in applications. The research on GML database focuses on these problems. The effective storage of GML data is a hot topic in GIS communities today. GML Database Management System (GDBMS) mainly deals with the problem of storage and management of GML data. Now two types of XML database, namely Native XML Database, and XML-Enabled Database are classified. Since GML is an application of the XML standard to geographic data, the XML database system can also be used for the management of GML. In this paper, we review the status of the art of XML database, including storage, index and query languages, management systems and so on, then move on to the GML database. At the end, the future prospect of GML database in GIS application is presented.
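As a concrete, hedged illustration of the "XML-enabled database" option mentioned above, the Python sketch below parses a tiny invented GML fragment and stores both the extracted coordinates and the raw XML relationally; real GDBMS implementations are far more involved.

# Illustrative sketch of the XML-enabled route: GML features parsed and their
# geometry stored relationally in SQLite. The GML fragment and the single-table
# schema are invented for the example.
import sqlite3
import xml.etree.ElementTree as ET

GML = "{http://www.opengis.net/gml}"
doc = """<gml:Point xmlns:gml="http://www.opengis.net/gml" gml:id="p1">
           <gml:pos>30.0 120.0</gml:pos>
         </gml:Point>"""

root = ET.fromstring(doc)
lat, lon = map(float, root.find(f"{GML}pos").text.split())

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE feature (gml_id TEXT PRIMARY KEY, lat REAL, lon REAL, raw_xml TEXT)")
con.execute("INSERT INTO feature VALUES (?, ?, ?, ?)",
            (root.get(f"{GML}id"), lat, lon, doc))   # keep the source XML too
print(con.execute("SELECT gml_id, lat, lon FROM feature").fetchall())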
4-(2-Methyl-4-chlorophenoxy) butyric acid (MCPB)
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) 4-(2-Methyl-4
The Better Mousetrap...Can Be Built by Engineers.
ERIC Educational Resources Information Center
McBride, Matthew
2003-01-01
Describes the growth of the INSPEC database developed by the Institution of Electrical Engineers. Highlights include an historical background of its growth from "Science Abstracts"; production methods, including computerization; indexing, including controlled (thesaurus-based), uncontrolled, chemical, and numerical indexing; and the…
International exploration of Mars. A special bibliography
NASA Technical Reports Server (NTRS)
1991-01-01
This bibliography lists 173 reports, articles, and other documents introduced into the NASA Scientific and Technical Information Database on the exploration of Mars. Historical references are cited for background. The bibliography was created for the 1991 session of the International Space University.
Applications of Database Machines in Library Systems.
ERIC Educational Resources Information Center
Salmon, Stephen R.
1984-01-01
Characteristics and advantages of database machines are summarized and their applications to library functions are described. The ability to attach multiple hosts to the same database and flexibility in choosing operating and database management systems for different functions without loss of access to a common database are noted. (EJS)
ERIC Educational Resources Information Center
Forrest, Melanie D.
This curriculum guide is intended for Missouri teachers teaching a course in database applications for high school students enrolled in marketing and cooperative education. The curriculum presented includes learning activities in which students are taught to analyze database tables containing the types of data typically encountered by employees…
Applications of GIS and database technologies to manage a Karst Feature Database
Gao, Y.; Tipping, R.G.; Alexander, E.C.
2006-01-01
This paper describes the management of a Karst Feature Database (KFD) in Minnesota. Two sets of applications in both GIS and Database Management System (DBMS) have been developed for the KFD of Minnesota. These applications were used to manage and to enhance the usability of the KFD. Structured Query Language (SQL) was used to manipulate transactions of the database and to facilitate the functionality of the user interfaces. The Database Administrator (DBA) authorized users with different access permissions to enhance the security of the database. Database consistency and recovery are accomplished by creating data logs and maintaining backups on a regular basis. The working database provides guidelines and management tools for future studies of karst features in Minnesota. The methodology of designing this DBMS is applicable to develop GIS-based databases to analyze and manage geomorphic and hydrologic datasets at both regional and local scales. The short-term goal of this research is to develop a regional KFD for the Upper Mississippi Valley Karst and the long-term goal is to expand this database to manage and study karst features at national and global scales.
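A minimal sketch of the SQL transaction handling the abstract describes, using Python's sqlite3 and an invented two-table karst schema (the actual KFD schema is not reproduced here):

# Hedged sketch: one transaction inserts a feature and its survey record
# together, so a failure leaves the database consistent. Schema is invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE feature (id INTEGER PRIMARY KEY, type TEXT, county TEXT);
    CREATE TABLE survey  (feature_id INTEGER REFERENCES feature(id),
                          visited TEXT, notes TEXT);
""")

try:
    with con:  # one transaction: both inserts commit together or not at all
        cur = con.execute("INSERT INTO feature (type, county) VALUES (?, ?)",
                          ("sinkhole", "Fillmore"))
        con.execute("INSERT INTO survey VALUES (?, ?, ?)",
                    (cur.lastrowid, "1998-06-01", "partially filled"))
except sqlite3.Error as exc:
    print("transaction rolled back:", exc)

print(con.execute("""SELECT f.type, f.county, s.visited
                     FROM feature f JOIN survey s ON s.feature_id = f.id""").fetchall())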
Historical sources on climate and extreme events before XX century in Calabria (Italy)
NASA Astrophysics Data System (ADS)
Aurora Pasqua, Angela; Petrucci, Olga
2014-05-01
Damaging Hydrogeological Events (DHEs) are defined as the occurrence of destructive phenomena, such as landslides and floods, triggered by extreme rain events. Because of the huge damage they can cause to people and property, DHEs are often described in a wide range of historical sources. The historical series of DHEs that affected a study region can supply useful information about the climatic trend of the area. Moreover, it can reveal temporal and spatial increases in vulnerability in sectors where urbanization increased over time. It can also highlight further vulnerability variations that occurred over the decades, related to specific defensive measures undertaken (or abandoned) to prevent damage caused by either landslides or floods. We present the historical series of catastrophic DHEs that affected Calabria, a Mediterranean region located in southern Italy. The data presented come from the database named ASICal (the Italian acronym for historically flooded areas in Calabria), which was built at the beginning of 2000 at CNR-IRPI in Cosenza and has been continuously updated since then. Currently, this database includes more than 11,000 records about floods and landslides that have occurred in Calabria since the 16th century. These data come from different information sources, such as newspapers, archives of regional and national agencies, scientific and technical reports, and on-site survey reports. ASICal is constantly updated; the updating concerns both the current DHEs that affect the region every year and the results of specific historical research that we regularly perform to fill data gaps for older epochs. In this work we present the results of a recent survey carried out in regional public libraries, focusing on the early-to-mid 19th century. The types of data sources available for the regional framework are described, and a sketch of the DHE trend during the last three centuries is presented. Moreover, an overview of both proxy data and irregularly measured parameters concerning the climatic trend of the region, obtained from the analyzed historical sources, is also given.
Teaching Geography and History through GIS: Application on Greek cultural sites
NASA Astrophysics Data System (ADS)
Skentos, Athanasios; Pavlopoulos, Kosmas; Galani, Apostolia; Theodorakopoulou, Katerina; Kritikos, Giorgos
2013-04-01
This study deals with the presentation of cultural succession in Greek space-time through a GIS application associated with core concepts of geographic and historical education. Through this application, students will be able to develop five distinct skills: a sense of time-scale, historical and geographic comprehension, spatial analysis and interpretation, the ability to perform geo-historical research, and geo-historical decision-making. The methodology is based on the calibration of a set of criteria for each cultural site covering the topics of economy, geomorphology, society, religion, art, and science. Further analysis of these data forms a geodatabase. In addition, palaeogeographic and historical maps of the cultural sites derived from the geodatabase provide information about temporal and spatial changes. As a result, students will be able to develop a multidimensional and interdisciplinary approach in order to reconstruct the evolution of each site.
NASA Astrophysics Data System (ADS)
Kelbert, A.; Blum, C.
2015-12-01
Magnetotelluric Transfer Functions (MT TFs) represent most of the information about Earth electrical conductivity found in the raw electromagnetic data, providing inputs for further inversion and interpretation. To be useful for scientific interpretation, they must also contain carefully recorded metadata. Making these data available in a discoverable and citable fashion would provide the most benefit to the scientific community, but such a development requires that the metadata is not only present in the file but is also searchable. The most commonly used MT TF format to date, the historical Society of Exploration Geophysicists Electromagnetic Data Interchange Standard 1987 (EDI), no longer supports some of the needs of modern magnetotellurics, most notably the accurate recording of error bars. Moreover, the inherent heterogeneity of EDIs and other historic MT TF formats has mostly kept the community away from healthy data sharing practices. Recently, the MT team at Oregon State University in collaboration with IRIS Data Management Center developed a new, XML-based format for MT transfer functions, and an online system for long-term storage, discovery and sharing of MT TF data worldwide (IRIS SPUD; www.iris.edu/spud/emtf). The system provides a query page where all of the MT transfer functions collected within the USArray MT experiment and other field campaigns can be searched for and downloaded; an automatic on-the-fly conversion to the historic EDI format is also included. To facilitate conversion to the new, more comprehensive and sustainable XML format for MT TFs, and to streamline inclusion of historic data into the online database, we developed a set of open source format conversion tools, which can be used for rotation of MT TFs as well as a general XML <-> EDI converter (https://seiscode.iris.washington.edu/projects/emtf-fcu). Here, we report on the newly established collaboration between the USGS Geomagnetism Program and the Oregon State University to gather and convert both historic and modern-day MT or related transfer functions into the searchable database at the IRIS DMC. The more complete and free access to these previously collected MT TFs will be of great value to MT scientists, both in planning future surveys and in leveraging the value of the new data at the inversion and interpretation stage.
An Historical Summary and Prospects for the Future of Spacecraft Batteries
NASA Technical Reports Server (NTRS)
Halpert, Gerald; Surampudi, S.
1998-01-01
Subjects covered in this report include a historical evolution of batteries in space, evolution and status of nickel-cadmium batteries and nickel-hydrogen batteries, present applications, future applications and advanced batteries for future missions.
4-(2,4-Dichlorophenoxy)butyric acid (2,4-DB)
Integrated Risk Information System (IRIS)
Integrated Risk Information System (IRIS) Chemical Assessment Summary. U.S. Environmental Protection Agency, National Center for Environmental Assessment. This IRIS Summary has been removed from the IRIS database and is available for historical reference purposes. (July 2016) 4-(2,4-Dichloro
Copyright, Licensing Agreements and Gateways.
ERIC Educational Resources Information Center
Elias, Arthur W.
1990-01-01
Discusses technological developments in information distribution and management in relation to concepts of ownership. A historical overview of the concept of copyright is presented; licensing elements for databases are examined; and implications for gateway systems are explored, including ownership, identification of users, and allowable uses of…
Past and Future Trends in Light Truck Sales.
DOT National Transportation Integrated Search
1981-08-01
This report uses the Wharton EFA Motor Vehicle Demand Model (Mark II) and its associated databases to discuss and analyze past and future trends in the Light Duty Truck market. The dynamic historical growth in this market and its implications for ene...
Alliance Building in the Information and Online Database Industry.
ERIC Educational Resources Information Center
Alexander, Johanna Olson
2001-01-01
Presents an analysis of information industry alliance formation using environmental scanning methods. Highlights include why libraries and academic institutions should be interested; a literature review; historical context; industry and market structures; commercial and academic models; trends; and implications for information providers,…
Garrido-Martín, Diego; Pazos, Florencio
2018-02-27
The exponential accumulation of new sequences in public databases is expected to improve the performance of all the approaches for predicting protein structural and functional features. Nevertheless, this was never assessed or quantified for some widely used methodologies, such as those aimed at detecting functional sites and functional subfamilies in protein multiple sequence alignments. Using raw protein sequences as only input, these approaches can detect fully conserved positions, as well as those with a family-dependent conservation pattern. Both types of residues are routinely used as predictors of functional sites and, consequently, understanding how the sequence content of the databases affects them is relevant and timely. In this work we evaluate how the growth and change with time in the content of sequence databases affect five sequence-based approaches for detecting functional sites and subfamilies. We do that by recreating historical versions of the multiple sequence alignments that would have been obtained in the past based on the database contents at different time points, covering a period of 20 years. Applying the methods to these historical alignments allows quantifying the temporal variation in their performance. Our results show that the number of families to which these methods can be applied sharply increases with time, while their ability to detect potentially functional residues remains almost constant. These results are informative for the methods' developers and final users, and may have implications in the design of new sequencing initiatives.
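To make the two residue classes discussed above concrete, here is a toy Python sketch, with an invented six-sequence alignment and subfamily split, that flags fully conserved columns and columns showing a family-dependent conservation pattern; it illustrates the general idea, not any of the five methods evaluated.

# Hedged toy example: columns conserved across the whole alignment versus
# columns conserved differently in two subfamilies. Alignment is invented.
from collections import Counter

msa = ["ACDEK", "ACDGK", "ACDEK", "ACFGR", "ACFGR", "ACFER"]
subfamilies = {"A": msa[:3], "B": msa[3:]}

def column(seqs, i):
    return [s[i] for s in seqs]

for i in range(len(msa[0])):
    col = column(msa, i)
    if len(set(col)) == 1:
        print(f"position {i}: fully conserved ({col[0]})")
        continue
    # family-dependent pattern: each subfamily conserved, but for a different residue
    consensus = {name: Counter(column(seqs, i)).most_common(1)[0]
                 for name, seqs in subfamilies.items()}
    if all(n == len(subfamilies[name]) for name, (_, n) in consensus.items()) \
       and len({aa for _, (aa, _) in consensus.items()}) > 1:
        print(f"position {i}: subfamily-specific "
              + ", ".join(f"{k}={aa}" for k, (aa, _) in consensus.items()))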
The CTBTO Link to the database of the International Seismological Centre (ISC)
NASA Astrophysics Data System (ADS)
Bondar, I.; Storchak, D. A.; Dando, B.; Harris, J.; Di Giacomo, D.
2011-12-01
The CTBTO Link to the database of the International Seismological Centre (ISC) is a project to provide access to seismological data sets maintained by the ISC using specially designed interactive tools. The Link is open to National Data Centres and to the CTBTO. By means of graphical interfaces and database queries tailored to the needs of the monitoring community, the users are given access to a multitude of products. These include the ISC and ISS bulletins, covering the seismicity of the Earth since 1904; nuclear and chemical explosions; the EHB bulletin; the IASPEI Reference Event list (ground truth database); and the IDC Reviewed Event Bulletin. The searches are divided into three main categories: the Area Based Search (a spatio-temporal search based on the ISC Bulletin), the REB Search (a spatio-temporal search based on specific events in the REB) and the IMS Station Based Search (a search for historical patterns in the reports of seismic stations close to a particular IMS seismic station). The outputs are HTML-based web pages with a simplified version of the ISC Bulletin showing the most relevant parameters, with access to ISC, GT, EHB and REB Bulletins in IMS1.0 format for single or multiple events. The CTBTO Link offers a tool to view REB events in context within the historical seismicity, look at observations reported by non-IMS networks, and investigate station histories and residual patterns for stations registered in the International Seismographic Station Registry.
Troutman, Sandra M.; Stanley, Richard G.
2003-01-01
This database and accompanying text depict historical and modern reported occurrences of petroleum both in wells and at the surface within the boundaries of the Central Alaska Province. These data were compiled from previously published and unpublished sources and were prepared for use in the 2002 U.S. Geological Survey petroleum assessment of Central Alaska, Yukon Flats region. Indications of petroleum are described as oil or gas shows in wells, oil or gas seeps, or outcrops of oil shale or oil-bearing rock and include confirmed and unconfirmed reports. The scale of the source map limits the spatial resolution (scale) of the database to 1:2,500,000 or smaller.
Using CLIPS in a distributed system: The Network Control Center (NCC) expert system
NASA Technical Reports Server (NTRS)
Wannemacher, Tom
1990-01-01
This paper describes an intelligent troubleshooting system for the Help Desk domain. It was developed on an IBM-compatible 80286 PC using Microsoft C and CLIPS, and on an AT&T 3B2 minicomputer using the UNIFY database and a combination of shell scripts, C programs, and SQL queries. The two computers are linked by a LAN. The functions of this system are to help non-technical NCC personnel handle trouble calls, to keep a log of problem calls with complete, concise information, and to keep a historical database of problems. The database helps identify hardware and software problem areas and provides a source of new rules for the troubleshooting knowledge base.
Geolocation of man-made reservoirs across terrains of varying complexity using GIS
Mixon, D.M.; Kinner, D.A.; Stallard, R.F.; Syvitski, J.P.M.
2008-01-01
The Reservoir Sedimentation Survey Information System (RESIS) is one of the world's most comprehensive databases of reservoir sedimentation rates, comprising nearly 6000 surveys for 1819 reservoirs across the continental United States. Sediment surveys in the database date from 1904 to 1999, though more than 95% of surveys were entered prior to 1980, making RESIS largely a historical database. The use of this database for large-scale studies has been limited by the lack of precise coordinates for the reservoirs. Many of the reservoirs are relatively small structures and do not appear on current USGS topographic maps. Others have been renamed or have only approximate (i.e. township and range) coordinates. This paper presents a method scripted in ESRI's ARC Macro Language (AML) to locate the reservoirs on digital elevation models using information available in RESIS. The script also delineates the contributing watersheds and compiles several hydrologically important parameters for each reservoir. Evaluation of the method indicates that, for watersheds larger than 5 km2, the correct outlet is identified over 80% of the time. The importance of identifying the watershed outlet correctly depends on the application. Our intent is to collect spatial data for watersheds across the continental United States and describe the land use, soils, and topography for each reservoir's watershed. Because of local landscape similarity in these properties, we show that choosing the incorrect watershed does not necessarily mean that the watershed characteristics will be misrepresented. We present a measure termed terrain complexity and examine its relationship to geolocation success rate and its influence on the similarity of nearby watersheds. © 2008 Elsevier Ltd. All rights reserved.
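The geolocation script itself is written in AML and not reproduced in the paper; as a hedged Python analogue of one step it implies, the sketch below snaps an approximate reservoir coordinate to the highest flow-accumulation cell within a search window, a standard way of locating an outlet on a DEM-derived grid. The grid and coordinates are synthetic.

# Hypothetical outlet-snapping step on a synthetic flow-accumulation grid.
import numpy as np

def snap_to_outlet(flow_acc, row, col, radius=3):
    """Return the cell with maximum flow accumulation within `radius` cells."""
    r0, r1 = max(0, row - radius), min(flow_acc.shape[0], row + radius + 1)
    c0, c1 = max(0, col - radius), min(flow_acc.shape[1], col + radius + 1)
    window = flow_acc[r0:r1, c0:c1]
    dr, dc = np.unravel_index(np.argmax(window), window.shape)
    return r0 + dr, c0 + dc

rng = np.random.default_rng(1)
flow_acc = rng.integers(0, 50, (100, 100))
flow_acc[42, 57] = 5000                  # synthetic stream cell
print(snap_to_outlet(flow_acc, 40, 55))  # -> (42, 57)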
Development of a Global Fire Weather Database
NASA Technical Reports Server (NTRS)
Field, R. D.; Spessa, A. C.; Aziz, N. A.; Camia, A.; Cantin, A.; Carr, R.; de Groot, W. J.; Dowdy, A. J.; Flannigan, M. D.; Manomaiphiboon, K.;
2015-01-01
The Canadian Forest Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations, beginning in 1980, called the Global Fire WEather Database (GFWED), gridded to a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern Era Retrospective-Analysis for Research and Applications (MERRA), and two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code calculations from the gridded data sets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Agreement between gridded calculations and the station-based calculations tended to differ most at low latitudes for strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA-based DC over the Mato Grosso in Brazil reached unrealistically high values exceeding 1500 during the dry season, but was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation, and they reinforce the need to consider alternative sources of precipitation data. GFWED can be used for analyzing historical relationships between fire weather and fire activity at continental and global scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibrating FWI-based fire prediction models.
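The gridded-versus-station comparison described above reduces, at its simplest, to bias and correlation statistics between co-located Drought Code series. The following Python sketch shows that shape with synthetic series; it does not implement the FWI System equations themselves.

# Synthetic stand-in series; a biased-high gridded analogue of a station record.
import numpy as np

rng = np.random.default_rng(2)
station = np.cumsum(rng.normal(1.0, 2.0, 365)).clip(min=0)   # stand-in DC series
gridded = station * 1.15 + rng.normal(0, 5.0, 365)           # biased-high analogue

bias = np.mean(gridded - station)
corr = np.corrcoef(gridded, station)[0, 1]
print(f"mean bias = {bias:+.1f} DC units, r = {corr:.2f}")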
NASA Astrophysics Data System (ADS)
Knouft, J.; Ficklin, D. L.; Bart, H. L.; Rios, N. E.
2017-12-01
Streamflow and water temperature are primary factors influencing the traits, distribution, and diversity of freshwater species. Ongoing changes in climate are causing directional alteration of these environmental conditions, which can impact local ecological processes. Accurate estimation of these variables is critical for predicting the responses of species to ongoing changes in freshwater habitat, yet ecologically relevant high-resolution data describing variation in streamflow and water temperature across North America are not available. Considering the vast amount of web-accessible freshwater biodiversity data, development and application of appropriate hydrologic data are critical to the advancement of our understanding of freshwater systems. To address this issue, we are developing the "HydroClim" database, which will provide web-accessible (www.hydroclim.org) historical and projected monthly streamflow and water temperature data for stream sections in all major watersheds across the United States and Canada from 1950-2099. These data will also be integrated with FishNet 2 (www.fishnet2.net), an online biodiversity database that provides open access to over 2 million localities of freshwater fish species in the United States and Canada, thus allowing for the characterization of the habitat requirements of freshwater species across this region. HydroClim should provide a vast array of opportunities for a greater understanding of water resources as well as information for the conservation of freshwater biodiversity in the United States and Canada in the coming century.
Compilation of historical water-quality data for selected springs in Texas, by ecoregion
Heitmuller, Franklin T.; Williams, Iona P.
2006-01-01
Springs are important hydrologic features in Texas. A database of about 2,000 historically documented springs and available spring-flow measurements previously has been compiled and published, but water-quality data remain scattered in published sources. This report by the U.S. Geological Survey, in cooperation with the Texas Parks and Wildlife Department, documents the compilation of data for 232 springs in Texas on the basis of a set of criteria and the development of a water-quality database for the selected springs. The selection of springs for compilation of historical water-quality data in Texas was made using existing digital and hard-copy data, responses to mailed surveys, selection criteria established by various stakeholders, geographic information systems, and digital database queries. Most springs were selected by computing the highest mean spring flows for each Texas level III ecoregion. A brief assessment of the water-quality data for springs in Texas shows that few data are available in the Arizona/New Mexico Mountains, High Plains, East Central Texas Plains, Western Gulf Coastal Plain, and South Central Plains ecoregions. Water-quality data are more abundant for the Chihuahuan Deserts, Edwards Plateau, and Texas Blackland Prairies ecoregions. Selected constituent concentrations in Texas springs, including silica, calcium, magnesium, sodium, potassium, strontium, sulfate, chloride, fluoride, nitrate (nitrogen), dissolved solids, and hardness (as calcium carbonate) are comparatively high in the Chihuahuan Deserts, Southwestern Tablelands, Central Great Plains, and Cross Timbers ecoregions, mostly as a result of subsurface geology. Comparatively low concentrations of selected constituents in Texas springs are associated with the Arizona/New Mexico Mountains, Southern Texas Plains, East Central Texas Plains, and South Central Plains ecoregions.
Evaluation of relational and NoSQL database architectures to manage genomic annotations.
Schulz, Wade L; Nelson, Brent G; Felker, Donn K; Durant, Thomas J S; Torres, Richard
2016-12-01
While the adoption of next generation sequencing has rapidly expanded, the informatics infrastructure used to manage the data generated by this technology has not kept pace. Historically, relational databases have provided much of the framework for data storage and retrieval. Newer technologies based on NoSQL architectures may provide significant advantages in storage and query efficiency, thereby reducing the cost of data management. But their relative advantage when applied to biomedical data sets, such as genetic data, has not been characterized. To this end, we compared the storage, indexing, and query efficiency of a common relational database (MySQL), a document-oriented NoSQL database (MongoDB), and a relational database with NoSQL support (PostgreSQL). When used to store genomic annotations from the dbSNP database, we found the NoSQL architectures to outperform traditional, relational models for speed of data storage, indexing, and query retrieval in nearly every operation. These findings strongly support the use of novel database technologies to improve the efficiency of data management within the biological sciences. Copyright © 2016 Elsevier Inc. All rights reserved.
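The study benchmarked server-based systems (MySQL, MongoDB, PostgreSQL); as a self-contained stand-in, the Python sketch below contrasts the two data models using SQLite only: a normalized relational layout versus the same dbSNP-style record held as a single JSON document. Field names and values are illustrative.

# Relational versus document-style storage of one invented SNP annotation.
import json
import sqlite3

snp = {"rsid": "rs12345", "chrom": "1", "pos": 1014540,
       "alleles": ["G", "A"], "gene": "ISG15"}

con = sqlite3.connect(":memory:")

# Relational model: fixed columns, repeated groups split into a child table.
con.execute("CREATE TABLE snp (rsid TEXT PRIMARY KEY, chrom TEXT, pos INTEGER, gene TEXT)")
con.execute("CREATE TABLE allele (rsid TEXT, base TEXT)")
con.execute("INSERT INTO snp VALUES (?, ?, ?, ?)",
            (snp["rsid"], snp["chrom"], snp["pos"], snp["gene"]))
con.executemany("INSERT INTO allele VALUES (?, ?)",
                [(snp["rsid"], a) for a in snp["alleles"]])

# Document model: the whole record stored and retrieved as one unit.
con.execute("CREATE TABLE snp_doc (rsid TEXT PRIMARY KEY, doc TEXT)")
con.execute("INSERT INTO snp_doc VALUES (?, ?)", (snp["rsid"], json.dumps(snp)))

print(con.execute("SELECT pos FROM snp WHERE rsid = 'rs12345'").fetchone())
print(json.loads(con.execute("SELECT doc FROM snp_doc WHERE rsid = 'rs12345'").fetchone()[0])["pos"])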
Ontology to relational database transformation for web application development and maintenance
NASA Astrophysics Data System (ADS)
Mahmudi, Kamal; Inggriani Liem, M. M.; Akbar, Saiful
2018-03-01
Ontology is used as the knowledge representation, while a database records the facts, in a KMS (Knowledge Management System). In most applications, data are managed in a database system and updated through the application, and they are then transformed into knowledge as needed. Once a domain conceptor defines the knowledge in the ontology, the application and database can be generated from the ontology. Most existing frameworks generate the application from its database. In this research, the ontology is used to generate the application. As the data are updated through the application, a mechanism is designed to trigger an update to the ontology so that the application can be rebuilt based on the newest ontology. With this approach, a knowledge engineer has full flexibility to renew the application based on the latest ontology without depending on a software developer. In many cases, the concept needs to be updated when the data change. The framework was built and tested in a Spring Java environment. A case study was conducted to prove the concept.
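The class-to-table mapping at the heart of such ontology-to-relational transformations can be sketched in a few lines. The Python below is a much-simplified, hypothetical rendering, with a hard-coded toy "ontology" instead of parsed OWL/RDF, in which each class becomes a table, each datatype property a column, and each object property a foreign key.

# Toy ontology-to-DDL generator; the real pipeline would read OWL/RDF instead.
ontology = {
    "Person":  {"datatype": {"name": "TEXT", "birth_year": "INTEGER"},
                "object": {}},
    "Article": {"datatype": {"title": "TEXT"},
                "object": {"author": "Person"}},   # object property -> foreign key
}

def to_ddl(onto):
    for cls, props in onto.items():
        cols = [f"{cls.lower()}_id INTEGER PRIMARY KEY"]
        cols += [f"{name} {sqltype}" for name, sqltype in props["datatype"].items()]
        cols += [f"{p}_id INTEGER REFERENCES {target.lower()}({target.lower()}_id)"
                 for p, target in props["object"].items()]
        yield f"CREATE TABLE {cls.lower()} (\n  " + ",\n  ".join(cols) + "\n);"

for stmt in to_ddl(ontology):
    print(stmt)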
Effective Use of Java Data Objects in Developing Database Applications; Advantages and Disadvantages
2004-06-01
[Report documentation page residue; recoverable content: Master's thesis, "Effective Use of Java Data Objects in Developing Database Applications: Advantages and Disadvantages," by Paschalis Zilidis, June 2004; thesis advisor Thomas (surname truncated). The thesis uses a database for the backend datastore and notes that the major disadvantage of this approach is the well-known "impedance mismatch," in which some form of mapping is required between the object and relational representations.]
The National Eutrophication Survey: lake characteristics and historical nutrient concentrations
NASA Astrophysics Data System (ADS)
Stachelek, Joseph; Ford, Chanse; Kincaid, Dustin; King, Katelyn; Miller, Heather; Nagelkirk, Ryan
2018-01-01
Historical ecological surveys serve as a baseline and provide context for contemporary research, yet many of these records are not preserved in a way that ensures their long-term usability. The National Eutrophication Survey (NES) database is currently only available as scans of the original reports (PDF files) with no embedded character information. This limits its searchability, machine readability, and the ability of current and future scientists to systematically evaluate its contents. The NES data were collected by the US Environmental Protection Agency between 1972 and 1975 as part of an effort to investigate eutrophication in freshwater lakes and reservoirs. Although several studies have manually transcribed small portions of the database in support of specific studies, there have been no systematic attempts to transcribe and preserve the database in its entirety. Here we use a combination of automated optical character recognition and manual quality assurance procedures to make these data available for analysis. The performance of the optical character recognition protocol was found to be linked to variation in the quality (clarity) of the original documents. For each of the four archival scanned reports, our quality assurance protocol found an error rate between 5.9 and 17%. The goal of our approach was to strike a balance between efficiency and data quality by combining entry of data by hand with digital transcription technologies. The finished database contains information on the physical characteristics, hydrology, and water quality of about 800 lakes in the contiguous US (Stachelek et al., 2017; https://doi.org/10.5063/F1639MVD). Ultimately, this database could be combined with more recent studies to generate meta-analyses of water quality trends and spatial variation across the continental US.
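The quality-assurance step reported above, comparing OCR output against hand-keyed text to estimate an error rate, can be illustrated with the Python standard library; the two strings below are invented, and the study's 5.9-17% range refers to its own scanned reports.

# Toy QA check: character-level similarity between OCR output and a
# hand-keyed reference, using difflib from the standard library.
import difflib

reference = "LAKE MENDOTA  SECCHI 3.2 M  TOTAL P 0.09 MG/L"
ocr_output = "LAKE MEND0TA  SECCH1 3.2 M  T0TAL P 0.09 MG/L"   # 0/O and 1/I confusions

matcher = difflib.SequenceMatcher(None, reference, ocr_output)
error_rate = 1.0 - matcher.ratio()
print(f"estimated character error rate: {error_rate:.1%}")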
78 FR 11235 - Meetings of Humanities Panel
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
... Research grant program on the subject of New World Archaeology and Anthropology, submitted to the Division... discuss applications for the America's Historical & Cultural Organizations Implementation grant program on...: 421. This meeting will discuss applications for the America's Historical & Cultural Organizations...
Historical Research: A Lens for Family and Consumer Sciences
ERIC Educational Resources Information Center
Nickols, Sharon Y.
2017-01-01
This article describes various historical research methods and presents examples of their application in family and consumer sciences (FCS). Historical research preserves the heritage of FCS through greater understanding of the development of the field and practitioners of the profession. Furthermore, historical research can revitalize…
36 CFR 67.11 - Fees for processing rehabilitation certification requests.
Code of Federal Regulations, 2010 CFR
2010-07-01
... building projects where there is no historic functional relationship among the structures and which are... certified historic structure as provided by the owner in the Historic Preservation Certification Application... rehabilitation of a separate certified historic structure will be considered a separate project for purposes of...
Survey of Machine Learning Methods for Database Security
NASA Astrophysics Data System (ADS)
Kamra, Ashish; Ber, Elisa
Application of machine learning techniques to database security is an emerging area of research. In this chapter, we present a survey of various approaches that use machine learning/data mining techniques to enhance the traditional security mechanisms of databases. There are two key database security areas in which these techniques have found applications, namely, detection of SQL Injection attacks and anomaly detection for defending against insider threats. Apart from the research prototypes and tools, various third-party commercial products are also available that provide database activity monitoring solutions by profiling database users and applications. We present a survey of such products. We end the chapter with a primer on mechanisms for responding to database anomalies.
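One of the approaches surveyed, treating SQL injection detection as supervised classification over query text, can be sketched with scikit-learn. The toy training set below is far too small to be meaningful and the tokenization is naive; it only shows the shape of such a pipeline.

# Hypothetical bag-of-tokens classifier over query strings.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

queries = [
    "SELECT name FROM users WHERE id = 42",
    "SELECT * FROM orders WHERE customer_id = 7",
    "SELECT name FROM users WHERE id = '' OR '1'='1'",
    "SELECT * FROM users; DROP TABLE users; --",
]
labels = [0, 0, 1, 1]   # 0 = benign, 1 = injection

vec = CountVectorizer(token_pattern=r"[^\s]+")   # keep punctuation-bearing tokens
clf = MultinomialNB().fit(vec.fit_transform(queries), labels)

test = ["SELECT name FROM users WHERE id = '' OR '1'='1' --"]
print("suspicious" if clf.predict(vec.transform(test))[0] else "benign")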
Transit of Venus Culture: A Celestial Phenomenon Intrigues the Public
NASA Astrophysics Data System (ADS)
Bueter, Chuck
2012-01-01
When Jeremiah Horrocks first observed it in 1639, the transit of Venus was a desirable telescopic target because of its scientific value. By the next transit of Venus in 1761, though, the enlightened public also embraced it as a popular celestial phenomenon. Its stature elevated over the centuries, the transit of Venus has been featured in music, poetry, stamps, plays, books, and art. The June 2004 transit emerged as a surprising global sensation, as suggested by the search queries it generated. Google's Zeitgeist deemed Venus Transit to be the #1 Most Popular Event in the world for that month. New priorities, technologies, and media have brought new audiences to the rare alignment. As the 2012 transit of Venus approaches, the trend continues with publicly accessible capabilities that did not exist only eight years prior. For example, sites from which historic observations have been made are plotted and readily available on Google Earth. A transit of Venus phone app in development will, if fully funded, facilitate a global effort to recreate historic expeditions by allowing smartphone users to submit their observed transit timings to a database for quantifying the Astronomical Unit. While maintaining relevance in modern scientific applications, the transit of Venus has emerged as a cultural attraction that briefly intrigues the mainstream public and inspires their active participation in the spectacle.
ERIC Educational Resources Information Center
Manzo, Kathleen Kennedy
2007-01-01
As part of a professional development program organized by the Save Ellis Island Foundation, the exhibits, databases, photo archives, and recorded interviews at the island's museum helps put the nation's current immigration debate into a broader historical context. Teachers at these sessions learn from scholars and park personnel about early…
NASA Astrophysics Data System (ADS)
Liu, Zhi Feng; Wang, Ying; Huang, Dong Hui
2018-06-01
In the wake of the big data and "Internet Plus" era, digital technology has been continuously penetrating various fields of social and economic development. As the most important material carriers of historical culture, historical buildings produce and accumulate their historical value over the course of their historical evolution; that value cannot be created anew and can only be protected. Against the background of the digitization of cultural resources, this paper summarizes the digital technologies relevant to the digital translation of information on buildings of historical and cultural heritage, as a means to promote dissemination through PDA and app-based mobile terminals, so as to achieve the purposes of preservation, protection, management, and publicity. Meanwhile, the paper analyzes the application of digital technology in this field and the prospects for its use.
Pant, Kamala; Springer, S; Bruce, S; Lawlor, T; Hewitt, N; Aardema, M J
2014-10-01
There is increased interest in the in vivo comet assay in rodents as a follow-up approach for determining the biological relevance of chemicals that are genotoxic in in vitro assays. This is partly because, unlike other assays, DNA damage can be assessed in this assay in virtually any tissue. Since background levels of DNA damage can vary with the species, tissue, and cell processing method, a robust historical control database covering multiple tissues is essential. We describe extensive vehicle and positive control data for multiple tissues from rats and mice. In addition, we report historical data from control and genotoxin-treated human blood. Technical issues impacting comet results are described, including the method of cell preparation and freezing. Cell preparation by scraping (stomach and other GI tract organs) resulted in higher % tail DNA than mincing (liver, spleen, kidney, etc.) or direct collection (blood or bone marrow). Treatment with the positive control genotoxicant, ethyl methanesulfonate (EMS) in rats and methyl methanesulfonate in mice, resulted in statistically significant increases in % tail DNA. Background DNA damage was not markedly increased when cell suspensions were stored frozen prior to preparing slides, and the outcome of the assay was unchanged (EMS was always positive). In conclusion, historical data from our laboratory for the in vivo comet assay for multiple tissues from rats and mice, as well as human blood, show very good reproducibility. These data and the recommendations provided are aimed at contributing to the design and proper interpretation of results from comet assays. © 2014 Wiley Periodicals, Inc.
Architectural Heritage Visualization Using Interactive Technologies
NASA Astrophysics Data System (ADS)
Albourae, A. T.; Armenakis, C.; Kyan, M.
2017-08-01
With the increased exposure to tourists, historical monuments are at an ever-growing risk of disappearing. Building Information Modelling (BIM) offers a process for digitally documenting all the features that are made or incorporated into a building over its life-span, and thus affords unique opportunities for information preservation. BIMs of historical buildings are called Historical Building Information Models (HBIM); they document a building in detail throughout its history. Geomatics professionals have the potential to play a major role in this area, as they are often the first professionals involved on construction development sites for many Architectural, Engineering, and Construction (AEC) projects. In this work, we discuss how to establish an architectural database of a heritage site, digitally reconstruct and preserve it, and then interact with it through an immersive environment that leverages BIM for exploring historic buildings. The reconstructed heritage site under investigation was constructed in the early 15th century. In our proposed approach, the site selection was based on factors such as architectural value, size, and accessibility. The 3D model is extracted from the original collected and integrated data (image-based, range-based, CAD modelling, and land survey methods), after which the elements of the 3D objects are identified by creating a database using the BIM software platform (Autodesk Revit). The use of modern and widely accessible game engine technology (Unity3D) is explored, allowing the user to fully embed and interact with the scene using handheld devices. The details of implementing an integrated pipeline between HBIM, GIS, and augmented and virtual reality (AVR) tools, and the findings of the work, are presented.
Historical literature review on waste classification and categorization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Croff, A.G.; Richmond, A.A.; Williams, J.P.
1995-03-01
The staff of the Waste Management Document Library (WMDL), in cooperation with Allen Croff, have been requested to provide information support for a historical search concerning waste categorization/classification. This bibliography has been compiled under the sponsorship of Oak Ridge National Laboratory's Chemical Technology Division to help in Allen's ongoing committee work with the NRC/NRCP. After examining the search, Allen Croff saw the value of the search being published. Permission was sought from the database providers to allow limited publication (i.e., 20-50 copies) of the search for internal distribution at the Oak Ridge National Laboratory and for Allen Croff's associated committee. Citations from the database providers who did not grant legal permission for their material to be published have been omitted from the literature review. Some of the longer citations have been included in an abbreviated form in the search to allow the format of the published document to be shortened from approximately 1,400 pages. The bibliography contains 372 citations.
Uses of the Word “Macula” in Written English, 1400-Present
Schwartz, Stephen G.; Leffler, Christopher T.
2014-01-01
We compiled uses of the word “macula” in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America’s Historical Newspapers, the Gale Cengage Collections, and others. “Macula” has been used: as a non-medical “spot” or “stain”, literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole (“retina tinta di un color giallo”) in 1782, but did not use the word “macula”. “Macula lutea” was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, which appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both “macula” and “macula lutea” following the introduction of the ophthalmoscope in 1850. “Macula” has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. PMID:24913329
Landsat-4 and Landsat-5 thematic mapper band 6 historical performance and calibration
Barsi, J.A.; Chander, G.; Markham, B.L.; Higgs, N.; ,
2005-01-01
Launched in 1982 and 1984 respectively, the Landsat-4 and -5 Thematic Mappers (TM) are the backbone of an extensive archive of moderate resolution Earth imagery. However, these sensors and their data products were not subjected to the type of intensive monitoring that has been part of the Landsat-7 system since its launch in 1999. With Landsat-4's 11 year and Landsat-5's 20+ year data record, there is a need to understand the historical behavior of the instruments in order to verify the scientific integrity of the archive and processed products. Performance indicators of the Landsat-4 and -5 thermal bands have recently been extracted from a processing system database allowing for a more complete study of thermal band characteristics and calibration than was previously possible. The database records responses to the internal calibration system, instrument temperatures and applied gains and offsets for each band for every scene processed through the National Landsat Archive Production System (NLAPS). Analysis of this database has allowed for greater understanding of the calibration and improvement in the processing system. This paper will cover the trends in the Landsat-4 and -5 thermal bands, the effect of the changes seen in the trends, and how these trends affect the use of the thermal data.
Developing New Rainfall Estimates to Identify the Likelihood of Agricultural Drought in Mesoamerica
NASA Astrophysics Data System (ADS)
Pedreros, D. H.; Funk, C. C.; Husak, G. J.; Michaelsen, J.; Peterson, P.; Landsfeld, M.; Rowland, J.; Aguilar, L.; Rodriguez, M.
2012-12-01
The population in Central America was estimated at ~40 million people in 2009, with 65% in rural areas directly relying on local agricultural production for subsistence, and additional urban populations relying on regional production. Mapping rainfall patterns and values in Central America is a complex task due to the rough topography and the influence of two oceans on either side of this narrow land mass. Characterization of precipitation amounts in both time and space is of great importance for monitoring agricultural food production for food security analysis. With the goal of developing reliable rainfall fields, the Famine Early Warning Systems Network (FEWS NET) has compiled a dense set of historical rainfall stations for Central America through cooperation with meteorological services and global databases. The station database covers the years 1900 to the present, with the highest density between 1970 and 2011. Interpolating the station data alone does not provide reliable results because it ignores the topographical influences that dominate the region. To account for this, climatological rainfall fields were used to support the interpolation of the station data using a modified Inverse Distance Weighting process. By blending the station data with the climatological fields, a historical rainfall database was compiled for 1970-2011 at a 5-km resolution for every five-day interval. This new database opens the door to analyses such as the impact of sea surface temperature on rainfall patterns, changes to the typical dry spell during the rainy season, and characterization of drought frequency and rainfall trends, among others. This study uses the historical database to identify the frequency of agricultural drought in the region and explores possible changes in precipitation patterns during the past 40 years. A threshold of 500 mm of rainfall during the growing season was used to define agricultural drought for maize. This threshold was selected based on assessments of crop conditions from previous seasons, and was identified as an amount roughly corresponding to significant crop loss for maize, a major crop in most of the region. Results identify areas in central Honduras and Nicaragua, as well as the Altiplano region in Guatemala, that experienced 15 seasons of agricultural drought for the period May-July during the years 1970-2000. Preliminary results show no clear trend in rainfall, but further investigation is needed to confirm that agricultural drought is not becoming more frequent in this region.
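The blending step lends itself to a short sketch. The exact FEWS NET procedure is not spelled out in the abstract; the version below interpolates station/climatology ratios with inverse distance weighting and scales the climatological value at each grid cell, which is one common way to let the climatology carry the topographic signal.

```python
# Sketch of blending station rainfall with a climatological field via
# inverse distance weighting (IDW); a plausible reading of the "modified
# IDW" in the abstract, not the exact FEWS NET algorithm.
import numpy as np

def idw_blend(stations_xy, station_rain, station_clim, grid_xy, grid_clim, power=2.0):
    """stations_xy: (n, 2) coords; station_rain/station_clim: (n,) values;
    grid_xy: (m, 2) target coords; grid_clim: (m,) climatological rainfall."""
    stations_xy = np.asarray(stations_xy, dtype=float)
    station_rain = np.asarray(station_rain, dtype=float)
    ratios = station_rain / np.maximum(station_clim, 1e-6)
    out = np.empty(len(grid_xy))
    for i, p in enumerate(np.asarray(grid_xy, dtype=float)):
        d = np.linalg.norm(stations_xy - p, axis=1)
        if np.any(d < 1e-9):              # the cell coincides with a station
            out[i] = station_rain[np.argmin(d)]
            continue
        w = 1.0 / d**power                # inverse distance weights
        out[i] = grid_clim[i] * np.sum(w * ratios) / np.sum(w)
    return out
```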
Assessment of Fire Occurrence and Future Fire Potential in Arctic Alaska
NASA Astrophysics Data System (ADS)
French, N. H. F.; Jenkins, L. K.; Loboda, T. V.; Bourgeau-Chavez, L. L.; Whitley, M. A.
2014-12-01
An analysis of the occurrence of fire in Alaskan tundra was completed using the relatively complete historical record of fire for the region from 1950 to 2013. Spatial fire data for Alaskan tundra regions were obtained from the Alaska Large Fire Database for the region defined from vegetation and ecoregion maps. A detailed presentation of the fire records available for assessing the fire regime of the tundra regions of Alaska is included, along with results evaluating fire size, seasonality, and general geographic and temporal trends. Future fire potential was assessed for three climate scenarios at four locations across the Alaskan tundra using the Canadian Forest Fire Weather Index (FWI). Canadian Earth System Model (CanESM2) weather variables were used for the historical (1850-2005) and future (2006-2100) time periods. The database includes 908 fire points and 463 fire polygons within the 482,931 km2 of Alaskan tundra. Based on the polygon database, 25,656 km2 (6,340,000 acres) has burned across the six tundra ecoregions since 1950. Approximately 87% of tundra fires start in June and July across all ecoregions. Combining information from the polygon and point data records, the estimated average size of fires in the Alaskan Arctic region is 28.1 km2 (7,070 acres), which is much smaller than in the adjacent boreal forest region, where fires average 203 km2 in high fire years. The largest fire in the database is the Imuruk Basin Fire, which burned 1,680 km2 in 1954 in the Seward Peninsula region. Assessment of future fire potential shows that, in comparison with the historical fire record, fire occurrence in Alaskan tundra is expected to increase under all three climate scenarios. Occurrences of high fire weather danger (FWI > 10) are projected to increase in frequency and magnitude in all regions modeled. The changes in fire weather conditions are expected to vary from one region to another in seasonal occurrence as well as in the severity and frequency of high fire weather danger. While the Alaska Large Fire Database represents the best data available for the Alaskan Arctic, and is superior to those of many other regions around the world, particularly Arctic regions, these fire records need to be used with some caution due to the mixed origin and minimal validation of the data; this is reviewed in the presentation.
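The FWI threshold analysis reduces to a simple count, sketched below with synthetic data; operationally the daily FWI series would be computed from the CanESM2 weather variables, and the threshold of 10 follows the abstract.

```python
# Counting days of high fire weather danger (FWI > 10) per fire season.
# The daily FWI series here are synthetic stand-ins for values derived
# from CanESM2 weather variables.
import numpy as np

def high_danger_days(daily_fwi, threshold=10.0):
    """daily_fwi: dict year -> array of daily FWI values for the season."""
    return {year: int(np.sum(fwi > threshold)) for year, fwi in daily_fwi.items()}

rng = np.random.default_rng(1)
daily_fwi = {y: rng.gamma(2.0, 2.0 + 0.05 * (y - 2006), size=120)
             for y in range(2006, 2011)}        # synthetic upward trend
print(high_danger_days(daily_fwi))
```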
The Polish Genetic Database of Victims of Totalitarianisms.
Ossowski, A; Kuś, M; Kupiec, T; Bykowska, M; Zielińska, G; Jasiński, M E; March, A L
2016-01-01
This paper describes the creation of the Polish Genetic Database of Victims of Totalitarianisms and the first research conducted under this project. On September 28th 2012, the Pomeranian Medical University in Szczecin and the Institute of National Remembrance-Commission for Prosecution of Crimes against the Polish Nation agreed to support the creation of the Polish Genetic Database of Victims of Totalitarianisms (PBGOT, www.pbgot.pl). The purpose was to employ state-of-the-art methods of forensic genetics to identify the remains of unidentified victims of the Communist and Nazi totalitarian regimes. The database was designed to serve as a central repository of genetic information from the victims' DNA and that of the victims' nearest living relatives, with the goal of making positive identifications of the victims. Along the way, PBGOT encountered several challenges. First, extracting useable DNA samples from the remains of individuals who had been buried for over half a century required forensic geneticists to create special procedures and protocols. Second, obtaining genetic reference material and historical information from the victims' closest relatives was both problematic and urgent. The victims' nearest living relatives were part of a dying generation, and the opportunity to obtain the best genetic and historical information about the victims would soon die with them. For this undertaking, PBGOT assembled a team of historians, archaeologists, forensic anthropologists, and forensic geneticists from several European research institutions. The field work was divided into five broad categories: (1) exhumation of victim remains and storage of their biological material for later genetic testing; (2) researching archives and historical data for a more complete profile of those killed or missing and the families that lost them; (3) locating the victims' nearest relatives to obtain genetic reference samples (swabs); (4) entering the genetic data from both victims and family members into a common database; and (5) making a conclusive, final identification of the victims. PBGOT's first project was to identify victims of the Communist regime buried in hidden mass graves in the Powązki Military Cemetery in Warsaw. Throughout 2012 and 2013, PBGOT carried out archaeological exhumations in the Powązki Military Cemetery that resulted in the recovery of the skeletal remains of 194 victims in several mass graves. Of the 194 sets of remains, more than 50 victims have been successfully matched and identified through genetic evidence. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Data Mining Twitter for Science Applications
NASA Astrophysics Data System (ADS)
Teng, W. L.; Albayrak, A.; Huffman, G. J.
2016-12-01
The Twitter social microblogging database, which recently passed its tenth anniversary, is potentially a rich source of real-time and historical global information for science applications (beyond the by-now fairly familiar use of Twitter for natural hazards monitoring). Over the past several years, we have been exploring the feasibility of extracting from the Twitter data stream useful information for application to NASA precipitation research, with both "passive" and "active" participation by the twitterers. In the passive case, we have experimented with listening to the Twitter stream in real time for "precipitation" and related tweets (in different languages), applying basic filters for exact phrases, extracting location information, and mapping the resulting tweet distributions. In the active case, we have conducted preliminary experiments to evaluate different methods of engaging with potential participants. The time-varying set of "precipitation" tweets can be thought of as an organic network of rain gauges, potentially providing a widespread view of precipitation occurrence. The validation of satellite precipitation estimates is challenging, because many regions lack data or access to data, especially outside of the U.S. and in remote and developing areas. Mining the Twitter stream could augment these validation programs and, potentially, help tune existing algorithms. Though exploratory, our efforts thus far could significantly extend the application realm of Twitter, as a platform for citizen science, beyond natural hazards monitoring to science applications.
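The passive pipeline described above is straightforward to sketch. The tweet structure (text, lat, lon fields) below is illustrative rather than the actual Twitter payload format, and the phrase list is a stand-in for the multilingual filters the authors mention.

```python
# Sketch of the "passive" case: filter a tweet stream for exact
# precipitation phrases and bin matches into grid cells for mapping.
from collections import Counter

PHRASES = ["it is raining", "está lloviendo", "il pleut"]  # illustrative

def precipitation_tweets(stream):
    for tweet in stream:                 # stream: any iterable of dicts
        if any(p in tweet["text"].lower() for p in PHRASES):
            yield tweet

def grid_counts(tweets, cell=1.0):
    # Count matching tweets per (lat, lon) cell, like a grid of
    # makeshift rain gauges.
    counts = Counter()
    for t in tweets:
        if t.get("lat") is not None and t.get("lon") is not None:
            counts[(round(t["lat"] / cell) * cell,
                    round(t["lon"] / cell) * cell)] += 1
    return counts

sample = [{"text": "Ugh, it is raining again", "lat": 40.7, "lon": -74.0}]
print(grid_counts(precipitation_tweets(sample)))
```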
Clinical applications of hallucinogens: A review.
Garcia-Romeu, Albert; Kersgaard, Brennan; Addy, Peter H
2016-08-01
Hallucinogens fall into several different classes, as broadly defined by pharmacological mechanism of action and chemical structure. These include psychedelics, entactogens, dissociatives, and other atypical hallucinogens. Although these classes do not share a common primary mechanism of action, they do exhibit important similarities in their ability to occasion temporary but profound alterations of consciousness, involving acute changes in somatic, perceptual, cognitive, and affective processes. Such effects likely contribute to their recreational use. However, a growing body of evidence indicates that these drugs may have therapeutic applications beyond their potential for abuse. This review presents data on several classes of hallucinogens, with a particular focus on psychedelics, entactogens, and dissociatives, for which clinical utility has been most extensively documented. Information on each class is presented in turn, tracing relevant historical insights, highlighting similarities and differences between the classes from the molecular to the behavioral level, and presenting the most up-to-date information on clinically oriented research with these substances, with important ramifications for their potential therapeutic value. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
BDVC (Bimodal Database of Violent Content): A database of violent audio and video
NASA Astrophysics Data System (ADS)
Rivera Martínez, Jose Luis; Mijes Cruz, Mario Humberto; Rodríguez Vázqu, Manuel Antonio; Rodríguez Espejo, Luis; Montoya Obeso, Abraham; García Vázquez, Mireya Saraí; Ramírez Acosta, Alejandro Álvaro
2017-09-01
Current multimedia description, organization, and retrieval applications tend to rely on unimodal databases of a single content type, such as text, voice, or images; bimodal databases, by contrast, semantically associate two different content types, such as audio-video or image-text. Generating a bimodal audio-video database entails connecting the multimedia content through a semantic relation that associates the actions in both types of information. This paper describes in detail the characteristics and methodology used to create a bimodal database of violent content; the semantic relationship is established through the proposed concepts that describe the audiovisual information. Bimodal databases can improve the semantic performance of audiovisual content processing applications, but only if those applications process both types of content. The database comprises 580 annotated audiovisual segments totaling 28 minutes, divided into 41 classes. Bimodal databases are a tool for building semantic web applications.
X-1 to X-Wings: Developing a Parametric Cost Model
NASA Technical Reports Server (NTRS)
Sterk, Steve; McAtee, Aaron
2015-01-01
In today's cost-constrained environment, NASA needs an X-Plane database and parametric cost model that can quickly provide rough-order-of-magnitude predictions of cost from initial concept to first flight of potential X-Plane aircraft. This paper describes the steps taken in developing such a model and reports the results. The challenges encountered in the collection of historical data and recommendations for future database management are discussed. A step-by-step discussion of the development of Cost Estimating Relationships (CERs) is then covered.
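CERs of the kind referred to here are often simple power laws fitted in log-log space; the sketch below shows the idea with made-up weight/cost pairs, not data from the actual X-Plane database.

```python
# Illustrative cost estimating relationship (CER): cost = a * W^b,
# fitted by linear regression on logarithms. Data are hypothetical.
import numpy as np

weights = np.array([1500., 3200., 5400., 9800., 15000.])  # lb, made up
costs = np.array([42., 95., 160., 300., 480.])             # $M, made up

b, log_a = np.polyfit(np.log(weights), np.log(costs), 1)
a = np.exp(log_a)

def cer(weight_lb):
    """Rough-order-of-magnitude cost prediction from the fitted CER."""
    return a * weight_lb ** b

print(f"cost = {a:.3f} * W^{b:.2f}; 7,000 lb concept -> ${cer(7000):.0f}M")
```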
Making historic loss data comparable over time and place
NASA Astrophysics Data System (ADS)
Eichner, Jan; Steuer, Markus; Löw, Petra
2017-04-01
When utilizing historic loss data for present-day risk assessment, it is necessary to make the data comparable over time and place. To achieve this, the assessment of costs from natural hazard events requires consistent and homogeneous methodologies for loss estimation, as well as a robust treatment of loss data to estimate and/or reduce distorting effects due to a temporal bias in the reporting of small-scale loss events. Here we introduce Munich Re's NatCatSERVICE loss database, present a novel methodology of peril-specific normalization of the historic losses (to account for the socio-economic growth of assets over time), and introduce a metric of severity classification (called CatClass) that allows for a global comparison of impact severity across countries at different stages of economic development.
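The core of such normalization can be stated in a few lines. NatCatSERVICE's peril-specific method is more elaborate; the sketch assumes a simple per-region wealth (or GDP) index as the growth proxy.

```python
# Minimal sketch of loss normalization: scale a historic loss by the
# growth of exposed assets between the event year and the reference
# year, using a wealth index as a (hypothetical) proxy.
def normalize_loss(loss, event_year, ref_year, wealth_index):
    """wealth_index: dict year -> asset/wealth proxy for the region."""
    return loss * wealth_index[ref_year] / wealth_index[event_year]

wealth_index = {1980: 1.0, 2016: 3.4}   # hypothetical growth factors
print(normalize_loss(250e6, 1980, 2016, wealth_index))  # 1980 loss, 2016 terms
```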
Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.
2015-01-01
The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 653 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from an area covering portions of the Inmachuk, Kugruk, Kiwalik, and Koyuk river drainages, Granite Mountain, and the northern Darby Mountains, located in the Bendeleben, Candle, Kotzebue, and Solomon quadrangles of eastern Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.
NASA Astrophysics Data System (ADS)
Petrucci, Olga; Pasqua, Aurora Angela; Polemio, Maurizio
2013-04-01
The present work is based on a large historical database of floods and landslides that have occurred in Calabria, a region of southern Italy, since the seventeenth century, comprising more than 11,000 records. This database has been built by collecting data from different information sources, such as newspapers, archives of regional and national agencies, scientific and technical reports, on-site survey reports, and interviews with both the people involved and local administrators. The database is continuously updated with the results of local historical research and with data from a daily survey of regional newspapers. Similarly, a large archive of rainfall data for the same period and region has been compiled. In this work, based on the abovementioned archives, we carried out a comparative analysis of the floods that occurred in a regional sector over a long period and the climatic data characterizing the same period, focusing on the climate trend and aiming to investigate the potential effect of climate variation on the trend of damaging floods. The aim was to assess whether the frequency of floods is changing and, if so, whether these changes can be related to rainfall and/or anthropogenic modifications. In order to assess anthropogenic modifications, the evolution of the urbanized sectors of the study area over the last centuries was reconstructed by means of comparisons, in a GIS environment, of historical maps from different epochs. The annual variability of rainfall was characterized using an annual index, and short-duration, high-intensity rainfalls were characterized using time series of the annual maxima of 1-, 3-, 6-, 12-, and 24-hour and daily rainfall. The analysis indicates that, despite a rainfall trend favorable to a reduction in flood occurrence, flood damage has not decreased. This seems to be mainly the effect of mismanaged land use modifications. Moreover, the long historical series analyzed allowed us to identify both the most frequently damaged elements and the most frequently damaged geographical sectors of the study area, with further detail on the cases involving people in urbanized sectors.
NASA Astrophysics Data System (ADS)
Polemio, Maurizio; Lonigro, Teresa
2013-04-01
Recent international research has underlined the evidence of climate change throughout the world. Among the consequences of climate change is an increase in the frequency and magnitude of natural disasters, such as droughts, windstorms, heat waves, landslides, floods, and secondary floods (i.e., the rapid accumulation or ponding of surface water with very low flow velocity). Damaging Hydrogeological Events (DHEs) can be defined as the occurrence of one or more of these simultaneous phenomena causing damage. They represent a serious problem, especially in DHE-prone areas with growing urbanisation. In these areas the increasing frequency of extreme hydrological events could be related to climate variations and/or urban development. The historical analysis of DHEs can support decision making and land-use planning, ultimately reducing natural risks. The paper proposes a methodology, based on both historical and time-series approaches, for describing the influence of climatic variability on the number of phenomena observed. The historical approach aims to collect historical data on the phenomena. Historical flood and landslide data are important for understanding the evolution of a study area and for estimating risk scenarios as a basis for civil protection purposes. Historical data are also useful for expanding the period of investigation in order to assess the occurrence trend of DHEs. The time-series approach includes the collection and statistical analysis of climatic and rainfall data (monthly rainfall, wet days, rainfall intensity, and temperature data, together with the annual maxima of short-duration rainfall from 1 hour to 5 days), which are also used as a proxy for floods and landslides. The climatic and rainfall data are useful for characterizing climate variations and trends and for roughly assessing the effects of these trends on river discharge and on the triggering of landslides. The time-series approach is completed by tools for analysing all data types simultaneously. The methodology was tested on a selected Italian region (Apulia, southern Italy). The data were collected in two databases: a damaging hydrogeological event database (1,186 landslides and floods since 1918) and a climate database (from 1877; short-duration rainfall from 1921). A statistically significant decreasing trend of rainfall intensity and increasing trends of temperature, landslides, and DHEs were observed, along with a generalized decreasing trend of short-duration rainfall. Given the absence of an evident relationship between climate variability and the variability of DHE occurrences, the role of anthropogenic modifications (increasing use or misuse of flood- and landslide-prone areas) can be hypothesized to explain the increasing occurrences of floods and landslides. This study identifies the advantages of a simplifying approach that reduces the intrinsic complexities of the spatial-temporal analysis of climate variability, permitting the simultaneous analysis of the modification of flood and landslide occurrences.
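The abstract does not name the trend test used; the Mann-Kendall test is the standard non-parametric choice for such hydrological series, and a minimal version (via Kendall's tau against the time index, without autocorrelation correction) looks like this, on synthetic annual counts:

```python
# Mann-Kendall-style trend check via Kendall's tau against the time
# index (no correction for serial correlation). Annual DHE counts are
# synthetic, for illustration only.
import numpy as np
from scipy.stats import kendalltau

years = np.arange(1918, 2012)
rng = np.random.default_rng(0)
events = rng.poisson(lam=np.linspace(5, 14, years.size))  # synthetic counts

tau, p_value = kendalltau(years, events)
print(f"tau = {tau:.2f}, p = {p_value:.4f} ->",
      "increasing" if tau > 0 else "decreasing", "trend")
```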
Granitto, Matthew; Bailey, Elizabeth A.; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.
2011-01-01
The Alaska Geochemical Database (AGDB) was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This Microsoft Access database serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed in USGS laboratories or, under contracts, in commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects from 1962 to 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy mineral concentrate samples are included in this database. The AGDB includes historical geochemical data originally archived in the USGS Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s and the USGS PLUTO database used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all USGS geochemical data resulted in a significantly improved Alaska geochemical dataset. USGS data that were not previously in the NGDB because the data predate the earliest USGS geochemical databases, or were once excluded for programmatic reasons, are included here in the AGDB and will be added to the NGDB. The AGDB data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB data provided in the linked database may be updated or changed periodically. The data on the DVD and in the data downloads provided with this report are current as of date of publication.
RefPrimeCouch—a reference gene primer CouchApp
Silbermann, Jascha; Wernicke, Catrin; Pospisil, Heike; Frohme, Marcus
2013-01-01
To support a quantitative real-time polymerase chain reaction standardization project, a new reference gene database application was required. The new database application was built with the explicit goal of simplifying not only the development process but also making the user interface more responsive and intuitive. To this end, CouchDB was used as the backend with a lightweight dynamic user interface implemented client-side as a one-page web application. Data entry and curation processes were streamlined using an OpenRefine-based workflow. The new RefPrimeCouch database application provides its data online under an Open Database License. Database URL: http://hpclife.th-wildau.de:5984/rpc/_design/rpc/view.html PMID:24368831
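For readers unfamiliar with CouchApps: the data live in CouchDB and are served over its standard REST view API, which any client can query directly. The view name and document fields below are hypothetical; only the _design/_view URL pattern is standard CouchDB.

```python
# Sketch of querying a CouchDB view over HTTP, as a client of a
# CouchApp-style database. The "by_gene" view is hypothetical.
import requests

BASE = "http://hpclife.th-wildau.de:5984/rpc"

def genes_by_name(name):
    url = f"{BASE}/_design/rpc/_view/by_gene"
    # CouchDB view keys are JSON-encoded, hence the embedded quotes.
    resp = requests.get(url, params={"key": f'"{name}"'})
    resp.raise_for_status()
    return [row["value"] for row in resp.json()["rows"]]
```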
RESPONSE OF GULF COAST ESTUARIES TO NUTRIENT LOAD: DISSOLVED OXYGEN DEPLETION
GED has developed a process-based approach to hypoxia research on Pensacola Bay as a model Gulf of Mexico estuary. We selected Pensacola Bay because, like many Gulf coast estuaries, it is shallow, microtidal, and experiences seasonal hypoxia. We also have an historical database ...
The Role of Research-Oriented Universities in School Change
ERIC Educational Resources Information Center
Nur, Mary Morison
1986-01-01
The interdisciplinary school-university partnership based at Stanford University is establishing a database for developing educational policy. The following features are discussed: (1) historical perspective; (2) data collection/feedback process and its contribution to the linking of researcher and practitioner on a national basis; (3) lessons…
DOT National Transportation Integrated Search
2010-03-01
The objectives of this project were to (a) produce historic estimates of travel times on Twin-Cities arterials for 1995 and 2005, and (b) develop an initial architecture and database that could, in the future, produce timely estimates of arterial...
USDA-ARS?s Scientific Manuscript database
Background: Dietary fiber is a broad category of compounds historically defined as partially or completely indigestible plant-based carbohydrates and lignin with, more recently, the additional criteria that fibers incorporated into foods as additives should demonstrate functional human health outcom...
John F. Caratti
2006-01-01
The FIREMON database software allows users to enter, store, analyze, and summarize plot data, photos, and related documents. The FIREMON database software consists of a Java application and a Microsoft® Access database. The Java application provides the user interface with FIREMON data through data entry forms, data summary reports, and other data management tools...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-10
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office (SHPO), and to solicit their views...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-11
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with the applicable State Historic Preservation Office (SHPO), and to solicit their...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-15
... Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office (SHPO), and to solicit their views and those of...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office (SHPO), and to solicit their views...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office, and to solicit their views and those of other...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with the applicable State Historic Preservation Office (SHPO), and to solicit their...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-28
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with the applicable State Historic Preservation Office (SHPO), and to solicit their...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-11
... National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office(s), and to solicit their views and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-08
... Historical Society; Notice of Preliminary Determination of A Qualifying Conduit Hydropower Facility and Soliciting Comments and Motions To Intervene On September 20, 2013, San Juan County Historical Society filed... County, Colorado. Applicant Contact: Beverly Rich, San Juan County Historical Society, P.O. Box 154...
Detection and measurement of total ozone from stellar spectra: Paper 2. Historic data from 1935-1942
NASA Astrophysics Data System (ADS)
Griffin, R. E. M.
2006-06-01
Atmospheric ozone columns are derived from historic stellar spectra observed between 1935 and 1942 at Mount Wilson Observatory, California. Comparisons with contemporary measurements in the Arosa database show a generally close correspondence, while a similar comparison with sparser data from Table Mountain reveals a difference of ~15-20%, as has also been found by other researchers using the latter data. The results of the analysis indicate that astronomy's archives hold considerable potential for investigating the natural levels of ozone and its variability during the decades prior to anthropogenic interference.
The Growth Dynamics of Words: How Historical Context Shapes the Competitive Linguistic Environment
NASA Astrophysics Data System (ADS)
Tenenbaum, Joel; Petersen, Alexander; Havlin, Shlomo; Stanley, H. Eugene
2012-02-01
Using the massive Google n-gram database of over 10^11 word uses in English, Hebrew, and Spanish, we explore the connection between the growth rates of relative word use and the observed growth rates of disparate competing actors in a common environment such as businesses, scientific journals, and universities, supporting the concept that a language's lexicon is a generic arena for competition, evolving according to selection laws. We find aggregate-level anomalies in the collective statistics corresponding to the time of key historical events such as World War II and the Balfour Declaration.
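The basic quantity behind such analyses is the year-over-year logarithmic growth rate of a word's relative use frequency; a small sketch follows, with made-up counts rather than the Google file format.

```python
# Log growth rate of a word's relative use frequency between
# consecutive years, from raw n-gram counts. Input format is
# illustrative, not the Google n-gram file layout.
import math

def relative_growth(counts, totals):
    """counts: {year: word occurrences}; totals: {year: all word uses}."""
    years = sorted(counts)
    freq = {y: counts[y] / totals[y] for y in years}
    return {y2: math.log(freq[y2] / freq[y1])
            for y1, y2 in zip(years, years[1:])}

counts = {1938: 120, 1939: 150, 1940: 290}       # hypothetical counts
totals = {1938: 1.0e6, 1939: 1.1e6, 1940: 1.2e6}
print(relative_growth(counts, totals))  # spike as an event enters discourse
```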
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joseph, A.; Seuntjens, J.; Parker, W.
We describe the development of automated, web-based, electronic health record (EHR) auditing software for use within our paperless radiation oncology clinic. The software facilitates access to multiple databases within the clinic, and each patient's EHR is audited prior to treatment, regularly during treatment, and post treatment. Anomalies such as missing documentation, non-compliant workflow, and treatment parameters that differ significantly from the norm may be monitored, flagged, and brought to the attention of clinicians. By determining historical trends using existing patient data and by comparing new patient data with the historical, we expect our software to provide a measurable improvement in the quality of radiotherapy at our centre.
A Database Design for the Brazilian Air Force Military Personnel Control System.
1987-06-01
[Indexed full-text excerpt, garbled by OCR: COBOL working-storage definitions pairing menu options (e.g., "4. GIVEN A RECNUM GET NOMINATION HISTORICAL") with embedded SQL queries (e.g., SELECT ABBREV,DTNOM,DTEXO,SITN FROM NOMINATION WHERE RECNUM = :RECNUM).]
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-19
...: Federal Power Act 16 U.S.C. 791(a)-825(r). h. Applicant Contact: Nicholas E. Josten, GeoSense, 2742 Saint... State Historic Preservation Officer (SHPO), as required by section 106 of the National Historic...
Hydrogen Leak Detection Sensor Database
NASA Technical Reports Server (NTRS)
Baker, Barton D.
2010-01-01
This slide presentation reviews the characteristics of the Hydrogen Sensor database. The database is the result of NASA's continuing interest in and improvement of its ability to detect and assess gas leaks in space applications. The database specifics and a snapshot of an entry in the database are reviewed. Attempts were made to determine the applicability of each of the 65 sensors for ground and/or vehicle use.
Natural stones of historic and future importance in Sweden
NASA Astrophysics Data System (ADS)
Schouenborg, Björn; Andersson, Jenny; Göransson, Mattias
2013-04-01
Several activities and responsibilities of the Geological Survey of Sweden (SGU) are related to the work of the newly formed international Heritage Stone Task Group (HSTG) for designating historically important stones. Among other things, SGU is a referral organization that frequently prepares statements in connection with stone producers' quarrying permit applications. When preparing these statements, SGU takes into account a number of parameters, e.g. the importance for local and regional business development, historic importance, area of occurrence, quality of the geological documentation of the stone type, peculiarities of the stone type, and technical properties relevant for the intended use. Traditionally, SGU's bedrock mapping has looked not at the potential for natural stone production but at the potential production of aggregates, industrial minerals, and metals. The necessary competence is therefore presently being built up, with new databases of important natural stone types and definitions of criteria for their selection; in this respect the criteria defined by the HSTG provide important help. This work goes hand in hand with the task of proposing stone deposits and quarries of "national interest". The criteria for selecting a stone type or quarry as one of national interest are currently being revised, and SGU plays an important role in this work. However, the final decision and appointment lie in the hands of the Swedish Board of Housing, Building and Planning (Boverket), an authority dealing with sustainable land use, regional development, and town and country planning. Boverket supervises how the planning legislation is handled by the municipal authorities and the county administrative boards; these two latter organizations are in charge of granting extraction permits for stone quarrying. The "Hallandia gneiss" of SW Sweden is presented as a case study in this paper. Keywords: Hallandia gneiss, natural stones, historic stones, urban planning and building
Using Historical Atlas Data to Develop High-Resolution Distribution Models of Freshwater Fishes
Huang, Jian; Frimpong, Emmanuel A.
2015-01-01
Understanding the spatial pattern of species distributions is fundamental in biogeography and in conservation and resource management applications. Most species distribution models (SDMs) require or prefer species presence and absence data for adequate estimation of model parameters. However, observations with unreliable or unreported species absences dominate and limit the implementation of SDMs. Presence-only models generally yield less accurate predictions of species distribution and make it difficult to incorporate spatial autocorrelation. The availability of large amounts of historical presence records for freshwater fishes of the United States provides an opportunity for deriving reliable absences from data reported as presence-only, when sampling was predominantly community-based. In this study, we used boosted regression trees (BRT), logistic regression, and MaxEnt models to assess the performance of a historical metacommunity database with inferred absences for modeling fish distributions, thereby investigating the effects of model choice and data properties. With models of the distribution of 76 native, non-game fish species of varied traits and rarity attributes in four river basins across the United States, we show that model accuracy depends on data quality (e.g., sample size, location precision), species' rarity, statistical modeling technique, and consideration of spatial autocorrelation. The cross-validation area under the receiver-operating-characteristic curve (AUC) tended to be high in the spatial presence-absence models at the highest level of resolution for species with large geographic ranges and small local populations. Prevalence affected training but not validation AUC. The key habitat predictors identified and the fish-habitat relationships evaluated through partial dependence plots corroborated most previous studies. The community-based SDM framework broadens our capability to model species distributions by removing the constraint imposed by the lack of species absence data, thus providing robust predictions of distribution for stream fishes in other regions where historical data exist, and for other taxa (e.g., benthic macroinvertebrates, birds) usually observed through community-based sampling designs. PMID:26075902
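Of the techniques compared, logistic regression is the simplest to reproduce; a minimal presence/absence sketch with a cross-validated AUC follows. The covariates and data are synthetic, and the spatial autocorrelation handling from the paper is omitted.

```python
# Minimal presence/absence species distribution model: logistic
# regression on habitat covariates, scored by cross-validated AUC.
# Data are synthetic; real covariates might be temperature, slope, flow.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # synthetic habitat covariates
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1]            # synthetic habitat response
y = rng.random(500) < 1 / (1 + np.exp(-logit))   # presence (True) / absence

auc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f}")
```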
Applying ecological and evolutionary theory to cancer: a long and winding road.
Thomas, Frédéric; Fisher, Daniel; Fort, Philippe; Marie, Jean-Pierre; Daoust, Simon; Roche, Benjamin; Grunau, Christoph; Cosseau, Céline; Mitta, Guillaume; Baghdiguian, Stephen; Rousset, François; Lassus, Patrice; Assenat, Eric; Grégoire, Damien; Missé, Dorothée; Lorz, Alexander; Billy, Frédérique; Vainchenker, William; Delhommeau, François; Koscielny, Serge; Itzykson, Raphael; Tang, Ruoping; Fava, Fanny; Ballesta, Annabelle; Lepoutre, Thomas; Krasinska, Liliana; Dulic, Vjekoslav; Raynaud, Peggy; Blache, Philippe; Quittau-Prevostel, Corinne; Vignal, Emmanuel; Trauchessec, Hélène; Perthame, Benoit; Clairambault, Jean; Volpert, Vitali; Solary, Eric; Hibner, Urszula; Hochberg, Michael E
2013-01-01
Since the mid 1970s, cancer has been described as a process of Darwinian evolution, with somatic cellular selection and evolution being the fundamental processes leading to malignancy and its many manifestations (neoangiogenesis, evasion of the immune system, metastasis, and resistance to therapies). Historically, little attention has been placed on applications of evolutionary biology to understanding and controlling neoplastic progression and to prevent therapeutic failures. This is now beginning to change, and there is a growing international interest in the interface between cancer and evolutionary biology. The objective of this introduction is first to describe the basic ideas and concepts linking evolutionary biology to cancer. We then present four major fronts where the evolutionary perspective is most developed, namely laboratory and clinical models, mathematical models, databases, and techniques and assays. Finally, we discuss several of the most promising challenges and future prospects in this interdisciplinary research direction in the war against cancer.
Classification of Chemicals Based On Structured Toxicity ...
Thirty years and millions of dollars' worth of pesticide registration toxicity studies, historically stored as hardcopy and scanned documents, have been digitized into highly standardized and structured toxicity data within the Toxicity Reference Database (ToxRefDB). Toxicity-based classifications of chemicals were performed as a model application of ToxRefDB. These endpoints will ultimately provide the anchoring toxicity information for the development of predictive models and biological signatures utilizing in vitro assay data. Utilizing query and structured data mining approaches, toxicity profiles were uniformly generated for more than 300 chemicals. Based on observation rate, species concordance, and regulatory relevance, individual and aggregated effects have been selected to classify the chemicals, providing a set of predictable endpoints. ToxRefDB demonstrates the utility of transforming unstructured toxicity data into structured data and, furthermore, into computable outputs, and serves as a model for applying such data to address modern toxicological problems.
Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc
2010-07-01
We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 year) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces time, and therefore cost, of generating DNA sequences from mushrooms and other fungi vs. traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.
Steen, Paul J.; Passino-Reader, Dora R.; Wiley, Michael J.
2006-01-01
As a part of the Great Lakes Regional Aquatic Gap Analysis Project, we evaluated methodologies for modeling associations between fish species and habitat characteristics at a landscape scale. To do this, we created brook trout Salvelinus fontinalis presence and absence models based on four different techniques: multiple linear regression, logistic regression, neural networks, and classification trees. The models were tested in two ways: by application to an independent validation database and cross-validation using the training data, and by visual comparison of statewide distribution maps with historically recorded occurrences from the Michigan Fish Atlas. Although differences in the accuracy of our models were slight, the logistic regression model predicted with the least error, followed by multiple regression, then classification trees, then the neural networks. These models will provide natural resource managers a way to identify habitats requiring protection for the conservation of fish species.
Historical records of the geomagnetic field
NASA Astrophysics Data System (ADS)
Arneitz, Patrick; Heilig, Balázs; Vadasz, Gergely; Valach, Fridrich; Dolinský, Peter; Hejda, Pavel; Fabian, Karl; Hammerl, Christa; Leonhardt, Roman
2014-05-01
Records of historical direct measurements of the geomagnetic field are invaluable sources for reconstructing temporal variations of the Earth's magnetic field, providing information about the field's evolution back to the late Middle Ages. We have investigated such records with a focus on Austria and some neighbouring countries, examining a variety of new sources and source types. These include 19th-century land survey and observatory records of the Imperial and Royal "Centralanstalt f. Meteorologie und Erdmagnetismus", which are not included in the existing compilations. Daily measurements at the Imperial and Royal Observatory in Prague have been digitized. The Imperial and Royal Navy carried out observations in the Adriatic Sea during several surveys. Declination values have been collected from famous mining areas in the former Austro-Hungarian Empire; in this connection, a time series for Banska Stiavnica has been compiled. In the meteorological yearbooks of the monastery Kremsmünster, regular declination measurements for the first half of the 19th century were registered. Marsigli's observations during military mapping work in 1696 are also included in our collection. Moreover, compass roses on historical maps and declination values marked on compasses, sundials, or globes also provide information about ancient field declination. An evaluation of church orientations in Lower Austria and Northern Germany did not support the hypothesis that church naves had been aligned along the East-West direction by means of magnetic compasses; therefore, this potential source of information must be excluded from our collection. The gathered records are integrated into a database together with corresponding metadata, such as the measurement instruments and methods used. This information allows an assessment of the quality and reliability of the historical observations. The combination of compilations of historical measurements with high-quality archeo- and paleomagnetic data in a single database enables a reliable joint evaluation of all types of magnetic field records from different origins. This collection forms the basis for a combined inverse modelling of the geomagnetic field evolution.
A Framework for Mapping User-Designed Forms to Relational Databases
ERIC Educational Resources Information Center
Khare, Ritu
2011-01-01
In the quest for database usability, several applications enable users to design custom forms using a graphical interface, and forward engineer the forms into new databases. The path-breaking aspect of such applications is that users are completely shielded from the technicalities of database creation. Despite this innovation, the process of…
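The forward-engineering step such applications perform can be illustrated by generating table DDL from a form specification; the field spec and type mapping below are hypothetical, and a real system must also handle keys, constraints, and schema evolution.

```python
# Sketch of forward-engineering a user-designed form into a relational
# table. Field types and the mapping are illustrative assumptions.
SQL_TYPES = {"text": "VARCHAR(255)", "number": "NUMERIC", "date": "DATE"}

def form_to_ddl(form_name, fields):
    """fields: list of (field_name, field_type) pairs from the designer."""
    cols = ",\n  ".join(f"{name} {SQL_TYPES[ftype]}" for name, ftype in fields)
    return f"CREATE TABLE {form_name} (\n  id INTEGER PRIMARY KEY,\n  {cols}\n);"

print(form_to_ddl("patient_visit",
                  [("visit_date", "date"), ("notes", "text"), ("cost", "number")]))
```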
Genotator: a disease-agnostic tool for genetic annotation of disease.
Wall, Dennis P; Pivovarov, Rimma; Tong, Mark; Jung, Jae-Yoon; Fusaro, Vincent A; DeLuca, Todd F; Tonellato, Peter J
2010-10-29
Disease-specific genetic information has been increasing at rapid rates as a consequence of recent improvements and massive cost reductions in sequencing technologies. Numerous systems designed to capture and organize this mounting sea of genetic data have emerged, but these resources differ dramatically in their disease coverage and genetic depth. With few exceptions, researchers must manually search a variety of sites to assemble a complete set of genetic evidence for a particular disease of interest, a process that is both time-consuming and error-prone. We designed a real-time aggregation tool that provides both comprehensive coverage and reliable gene-to-disease rankings for any disease. Our tool, called Genotator, automatically integrates data from 11 externally accessible clinical genetics resources and uses these data in a straightforward formula to rank genes in order of disease relevance. We tested the accuracy of coverage of Genotator in three separate diseases for which there exist specialty curated databases: Autism Spectrum Disorder, Parkinson's Disease, and Alzheimer Disease. Genotator is freely available at http://genotator.hms.harvard.edu. Genotator demonstrated that most of the 11 selected databases contain unique information about the genetic composition of disease, with 2514 genes found in only one of the 11 databases. These findings confirm that the integration of these databases provides a more complete picture than would be possible from any one database alone. Genotator successfully identified at least 75% of the top ranked genes for all three of our use cases, including a 90% concordance with the top 40 ranked candidates for Alzheimer Disease. As a meta-query engine, Genotator provides high coverage of both historical genetic research and recent advances in the genetic understanding of specific diseases. As such, Genotator provides a real-time aggregation of ranked data that remains current with the pace of research in the disease fields. Genotator's algorithm appropriately transforms query terms to match the input requirements of each targeted database and accurately resolves named synonyms to ensure full coverage of the genetic results with official nomenclature. Genotator generates an Excel-style output that is consistent across disease queries and is readily importable to other applications.
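The abstract describes the ranking formula only as "straightforward"; the simplest version consistent with that description is to rank genes by how many source databases report them, as sketched below with hypothetical query results.

```python
# Sketch of meta-query aggregation: rank genes by the number of source
# databases that report them for a disease. Genotator's actual formula
# is not published in this abstract; source counting is an assumption.
from collections import Counter

def rank_genes(sources):
    """sources: dict database_name -> set of gene symbols returned."""
    tally = Counter()
    for genes in sources.values():
        tally.update(genes)
    return tally.most_common()      # most widely reported genes first

sources = {                         # hypothetical query results
    "db_A": {"APOE", "APP", "PSEN1"},
    "db_B": {"APOE", "PSEN1"},
    "db_C": {"APOE", "CLU"},
}
print(rank_genes(sources))          # [('APOE', 3), ('PSEN1', 2), ...]
```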
Prototype Development: Context-Driven Dynamic XML Ophthalmologic Data Capture Application
Schwei, Kelsey M; Kadolph, Christopher; Finamore, Joseph; Cancel, Efrain; McCarty, Catherine A; Okorie, Asha; Thomas, Kate L; Allen Pacheco, Jennifer; Pathak, Jyotishman; Ellis, Stephen B; Denny, Joshua C; Rasmussen, Luke V; Tromp, Gerard; Williams, Marc S; Vrabec, Tamara R; Brilliant, Murray H
2017-01-01
Background: The capture and integration of structured ophthalmologic data into electronic health records (EHRs) has historically been a challenge. However, the importance of this activity for patient care and research is critical. Objective: The purpose of this study was to develop a prototype of a context-driven dynamic extensible markup language (XML) ophthalmologic data capture application for research and clinical care that could be easily integrated into an EHR system. Methods: Stakeholders in the medical, research, and informatics fields were interviewed and surveyed to determine data and system requirements for ophthalmologic data capture. On the basis of these requirements, an ophthalmology data capture application was developed to collect and store discrete data elements with important graphical information. Results: The context-driven data entry application supports several features, including ink-over drawing capability for documenting eye abnormalities, context-based Web controls that guide data entry based on preestablished dependencies, and an adaptable database or XML schema that stores Web form specifications and allows for immediate changes in form layout or content. The application utilizes Web services to enable data integration with a variety of EHRs for retrieval and storage of patient data. Conclusions: This paper describes the development process used to create a context-driven dynamic XML data capture application for optometry and ophthalmology. The list of ophthalmologic data elements identified as important for care and research can be used as a baseline list for future ophthalmologic data collection activities. PMID:28903894
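To make the idea of a stored form specification with context dependencies concrete, here is a hypothetical XML snippet and the few lines of Python needed to read it; the element and attribute names are invented for illustration and are not the study's actual schema.

    import xml.etree.ElementTree as ET

    # Hypothetical form specification: the second field is shown only when the
    # first takes the value "drusen" (a context-based dependency).
    FORM_SPEC = """
    <form name="fundus_exam">
      <field id="abnormality" type="choice" options="none,drusen,hemorrhage"/>
      <field id="drusen_size_um" type="number" depends-on="abnormality=drusen"/>
    </form>
    """

    root = ET.fromstring(FORM_SPEC)
    for field in root.iter("field"):
        dep = field.get("depends-on")
        print(field.get("id"), "->", dep if dep else "always shown")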
HISTORY OF TROPOSPHERIC OZONE FOR THE SAN BERNARDINO MOUNTAINS OF SOUTHERN CALIFORNIA, 1963-1999
A historical database of hourly O3 concentrations for Crestline, California in 1963-1999 has been developed based on all available representative oxidant/ozone monitoring data taken since 1963. All data were obtained from the California Air Resources Board and the U.S. Departmen...
"Stressed and Sexy": Lexical Borrowing in Cape Town Xhosa
ERIC Educational Resources Information Center
Dowling, Tessa
2011-01-01
Codeswitching by African language speakers in South Africa (whether speaking English or the first language) has been extensively commented on and researched. Many studies analyse the historical, political and sociolinguistic factors behind this growing phenomenon, but there appears to be little urgency about establishing a database of new…
The presentation presents an introduction to the Yaquina Bay Nutrient Case Study which provides approaches for development of estuarine nutrient criteria in the Pacific Northwest. As part of this effort, a database of historic and recent data has been assembled consisting of phy...
Profiling Chemicals Based on Chronic Toxicity Results from the U.S. EPA ToxRef Database
Thirty years of pesticide registration toxicity data have been historically stored as hardcopy and scanned documents by the U.S. Environmental Protection Agency (EPA). A significant portion of these data have now been processed into standardized and structured toxicity data with...
The presentation provides an introduction to the Yaquina Estuary Nutrient Case Study which includes considerations for development of estuarine nutrient criteria in the Pacific Northwest. As part of this effort, a database of historic and recent data has been assembled consistin...
Cracking the Egg: The South Carolina Digital Library's New Perspective
ERIC Educational Resources Information Center
Vinson, Christopher G.; Boyd, Kate Foster
2008-01-01
This article explores the historical foundations of the South Carolina Digital Library, a collaborative statewide program that ties together academic special collections and archives, public libraries, state government archives, and other cultural resource institutions in an effort to provide the state with a comprehensive database of online…
Counseling and Spirituality: A Historical Review
ERIC Educational Resources Information Center
Powers, Robin
2005-01-01
Evolution of the relationship between counseling and spirituality since 1840 is examined in terms of the number of publications that have appeared over time that include these terms. The author retrieved the data using the American Psychological Association's PsycINFO database. A similar search was done adding the term training. The rise of…
The dynamics of biogeographic ranges in the deep sea.
McClain, Craig R; Hardy, Sarah Mincks
2010-12-07
Anthropogenic disturbances such as fishing, mining, oil drilling, bioprospecting, warming, and acidification in the deep sea are increasing, yet generalities about deep-sea biogeography remain elusive. Owing to the lack of perceived environmental variability and geographical barriers, ranges of deep-sea species were traditionally assumed to be exceedingly large. In contrast, seamount and chemosynthetic habitats with reported high endemicity challenge the broad applicability of a single biogeographic paradigm for the deep sea. New research benefiting from higher resolution sampling, molecular methods and public databases can now more rigorously examine dispersal distances and species ranges on the vast ocean floor. Here, we explore the major outstanding questions in deep-sea biogeography. Based on current evidence, many taxa appear broadly distributed across the deep sea, a pattern replicated in both the abyssal plains and specialized environments such as hydrothermal vents. Cold waters may slow larval metabolism and development augmenting the great intrinsic ability for dispersal among many deep-sea species. Currents, environmental shifts, and topography can prove to be dispersal barriers but are often semipermeable. Evidence of historical events such as points of faunal origin and climatic fluctuations are also evident in contemporary biogeographic ranges. Continued synthetic analysis, database construction, theoretical advancement and field sampling will be required to further refine hypotheses regarding deep-sea biogeography.
NASA Technical Reports Server (NTRS)
Walls, Laurie K.; Kirk, Daniel; deLuis, Kavier; Haberbusch, Mark S.
2011-01-01
As space programs increasingly investigate various options for long-duration space missions, the accurate prediction of propellant behavior over long periods of time in a microgravity environment has become increasingly imperative. This has driven the development of a detailed, physics-based understanding of the slosh behavior of cryogenic propellants over a range of conditions and environments that are relevant for rocket and space storage applications. Recent advancements in computational fluid dynamics (CFD) models and hardware capabilities have enabled the modeling of complex fluid behavior in a microgravity environment. Historically, launch vehicles with moderate-duration upper-stage coast periods have contained very limited instrumentation to quantify propellant stratification and boil-off in these environments, thus the ability to benchmark these complex computational models is of great consequence. To benchmark enhanced CFD models, recent work focuses on establishing an extensive experimental database of liquid slosh under a wide range of relevant conditions. In addition, a mass gauging system specifically designed to provide high-fidelity measurements of both liquid stratification and liquid/ullage position in a microgravity environment has been developed. This publication will summarize the various experimental programs established to produce this comprehensive database and the unique flight measurement techniques.
THz Spectroscopy and Spectroscopic Database for Astrophysics
NASA Technical Reports Server (NTRS)
Pearson, John C.; Drouin, Brian J.
2006-01-01
Molecule-specific astronomical observations rely on precisely determined laboratory molecular data for interpretation. The Herschel Heterodyne Instrument for the Far Infrared, a suite of SOFIA instruments, and ALMA are each well placed to expose the limitations of available molecular physics data and spectral line catalogs. Herschel and SOFIA will observe at high spectral resolution over the entire far-infrared range. Accurate data to previously unimagined frequencies, including infrared ro-vibrational and ro-torsional bands, will be required for interpretation of the observations. Planned ALMA observations with a very small beam will reveal weaker emission features requiring accurate knowledge of higher quantum numbers and additional vibrational states. Historically, laboratory spectroscopy has been at the forefront of submillimeter technology development, but now astronomical receivers have an enormous capability advantage. Additionally, rotational spectroscopy is a relatively mature field attracting little interest from students and funding agencies. Molecular database maintenance is tedious and difficult to justify as research. This severely limits funding opportunities even though databases require the same level of expertise as research. We report the application of some relatively new receiver technology in a simple solid-state THz spectrometer that has the performance required to collect the laboratory data needed by astronomical observations. Further detail on the lack of preparation for upcoming missions by the JPL spectral line catalog is given.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-18
... Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State Historic Preservation Office(s), and to solicit their views and those of...
Waste Information Management System-2012 - 12114
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upadhyay, H.; Quintero, W.; Shoffner, P.
2012-07-01
The Waste Information Management System (WIMS) 2012 was updated to support the Department of Energy (DOE) accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to waste treatment and disposal were potential critical-path issues under the accelerated schedule. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of radioactive waste that would be generated by DOE sites over the next 40 years. Each local DOE site historically collected, organized, and displayed waste forecast information in separate and unique systems. In order for interested parties to understand and view the complete DOE complex-wide picture, the radioactive waste and shipment information of each DOE site needed to be entered into a common application. The WIMS application was therefore created to serve as a common application to improve stakeholder comprehension and improve DOE radioactive waste treatment and disposal planning and scheduling. WIMS allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, and technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, developed and deployed the web-based forecast and transportation system and is responsible for updating the radioactive waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system. WIMS continues to successfully accomplish the goals and objectives set forth by DOE for this project. It has replaced the historic process of each DOE site gathering, organizing, and reporting its waste forecast information using different databases and display technologies. In addition, WIMS meets DOE's objective to have the complex-wide waste forecast and transportation information available to all stakeholders and the public in one easy-to-navigate system. The enhancements made to WIMS since its initial deployment include the addition of new DOE sites and facilities, updated waste and transportation information, and the ability to easily display and print customized waste forecasts, disposition maps, GIS maps, and transportation information. The system also allows users to customize and generate reports over the web. These reports can be exported to various formats, such as Adobe PDF, Microsoft Excel, and Microsoft Word, and downloaded to the user's computer. Future enhancements will include migration of the database and application to the next level. A new data import interface will be developed to integrate 2012-13 forecast waste streams. In addition, the application is updated on a continuous basis based on DOE feedback.
Extreme precipitation and floods in the Iberian Peninsula and its socio-economic impacts
NASA Astrophysics Data System (ADS)
Ramos, A. M.; Pereira, S.; Trigo, R. M.; Zêzere, J. L.
2017-12-01
Extreme precipitation events in the Iberian Peninsula can induce floods and landslides that often have major socio-economic impacts. The DISASTER database gathered basic information on past floods and landslides that caused social consequences in Portugal for the period 1865-2015. This database was built under the assumption that the social consequences of floods and landslides are sufficiently relevant to be reported by newspapers, which provide the data source. Three extreme historical events were analysed in detail, taking into account their wide socio-economic impacts. The December 1876 event brought record precipitation and flooding, leading to all-time record flows in two large international rivers (the Tagus and the Guadiana); as a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. The 20-28 December 1909 event recorded the highest number of flood and landslide cases that occurred in Portugal in the period 1865-2015; it triggered the highest floods in 200 years at the Douro river's mouth and caused 89 fatalities in the northern regions of Portugal and Spain. More recently, the deadliest flash-flooding event to affect Portugal since at least the early 19th century took place on 25 and 26 November 1967, causing more than 500 fatalities in the Lisbon region. We provide a detailed analysis of each of these events, including their human impacts, precipitation analyses based on historical datasets, and the associated atmospheric circulation conditions from reanalysis datasets. Acknowledgements: This work was supported by the project FORLAND - Hydrogeomorphologic risk in Portugal: driving forces and application for land use planning [PTDC / ATPGEO / 1660/2014] funded by the Portuguese Foundation for Science and Technology (FCT), Portugal. A. M. Ramos was also supported by an FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012). The financial support for attending this workshop was also possible through FCT project UID/GEO/50019/2013 - Instituto Dom Luiz.
Salgueiro, Ana Rita; Pereira, Henrique Garcia; Rico, Maria-Teresa; Benito, Gerado; Díez-Herreo, Andrés
2008-02-01
A new statistical approach for preliminary risk evaluation of tailings dam breakage is presented and illustrated by a case study from the Mediterranean region. The objective of the proposed method is to establish an empirical scale of risk, from which guidelines for prioritizing the collection of further specific information can be derived. The method relies on a historical database containing, in essence, two sets of qualitative data: the first set concerns the variables that are observable before the disaster (e.g., type and size of the dam, its location, and state of activity), and the second refers to the consequences of the disaster (e.g., failure type, sludge characteristics, fatalities categorization, and downstream range of damage). Based on a modified form of correspondence analysis, where the second set of attributes is projected as "supplementary variables" onto the axes provided by the eigenvalue decomposition of the matrix referring to the first set, a "qualitative regression" is performed, relating the variables to be predicted (contained in the second set) with the "predictors" (the observable variables). On the grounds of the previously derived relationship, the risk of breakage in a new case can be evaluated, given its observable variables. The method was applied in a case study regarding a set of 13 test sites, where the ranking of risk obtained was validated by expert knowledge. Once validated, the procedure was included in the final output of the e-EcoRisk EU project (A Regional Enterprise Network Decision-Support System for Environmental Risk and Disaster Management of Large-Scale Industrial Spills), allowing for dynamic updating of the historical database and providing a prompt rough risk evaluation for a new case. The aim of this part of the overall project is to provide a quantified context of past failure cases to support analogue reasoning in preventing similar situations.
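The sketch below illustrates the supplementary-variable idea in a stripped-down form: axes are built from the observable (active) indicators alone, and the consequence indicators are projected onto them afterwards. A plain SVD stands in for the full correspondence analysis machinery, and the random data are placeholders.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, (30, 6)).astype(float)   # active: observable pre-disaster indicators
    Y = rng.integers(0, 2, (30, 4)).astype(float)   # supplementary: consequence indicators

    Xc = X - X.mean(axis=0)                         # center the active block
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    case_scores = U[:, :2] * s[:2]                  # case coordinates on the first two axes

    # Supplementary variables do not shape the axes; they are projected a posteriori.
    Yc = Y - Y.mean(axis=0)
    supp_coords = Yc.T @ U[:, :2]
    print(supp_coords.round(2))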
Spatiotemporal conceptual platform for querying archaeological information systems
NASA Astrophysics Data System (ADS)
Partsinevelos, Panagiotis; Sartzetaki, Mary; Sarris, Apostolos
2015-04-01
Spatial and temporal distribution of archaeological sites has been shown to associate with several attributes, including marine, water, mineral and food resources, climate conditions, geomorphological features, etc. In this study, archaeological settlement attributes are evaluated under various associations in order to provide a specialized query platform in a geographic information system (GIS). Towards this end, a spatial database is designed to include a series of archaeological findings for a secluded geographic area of Crete in Greece. The key categories of the geodatabase include the archaeological type (palace, burial site, village, etc.), temporal information on the habitation/usage period (pre-Minoan, Minoan, Byzantine, etc.), and the extracted geographical attributes of the sites (distance to sea, altitude, resources, etc.). Most of the related spatial attributes are extracted with readily available GIS tools. Additionally, a series of conceptual data attributes are estimated, including: the temporal relation of an era to a later one in terms of alteration of the archaeological type, topologic relations of various types and attributes, and spatial proximity relations between various types. These complex spatiotemporal relational measures reveal new attributes towards a better understanding of site selection by prehistoric and/or historic cultures, yet their potential combinations can become numerous. Therefore, after quantification, the above-mentioned attributes are classified according to their importance for archaeological site location modeling. Under this new classification scheme, the user may select a geographic area of interest and extract only the important attributes for a specific archaeological type. These extracted attributes may then be queried against the entire spatial database to provide a location map of possible new archaeological sites. This novel type of querying is robust, since the user does not have to type a standard SQL query but can instead graphically select an area of interest. In addition, according to the application at hand, novel spatiotemporal attributes and relations can be supported, towards the understanding of historical settlement patterns.
Schurr, K.M.; Cox, S.E.
1994-01-01
The Pesticide-Application Data-Base Management System was created as a demonstration project and was tested with data submitted to the Washington State Department of Agriculture by pesticide applicators from a small geographic area. These data were entered into the Department's relational data-base system and uploaded into the system's ARC/INFO files. Locations for pesticide applications are assigned within the Public Land Survey System grids, and ARC/INFO programs in the Pesticide-Application Data-Base Management System can subdivide each survey section into sixteen idealized quarter-quarter sections for display map grids. The system provides data retrieval and geographic information system plotting capabilities from a menu of seven basic retrieval options. Additionally, ARC/INFO coverages can be created from the retrieved data when required for particular applications. The Pesticide-Application Data-Base Management System, or the general principles used in the system, could be adapted to other applications or to other states.
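A toy version of the quarter-quarter subdivision mentioned above: an idealized square section is split into a 4 x 4 grid of cells. Real PLSS sections are not perfect squares, so this shows only the idealized display-grid logic; the function name and coordinates are illustrative.

    def quarter_quarters(x0, y0, side=1.0):
        """Yield (row, col, xmin, ymin, xmax, ymax) for the 16 idealized cells."""
        step = side / 4.0
        for row in range(4):
            for col in range(4):
                yield (row, col,
                       x0 + col * step, y0 + row * step,
                       x0 + (col + 1) * step, y0 + (row + 1) * step)

    for cell in quarter_quarters(0.0, 0.0):
        print(cell)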
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, R.; Gomez, D.; Coterillo Coz, I.; Wojcik, D.
2012-12-01
At CERN, a number of key database applications are running on user-managed MySQL database services. The Database on Demand project was born out of an idea to provide the CERN user community with an environment to develop and run database services outside of the centralised Oracle-based database services. The Database on Demand (DBoD) empowers the user to perform certain actions that had traditionally been done by database administrators (DBAs), providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines, presently the open community version of MySQL and a single-instance Oracle database server. This article describes a technology approach to face this challenge, the service level agreement (SLA) that the project provides, and an evolution of possible scenarios.
Tucker, Raymond P; Wingate, LaRicka R; O'Keefe, Victoria M
2016-07-01
Recent research has indicated that historical loss may play an important role in the experience of depression symptoms in American Indian/Alaska Native people. Increased frequency of historical loss thinking has been related to symptoms of depression and other pervasive psychological outcomes (e.g., substance abuse) in American Indian and Canadian First Nations communities. The current study investigated how aspects of the ethnic minority experience relate to the incidence of historical loss thinking and symptoms of depression in American Indian adults. Data are presented from 123 self-identified American Indian college students (ages 18-25, 67.50% female) who participated in the study in return for course credit and/or entry into a raffle for gift cards. Participants completed the Adolescent Historical Loss Scale (AHLS), the Scale of Ethnic Experiences (SEE), and the Center for Epidemiologic Studies-Depression Scale (CES-D). Indirect effects of ethnic experience on symptoms of depression through historical loss thinking were calculated with nonparametric bootstrapping procedures. Results indicated that strong ethnic identification, a desire to predominantly socialize with other American Indians, and perceptions of discrimination were associated with increased historical loss thinking. Feelings of comfort and assimilation with mainstream American culture were negatively related to historical loss thinking. Only the perception of discrimination was directly related to symptoms of depression; however, ethnic identification and the preference to predominantly socialize with other American Indians were both indirectly related to elevated depressive symptoms through increased historical loss thinking. The clinical implications of these results are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
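For readers unfamiliar with the bootstrap step, the sketch below resamples cases to get a percentile confidence interval for an indirect (mediated) effect; the use of plain least-squares fits and synthetic data are assumptions for illustration, not the study's exact procedure.

    import numpy as np

    def indirect_effect(x, m, y):
        a = np.polyfit(x, m, 1)[0]   # slope of mediator on predictor
        b = np.linalg.lstsq(np.c_[x, m, np.ones_like(x)], y, rcond=None)[0][1]
        return a * b                 # indirect effect = a * b

    def bootstrap_ci(x, m, y, n_boot=5000, seed=0):
        rng = np.random.default_rng(seed)
        n = len(x)
        est = [indirect_effect(x[i], m[i], y[i])
               for i in (rng.integers(0, n, n) for _ in range(n_boot))]
        return np.percentile(est, [2.5, 97.5])   # 95% percentile interval

    rng = np.random.default_rng(1)
    x = rng.normal(size=120)
    m = 0.5 * x + rng.normal(size=120)       # mediator partly driven by predictor
    y = 0.4 * m + rng.normal(size=120)       # outcome partly driven by mediator
    print(bootstrap_ci(x, m, y, n_boot=2000))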
Introducing the Global Fire WEather Database (GFWED)
NASA Astrophysics Data System (ADS)
Field, R. D.
2015-12-01
The Canadian Fire Weather Index (FWI) System is the most widely used fire danger rating system in the world. We have developed a global database of daily FWI System calculations beginning in 1980, called the Global Fire WEather Database (GFWED), gridded to a spatial resolution of 0.5° latitude by 2/3° longitude. Input weather data were obtained from the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA), along with two different estimates of daily precipitation from rain gauges over land. FWI System Drought Code (DC) calculations from the gridded datasets were compared to calculations from individual weather station data for a representative set of 48 stations in North, Central and South America, Europe, Russia, Southeast Asia and Australia. Gridded and station-based calculations differed most at low latitudes for the strictly MERRA-based calculations. Strong biases could be seen in either direction: MERRA-based DC over the Mato Grosso in Brazil reached unrealistically high values exceeding DC=1500 during the dry season, but was too low over Southeast Asia during the dry season. These biases are consistent with those previously identified in MERRA's precipitation and reinforce the need to consider alternative sources of precipitation data. GFWED is being used by researchers around the world for analyzing historical relationships between fire weather and fire activity at large scales, for identifying large-scale atmosphere-ocean controls on fire weather, and for calibrating FWI-based fire prediction models. These applications will be discussed. More information on GFWED can be found at http://data.giss.nasa.gov/impacts/gfwed/
Development of an agricultural job-exposure matrix for British Columbia, Canada.
Wood, David; Astrakianakis, George; Lang, Barbara; Le, Nhu; Bert, Joel
2002-09-01
Farmers in British Columbia (BC), Canada, have been shown to have unexplained elevated proportional mortality rates for several cancers. Because agricultural exposures have never been documented systematically in BC, a quantitative agricultural job-exposure matrix (JEM) was developed containing exposure assessments from 1950 to 1998. This JEM was developed to document historical exposures and to facilitate future epidemiological studies. Available information regarding BC farming practices was compiled and checklists of potential exposures were produced for each crop. Exposures identified included chemical, biological, and physical agents. Interviews with farmers and agricultural experts were conducted using the checklists as a starting point. This allowed the creation of an initial or 'potential' JEM based on three axes: exposure agent, 'type of work' and time. The 'type of work' axis was determined by combining several variables: region, crop, job title and task. This allowed for a complete description of exposures. Exposure assessments were made quantitatively where data allowed, or by a dichotomous variable (exposed/unexposed). Quantitative calculations were divided into re-entry and application scenarios. 'Re-entry' exposures were quantified using a standard exposure model with some modification, while application exposure estimates were derived using data from the North American Pesticide Handlers Exposure Database (PHED). As expected, exposures differed between crops and job titles both quantitatively and qualitatively. Of the 290 agents included in the exposure axis, 180 were pesticides. Over 3000 estimates of exposure were conducted; 50% of these were quantitative. Each quantitative estimate was expressed as a daily absorbed dose. Exposure estimates were then rated as high, medium, or low by comparing them with their respective oral chemical reference dose (RfD) or Acceptable Daily Intake (ADI). These data were mainly obtained from the US Environmental Protection Agency (EPA) Integrated Risk Information System database. Of the quantitative estimates, 74% were rated as low (<100% of the RfD) and only 10% were rated as high (>500%). The JEM resulting from this study fills a void concerning exposures for BC farmers and farm workers. While only limited validation of the assessments was possible, this JEM can serve as a benchmark for future studies. Preliminary analysis at the BC Cancer Agency (BCCA) using the JEM with prostate cancer records from a large cancer and occupation study/survey has already shown promising results. Development of this JEM provides a useful model for developing historical quantitative exposure estimates where there is very little documented information available.
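The rating step lends itself to a one-line rule; the sketch below follows the cutoffs stated in the abstract (low below 100% of the reference dose, high above 500%), with everything between treated as medium, an inference since the abstract does not state the middle band explicitly.

    def rate_exposure(daily_dose_mg_per_kg, rfd_mg_per_kg):
        """Bin a daily absorbed dose by its percentage of the reference dose (RfD)."""
        pct = 100.0 * daily_dose_mg_per_kg / rfd_mg_per_kg
        if pct < 100.0:
            return "low"
        if pct > 500.0:
            return "high"
        return "medium"

    print(rate_exposure(0.002, 0.005))   # 40% of the RfD -> 'low'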
NASA Astrophysics Data System (ADS)
Eichner, J. F.; Steuer, M.; Loew, P.
2016-12-01
Past natural catastrophes offer valuable information for present-day risk assessment. To make use of historic loss data, one has to find a setting that enables comparison (over place and time) of historic events as if they happened under today's conditions. By means of loss data normalization, the influence of socio-economic development, the fundamental driver in this context, can be eliminated; the data then permit the deduction of risk-relevant information and allow the study of other driving factors, such as influences from climate variability and climate change or changes in vulnerability. Munich Re's NatCatSERVICE database includes, for each historic loss event, the geographic coordinates of all locations and regions that were affected in a relevant way. These locations form the basis for what is known as the loss footprint of an event. Here we introduce a robust, state-of-the-art method for global loss data normalization. The presented peril-specific loss footprint normalization method adjusts direct economic loss data for economic growth within each loss footprint (using gross cell product data as a proxy for local economic growth) and makes loss data comparable over time. To achieve a comparative setting for supra-regional economic differences, we categorize the normalized loss values (together with information on fatalities), based on the World Bank income groups, into five catastrophe classes, from minor to catastrophic. The data treated in such a way allow (a) the study of the influence of improved reporting of small-scale loss events over time and (b) the application of standard (stationary) extreme value statistics (here: the peaks-over-threshold method) to compile estimates of extreme and extrapolated loss magnitudes, such as a "100-year event", on a global scale. Examples of such results will be shown.
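The core normalization step reduces to scaling each loss by the economic growth inside its footprint. A minimal sketch, with gross cell product (GCP) sums over the footprint as inputs and illustrative numbers:

    def normalize_loss(loss_at_event, gcp_footprint_event_year, gcp_footprint_today):
        """Express a historical direct loss in present-day economic terms."""
        growth = gcp_footprint_today / gcp_footprint_event_year
        return loss_at_event * growth

    # A 1980 event with USD 0.8 bn direct loss whose footprint GCP has tripled:
    print(normalize_loss(0.8e9, 1.0, 3.0))   # -> 2.4e9 in present-day terms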
Williamson, Tanja N.; Odom, Kenneth R.; Newson, Jeremy K.; Downs, Aimee C.; Nelson, Hugh L.; Cinotto, Peter J.; Ayers, Mark A.
2009-01-01
The Water Availability Tool for Environmental Resources (WATER) was developed in cooperation with the Kentucky Division of Water to provide a consistent and defensible method of estimating streamflow and water availability in ungaged basins. WATER is process-oriented; it is based on the TOPMODEL code and incorporates historical water-use data together with physiographic data that quantitatively describe topography and soil-water storage. The result is a user-friendly decision tool that can estimate water availability in non-karst areas of Kentucky without additional data or processing. The model runs on a daily time step, and critical source data include a historical record of daily temperature and precipitation, digital elevation models (DEMs), the Soil Survey Geographic Database (SSURGO), and historical records of water discharges and withdrawals. The model was calibrated and statistically evaluated for 12 basins by comparing the estimated discharge to that observed at U.S. Geological Survey streamflow-gaging stations. When statistically evaluated over a 2,119-day time period, the discharge estimates showed a bias of -0.29 to 0.42, a root mean square error of 1.66 to 5.06, a correlation of 0.54 to 0.85, and a Nash-Sutcliffe efficiency of 0.26 to 0.72. The parameter and input modifications that most significantly improved the accuracy and precision of streamflow-discharge estimates were the addition of Next-Generation Radar (NEXRAD) precipitation data, a rooting depth of 30 centimeters, and a TOPMODEL scaling parameter (m) derived directly from SSURGO data and multiplied by an adjustment factor of 0.10. No site-specific optimization was used.
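For reference, the evaluation statistics quoted above can be computed as follows; the formulas are the standard definitions, which the report itself does not spell out.

    import numpy as np

    def eval_stats(sim, obs):
        err = sim - obs
        return {
            "bias": err.mean(),
            "rmse": np.sqrt((err ** 2).mean()),
            "corr": np.corrcoef(sim, obs)[0, 1],
            "nse": 1.0 - (err ** 2).sum() / ((obs - obs.mean()) ** 2).sum(),
        }

    obs = np.array([1.0, 2.0, 4.0, 3.0, 5.0])   # observed daily discharge (illustrative)
    sim = np.array([1.2, 1.8, 3.5, 3.4, 4.6])   # simulated daily discharge (illustrative)
    print(eval_stats(sim, obs))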
NASA Astrophysics Data System (ADS)
Roberti, Gioachino; Ward, Brent; van Wyk de Vries, Benjamin; Perotti, Luigi; Giardino, Marco; Friele, Pierre; Clague, John
2017-04-01
Topographic modeling is becoming more accessible due to the development of structure from motion (SFM) and multi-view stereo (MVS) image matching algorithms in digital photogrammetry. Many studies utilize SFM-MVS with either UAV or hand-held consumer-grade digital cameras. However, little work has been done using SFM-MVS with digitized historical air photos. Large databases of historical air photos are available in university, public, and government libraries, commonly as paper copies. In many instances, the photos are in poor condition (e.g., deformed by humidity, scratched, or annotated). In addition, the negatives, as well as metadata on the camera and the flight mission, may be missing. Processing such photos using classic stereo-photogrammetry is difficult and in many instances impossible. Yet these photos can provide a valuable archive of geomorphic changes. In this study, we digitized over 1000 vertical air photos of the Mount Meager massif (British Columbia, Canada), acquired during flights between 1947 and 2006. We processed the scans using the commercial SFM-MVS software package PhotoScan. PhotoScan provided high-quality orthophotos (0.42-1.13 m/pixel) and DTMs (1-5 m/pixel). We used the orthophotos to document glacier retreat and deep-seated gravitational deformation over the 60-year photo period. Notably, we reconstructed the geomorphic changes that led to the very large (~50 × 10⁶ m³) 2010 failure of the south flank of Meager Peak and also documented other unstable areas that might fail catastrophically in the future. This technique can be applied to other photosets to provide rapid, high-quality cartographic products that allow researchers to track landscape changes over large areas over the past century.
Jain, Anubhav; Persson, Kristin A.; Ceder, Gerbrand
2016-03-24
Materials innovations enable new technological capabilities and drive major societal advancements, but have historically required long and costly development cycles. The Materials Genome Initiative (MGI) aims to greatly reduce this time and cost. Here, we focus on data reuse in the MGI and, in particular, discuss the impact on the research community of three different computational databases based on density functional theory methods. Finally, we discuss and provide recommendations on technical aspects of data reuse, outline remaining fundamental challenges, and present an outlook on the future of MGI's vision of data sharing.
Partitioning medical image databases for content-based queries on a Grid.
Montagnat, J; Breton, V; Magnin, I E
2005-01-01
In this paper we study the impact of executing a medical image database query application on the grid. To lower the total computation time, the image database is partitioned into subsets to be processed on different grid nodes. A theoretical model of the application complexity and estimates of the grid execution overhead are used to efficiently partition the database. We show results demonstrating that smart partitioning of the database can lead to significant improvements in terms of total computation time. Grids are promising for content-based image retrieval in medical databases.
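The partitioning trade-off described above can be captured in a toy cost model: per-node overhead grows with the number of nodes while the parallel work shrinks, so an intermediate partition count minimizes total time. The constants below are illustrative assumptions, not the paper's calibrated model.

    def total_time(n_nodes, n_images, sec_per_image, overhead_per_node):
        work = n_images * sec_per_image / n_nodes   # perfectly parallel work share
        return overhead_per_node * n_nodes + work   # plus per-node grid overhead

    best = min(range(1, 65), key=lambda n: total_time(n, 10000, 0.5, 20.0))
    print(best, round(total_time(best, 10000, 0.5, 20.0), 1))   # ~16 nodes is optimal here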
NASA Astrophysics Data System (ADS)
Henze, F.; Magdalinski, N.; Schwarzbach, F.; Schulze, A.; Gerth, Ph.; Schäfer, F.
2013-07-01
Information systems play an important role in historical research as well as in heritage documentation. As part of a joint research project of the German Archaeological Institute, the Brandenburg University of Technology Cottbus and the Dresden University of Applied Sciences, a web-based documentation system is currently being developed which can easily be adapted to the needs of different projects with individual scientific concepts, methods and questions. Based on open-source and standardized technologies, it will focus on open and well-documented interfaces to ease the dissemination and re-use of its content via web services and to communicate with desktop applications for further evaluation and analysis. The core of the system is a generic data model that represents a wide range of topics and methods of archaeological work. The provision of a concerted set of initial themes and attributes will make cross-project analysis of research data possible. The development of enhanced search and retrieval functionalities will simplify the processing and handling of large heterogeneous data sets. To achieve a high degree of interoperability with existing external data, systems and applications, standardized interfaces will be integrated. The analysis of spatial data will be possible through the integration of web-based GIS functions. As an extension to this, customized functions for the storage, processing and provision of 3D geodata are being developed. In this contribution, system requirements and concepts are presented and discussed, with a particular focus on the generic data model and the derived database schema. The research work on enhanced search and retrieval capabilities is illustrated by prototypical developments, as well as by concepts and first implementations for an integrated 2D/3D Web-GIS.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...
Applications of the Cambridge Structural Database in organic chemistry and crystal chemistry.
Allen, Frank H; Motherwell, W D Samuel
2002-06-01
The Cambridge Structural Database (CSD) and its associated software systems have formed the basis for more than 800 research applications in structural chemistry, crystallography and the life sciences. Relevant references, dating from the mid-1970s, and brief synopses of these papers are collected in a database, DBUse, which is freely available via the CCDC website. This database has been used to review research applications of the CSD in organic chemistry, including supramolecular applications, and in organic crystal chemistry. The review concentrates on applications that have been published since 1990 and covers a wide range of topics, including structure correlation, conformational analysis, hydrogen bonding and other intermolecular interactions, studies of crystal packing, extended structural motifs, crystal engineering and polymorphism, and crystal structure prediction. Applications of CSD information in studies of crystal structure precision, the determination of crystal structures from powder diffraction data, together with applications in chemical informatics, are also discussed.
The Historical and InstruMental SEismic cataLogue for France (HIMSELF)
NASA Astrophysics Data System (ADS)
Manchuel, Kevin; Traversa, Paola; Baumont, David; Cara, Michel; Nayman, Emmanuelle; Durouchoux, Christophe
2017-04-01
In regions that undergo low deformation rates, as is the case for metropolitan France, the use of historical seismicity in addition to instrumental seismicity is necessary when dealing with seismic hazard assessment. The goal is to extend the observation time window to better assess the seismogenic behavior of the crust and of specific geological structures. This paper presents the strategy adopted to develop a parametric earthquake catalogue, using Mw as the reference magnitude scale, that covers metropolitan France for both instrumental and historical times. Works performed in the frame of the SiHex (Cara et al., 2015) and SIGMA (EDF-CEA-AREVA-ENEL) projects, on instrumental and historical earthquakes respectively, are combined to produce the Historical and InstruMental SEismic cataLogue for France (HIMSELF). The SiHex catalogue is composed of 40 000 natural earthquakes, for which hypocentral locations (inferred from a homogeneous 1D location process and observatories' regional estimates) and Mw magnitudes (from specific analysis of crustal-wave coda for ML-LDG > 4.0, and from magnitude conversion laws) are given. In the frame of the SIGMA research program, an integrated study of historical seismicity was carried out, from the calibration of Empirical Macroseismic Prediction Equations (EMPEs) in Mw (Baumont et al., submitted) to their application to earthquakes of the SISFRANCE macroseismic database (BRGM, EDF, IRSN), through a dedicated strategy developed by Traversa et al. (submitted) to compute their Mw magnitude and depth. This inversion process takes into account, with a Logic Tree (LT) approach, the main specificities of the macroseismic fields reported by SISFRANCE. It also captures the epistemic uncertainties associated with the macroseismic data and with the selection of EMPEs. For events that exhibit a poorly constrained macroseismic field (mainly old, cross-border, or offshore earthquakes), joint inversion of Mw and depth is not possible and an a priori depth needs to be set to calculate Mw. Regional a priori depths are defined here based on an analysis of the distribution of depths computed for earthquakes with a well-constrained macroseismic field, for which joint inversion of Mw and depth is possible. In the end, the seismological parameters of 27% of the SISFRANCE earthquakes were jointly inverted; for the other 73%, Mw was calculated assuming a priori depths. The HIMSELF catalogue is composed of the SIGMA historical parametric catalogue from 463 to 1965 and of the SiHex instrumental one from 1965 to 2009. All magnitudes are expressed in Mw, which makes this catalogue directly usable as an input for seismic hazard studies, whether carried out in a probabilistic or a deterministic way. Uncertainties on magnitudes and depths are provided in this study for historical earthquakes, following the calculation scheme presented in Traversa et al. (submitted). Uncertainties on magnitudes for instrumental events are from Cara et al. (2016).
First Look: TRADEMARKSCAN Database.
ERIC Educational Resources Information Center
Fernald, Anne Conway; Davidson, Alan B.
1984-01-01
Describes database produced by Thomson and Thomson and available on Dialog which contains over 700,000 records representing all active federal trademark registrations and applications for registrations filed in United States Patent and Trademark Office. A typical record, special features, database applications, learning to use TRADEMARKSCAN, and…
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 6 2011-01-01 2011-01-01 false Archeological and historical laws and Executive orders applicable to NRCS-assisted programs. 656.2 Section 656.2 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 6 2012-01-01 2012-01-01 false Archeological and historical laws and Executive orders applicable to NRCS-assisted programs. 656.2 Section 656.2 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 6 2013-01-01 2013-01-01 false Archeological and historical laws and Executive orders applicable to NRCS-assisted programs. 656.2 Section 656.2 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES...
Development and operations of the astrophysics data system
NASA Technical Reports Server (NTRS)
Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)
2005-01-01
Abstract service:
- Continued regular updates of abstracts in the databases, both at SAO and at all mirror sites.
- Modified loading scripts to accommodate changes in data format (PhyS).
- Discussed data deliveries with providers to clear up problems with format or other errors (EGU).
- Continued inclusion of large numbers of historical literature volumes and physics conference volumes xeroxed from the library.
- Performed systematic fixes on some data sets in the database to account for changes in article numbering (AGU journals).
- Implemented linking of ADS bibliographic records with multimedia files.
- Debugged and fixed obscure connection problems with the ADS Korean mirror site which were preventing successful updates of the data holdings.
- Wrote a procedure to parse citation data and characterize an ADS record based on its citation ratios within each database.
Database architectures for Space Telescope Science Institute
NASA Astrophysics Data System (ADS)
Lubow, Stephen
1993-08-01
At STScI, nearly all large applications require database support. A general-purpose architecture has been developed and is in use that relies upon an extended client-server paradigm. Processing is in general distributed across three processes, each of which generally resides on its own processor. Database queries are evaluated on one such process, called the DBMS server. The DBMS server software is provided by a database vendor. The application issues database queries and is called the application client. This client uses a set of generic DBMS application programming calls through our STDB/NET programming interface. Intermediate between the application client and the DBMS server is the STDB/NET server. This server accepts generic query requests from the application and converts them into the specific requirements of the DBMS server. In addition, it accepts query results from the DBMS server and passes them back to the application. Typically the STDB/NET server is local to the DBMS server, while the application client may be remote. The STDB/NET server provides additional capabilities such as database deadlock restart and performance monitoring. This architecture is currently in use for some major STScI applications, including the ground support system. We are currently investigating means of providing ad hoc query support to users through the above architecture. Such support is critical for providing flexible user-interface capabilities. The Universal Relation advocated by Ullman, Kernighan, and others appears promising. In this approach, the user sees the entire database as a single table, thereby freeing the user from needing to understand the detailed schema. A software layer provides the translation between the user and detailed-schema views of the database. However, many subtle issues arise in making this transformation. We are currently exploring this scheme for use in the Hubble Space Telescope user interface to the data archive system (DADS).
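A hypothetical sketch of the three-process flow: the client issues a generic query, the intermediate server translates it into vendor SQL, and the DBMS server executes it. All function names are invented for illustration; the real STDB/NET interface is not shown in the abstract.

    def stdbnet_query(generic_query):
        """Application-client entry point for a generic query."""
        vendor_sql = translate(generic_query)   # role of the STDB/NET server
        return dbms_execute(vendor_sql)         # role of the vendor DBMS server

    def translate(q):
        cols = ", ".join(q["select"])
        return f'SELECT {cols} FROM {q["from"]} WHERE {q["where"]}'

    def dbms_execute(sql):
        print("DBMS server receives:", sql)     # stand-in for the real engine
        return []

    stdbnet_query({"select": ["target", "ra", "dec"],
                   "from": "observations", "where": "prop_id = 1234"})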
Application of cloud database in the management of clinical data of patients with skin diseases.
Mao, Xiao-fei; Liu, Rui; DU, Wei; Fan, Xue; Chen, Dian; Zuo, Ya-gang; Sun, Qiu-ning
2015-04-01
To evaluate the needs for and applications of a cloud database in the daily practice of a dermatology department. A cloud database was established for systemic scleroderma and localized scleroderma. Paper forms were used to record the original data, including personal information, pictures, specimens, blood biochemical indicators, skin lesions, and scores on self-rating scales. The results were input into the cloud database. The applications of the cloud database in the dermatology department were summarized and analyzed. The personal and clinical information of 215 systemic scleroderma patients and 522 localized scleroderma patients was included and analyzed using the cloud database. The disease status, quality of life, and prognosis were obtained by statistical calculations. The cloud database can efficiently and rapidly store and manage the data of patients with skin diseases. As a simple, prompt, safe, and convenient tool, it can be used in patient information management, clinical decision-making, and scientific research.
SQLGEN: a framework for rapid client-server database application development.
Nadkarni, P M; Cheung, K H
1995-12-01
SQLGEN is a framework for rapid client-server relational database application development. It relies on an active data dictionary on the client machine that stores metadata on one or more database servers to which the client may be connected. The dictionary generates dynamic Structured Query Language (SQL) to perform common database operations; it also stores information about the access rights of the user at log-in time, which is used to partially self-configure the behavior of the client to disable inappropriate user actions. SQLGEN uses a microcomputer database as the client to store metadata in relational form, to transiently capture server data in tables, and to allow rapid application prototyping followed by porting to client-server mode with modest effort. SQLGEN is currently used in several production biomedical databases.
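A minimal sketch of dictionary-driven SQL generation in the spirit of SQLGEN: column metadata held on the client yields a parameterized INSERT at run time. The metadata layout and all names are illustrative assumptions, not SQLGEN's actual structures.

    DICTIONARY = {
        "patient": ["patient_id", "name", "birth_date"],   # client-side metadata
    }

    def gen_insert(table, values):
        """Build a parameterized INSERT from the data dictionary."""
        cols = [c for c in DICTIONARY[table] if c in values]
        placeholders = ", ".join("?" for _ in cols)
        sql = f'INSERT INTO {table} ({", ".join(cols)}) VALUES ({placeholders})'
        return sql, [values[c] for c in cols]

    print(gen_insert("patient", {"patient_id": 7, "name": "Doe"}))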
Academic Oral History: Life Review in Spite of Itself.
ERIC Educational Resources Information Center
Ryant, Carl
The process and content of the life review should not be separated from the creation of an oral history. Several projects, undertaken at the University of Louisville Oral History Center, support the therapeutic aspects of reminiscence. The dichotomy between oral history, as an historical database, and life review, as a therapeutic exercise, breaks…
The Lawyers in the 16th-18th Century's Germany: A Historical Database.
ERIC Educational Resources Information Center
Ranieri, Filippo
1990-01-01
Investigates the sociological backgrounds of German lawyers of the Holy Roman Empire through an analysis of the dissertations and disputations written during the seventeenth and eighteenth centuries. Focuses on their university education, family circumstances, and careers. Creates an information data bank to carry out the project. Predicts further…
The 21st Century Writing Program: Collaboration for the Common Good
ERIC Educational Resources Information Center
Moberg, Eric
2010-01-01
The purpose of this report is to review the literature on theoretical frameworks, best practices, and conceptual models for the 21st century collegiate writing program. Methods include electronic database searches for recent and historical peer-reviewed scholarly literature on collegiate writing programs. The author analyzed over 65 sources from…
Database on Demand: insight how to build your own DBaaS
NASA Astrophysics Data System (ADS)
Gaspar Aparicio, Ruben; Coterillo Coz, Ignacio
2015-12-01
At CERN, a number of key database applications are running on user-managed MySQL, PostgreSQL and Oracle database services. The Database on Demand (DBoD) project was born out of an idea to provide the CERN user community with an environment to develop and run database services as a complement to the central Oracle-based database service. The Database on Demand empowers the user to perform certain actions that had traditionally been done by database administrators, providing an enterprise platform for database applications. It also allows the CERN user community to run different database engines; presently, engines from three major RDBMS (relational database management system) vendors are offered. In this article we present the current status of the service after almost three years of operation, some insight into our redesigned software engineering, and its near-future evolution.
Southworth, C. Scott; Brezinski, David K.; Orndorff, Randall C.; Chirico, Peter G.; Lagueux, Kerry M.
2001-01-01
The Chesapeake and Ohio (C&O) Canal National Historical Park is unique in that it is the only land within the National Park system that crosses five physiographic provinces along a major river. From Georgetown, District of Columbia (D.C.), to Cumberland, Maryland (Md.), the C&O Canal provides an opportunity to examine the geologic history of the central Appalachian region and how the canal contributed to the development of this area. The geologic map data cover the 184.5-mile-long park in a 2-mile-wide corridor centered on the Potomac River.
DOT National Transportation Integrated Search
2002-02-26
This document, the Introduction to the Enhanced Logistics Intratheater Support Tool (ELIST) Mission Application and its Segments, satisfies the following objectives: It identifies the mission application, known in brief as ELIST, and all seven ...
Granitto, Matthew; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.
2013-01-01
The Alaska Geochemical Database Version 2.0 (AGDB2) contains new geochemical data compilations in which each geologic material sample has one "best value" determination for each analyzed species, greatly improving speed and efficiency of use. Like the Alaska Geochemical Database (AGDB, http://pubs.usgs.gov/ds/637/) before it, the AGDB2 was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This relational database, created from the Alaska Geochemical Database (AGDB) that was released in 2011, serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables in several different formats describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey personnel and analyzed in U.S. Geological Survey laboratories or, under contract, in commercial analytical laboratories. These data represent analyses of samples collected as part of various U.S. Geological Survey programs and projects from 1962 through 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy-mineral concentrate samples are included in this database. The AGDB2 includes historical geochemical data originally archived in the U.S. Geological Survey Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s, and in the U.S. Geological Survey PLUTO database, used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all U.S. Geological Survey geochemical data resulted in a significantly improved Alaska geochemical dataset. USGS data that were not previously in the NGDB, because the data predate the earliest U.S. Geological Survey geochemical databases or were once excluded for programmatic reasons, are included here in the AGDB2 and will be added to the NGDB. The AGDB2 data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB2 data provided in the linked database may be updated or changed periodically.
NASA Astrophysics Data System (ADS)
McEnery, J. A.; Jitkajornwanich, K.
2012-12-01
This presentation will describe the methodology and overall system development by which a benchmark dataset of precipitation information has been used to characterize the depth-area-duration relations in heavy rain storms occurring over regions of Texas. Over the past two years, project investigators, along with the National Weather Service (NWS) West Gulf River Forecast Center (WGRFC), have developed and operated a gateway data system to ingest, store, and disseminate NWS multi-sensor precipitation estimates (MPE). As a pilot project of the Integrated Water Resources Science and Services (IWRSS) initiative, this testbed uses a Structured Query Language (SQL) server to maintain a full archive of current and historic MPE values within the WGRFC service area. These time series values are made available for public access as web services in the standard WaterML format. Having this volume of information maintained in a comprehensive database now allows the use of relational analysis capabilities within SQL to leverage these multi-sensor precipitation values and produce a valuable derivative product. The area of focus for this study is North Texas, and the analysis utilizes values that originated from the WGRFC, one of three River Forecast Centers currently represented in the holdings of this data system. Over the past two decades, NEXRAD radar has dramatically improved the ability to record rainfall. The resulting hourly MPE values, distributed over an approximate 4 km by 4 km grid, are considered by the NWS to be the "best estimate" of rainfall. The data server provides an accepted standard interface for internet access to the largest time-series dataset of NEXRAD-based MPE values ever assembled. An automated script has been written to search and extract storms over the 18-year period of record from the contents of this massive historical precipitation database. Not only can it extract site-specific storms, but also duration-specific storms and storms separated by user-defined inter-event periods. A separate storm database has been created to store the selected output. By storing output within tables in a separate database, we can make use of powerful SQL capabilities to perform flexible pattern analysis. Previous efforts have made use of historic data from limited clusters of irregularly spaced physical gauges. Spatial extent of the observational network has been a limiting factor. The relatively dense distribution of MPE provides a virtual mesh of observations stretched over the landscape. This work combines a unique hydrologic data resource with programming and database analysis to characterize storm depth-area-duration relationships.
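A minimal sketch of the inter-event storm separation described above, assuming hourly MPE totals held in a pandas time series; the six-hour dry-gap threshold and all names are illustrative, not the project's actual code:

```python
import pandas as pd

def extract_storms(rain: pd.Series, min_gap_hours: int = 6) -> dict:
    """Split an hourly rainfall series into storms separated by dry spells
    of at least min_gap_hours (the user-defined inter-event period)."""
    wet = rain[rain > 0]
    # a new storm starts whenever the gap to the previous wet hour is too long
    new_storm = wet.index.to_series().diff() > pd.Timedelta(hours=min_gap_hours)
    storm_id = new_storm.cumsum()
    return {int(sid): grp for sid, grp in wet.groupby(storm_id.values)}

idx = pd.date_range("2010-06-01", periods=12, freq="h")
rain = pd.Series([0, 2, 5, 0, 0, 0, 0, 0, 0, 3, 1, 0], index=idx, dtype=float)
print(len(extract_storms(rain)), "storms")  # -> 2 storms
```

In the project itself this logic runs server-side in SQL over the full archive; the pandas version just makes the separation rule explicit.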
7 CFR 785.7 - Distribution of Federal grant funds.
Code of Federal Regulations, 2010 CFR
2010-01-01
... mediation services (historical and projected); (2) Scope of mediation services; (3) Service record of the... mediation services, historical and projected, as applicable; (iii) Number of mediations resulting in signed... of mediation; (4) Historic use of program funds (budgeted versus actual); and (5) Material changes in...
36 CFR 801.5 - State Historic Preservation Officer responsibilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Officer responsibilities. 801.5 Section 801.5 Parks, Forests, and Public Property ADVISORY COUNCIL ON... § 801.5 State Historic Preservation Officer responsibilities. (a) The State Historic Preservation... § 801.3(b); responding, within 45 days, to submittal of a determination by the applicant under section...
PROGRESS REPORT ON THE DSSTOX DATABASE NETWORK: NEWLY LAUNCHED WEBSITE, APPLICATIONS, FUTURE PLANS
Progress Report on the DSSTox Database Network: Newly Launched Website, Applications, Future Plans
Progress will be reported on development of the Distributed Structure-Searchable Toxicity (DSSTox) Database Network and the newly launched public website that coordinates and...
Applications of Technology to CAS Data-Base Production.
ERIC Educational Resources Information Center
Weisgerber, David W.
1984-01-01
Reviews the economic importance of applying computer technology to Chemical Abstracts Service database production from 1973 to 1983. Database building, technological applications for editorial processing (online editing, Author Index Manufacturing System), and benefits (increased staff productivity, reduced rate of increase of cost of services,…
A database application for wilderness character monitoring
Ashley Adams; Peter Landres; Simon Kingston
2012-01-01
The National Park Service (NPS) Wilderness Stewardship Division, in collaboration with the Aldo Leopold Wilderness Research Institute and the NPS Inventory and Monitoring Program, developed a database application to facilitate tracking and trend reporting in wilderness character. The Wilderness Character Monitoring Database allows consistent, scientifically based...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphrey, Walter R.
CMS is a Windows application for tracking chemical inventories. Partners will use this application to record chemicals that are stored on their site and to perform periodic inventories of those chemicals. The application records information about stored chemicals from user input via the keyboard and barcode readers and stores that information into a single-file database (SQLite). A simple user login mechanism is used to control access to functions in the application. A user interface is provided that allows users to search the database and update data in the database.
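As a rough illustration of the single-file SQLite design described above, here is a minimal Python sketch; the schema and function names are hypothetical, not the CMS application's actual code (the upsert syntax requires SQLite 3.24+):

```python
import sqlite3

con = sqlite3.connect("cms_inventory.db")  # single-file database, as in CMS
con.execute("""CREATE TABLE IF NOT EXISTS chemicals (
    barcode TEXT PRIMARY KEY,
    name TEXT NOT NULL,
    location TEXT,
    last_inventoried TEXT)""")

def record_chemical(barcode: str, name: str, location: str) -> None:
    # Upsert keyed on the barcode, as a scanner or keyboard would supply it
    con.execute("""INSERT INTO chemicals VALUES (?, ?, ?, datetime('now'))
                   ON CONFLICT(barcode) DO UPDATE SET
                       location = excluded.location,
                       last_inventoried = excluded.last_inventoried""",
                (barcode, name, location))
    con.commit()

def search(term: str) -> list:
    return con.execute("SELECT * FROM chemicals WHERE name LIKE ?",
                       (f"%{term}%",)).fetchall()

record_chemical("0001234", "acetone", "cabinet B-2")
print(search("acet"))
```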
Titanium in dentistry: historical development, state of the art and future perspectives.
Jorge, Juliana Ribeiro Pala; Barão, Valentim Adelino; Delben, Juliana Aparecida; Faverani, Leonardo Perez; Queiroz, Thallita Pereira; Assunção, Wirley Gonçalves
2013-06-01
Titanium is a metallic element known for several attractive characteristics, such as biocompatibility, excellent corrosion resistance and high mechanical resistance. It is widely used in Dentistry, with high success rates, providing a favorable biological response when in contact with living tissues. Therefore, the objective of this study was to describe the different uses of titanium in Dentistry, reviewing its historical development and discussing its state of the art and the future perspectives of its utilization. A search in the MEDLINE/PubMed database was performed using the terms 'titanium', 'dentistry' and 'implants'. The titles and abstracts of articles were read, and after this first screening 20 articles were selected and their full texts were downloaded. Additional textbooks and a manual search of reference lists within the selected articles were included. The correlated literature showed that titanium is the most used metal in Implantology for manufacturing osseointegrated implants and their systems, with a totally consolidated utilization. Moreover, titanium can also be employed in prosthodontics to obtain frameworks. However, problems related to its machining, casting, welding and ceramic application for dental prostheses are still limiting its use. In Endodontics, titanium has been used in association with nickel for manufacturing rotary instruments, providing a higher resistance to deformation. However, despite the different possibilities of using titanium in modern Dentistry, its use for prosthesis frameworks still needs technological improvements in order to surpass its limitations.
Database constraints applied to metabolic pathway reconstruction tools.
Vilaplana, Jordi; Solsona, Francesc; Teixido, Ivan; Usié, Anabel; Karathia, Hiren; Alves, Rui; Mateo, Jordi
2014-01-01
Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes.
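The tuning step described above can be reproduced against a running MySQL server; the sketch below, in Python with pymysql, adjusts two commonly tuned knobs. The specific parameters and values are assumptions for illustration — the paper does not list which variables were tuned — and changing innodb_buffer_pool_size at runtime requires MySQL 5.7 or later:

```python
import pymysql  # assumes a reachable MySQL server and valid credentials

conn = pymysql.connect(host="localhost", user="tuner",
                       password="secret", database="metres")
with conn.cursor() as cur:
    # Illustrative knobs only: a larger InnoDB buffer pool and more connections
    cur.execute("SET GLOBAL innodb_buffer_pool_size = %s", (4 * 1024 ** 3,))
    cur.execute("SET GLOBAL max_connections = %s", (500,))
    cur.execute("SHOW VARIABLES LIKE 'innodb_buffer_pool_size'")
    print(cur.fetchone())
conn.close()
```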
2013-01-01
Background Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore, software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Results Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. Conclusions By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework. PMID:24325762
Kiener, Joos
2013-12-11
Research in organic chemistry generates samples of novel chemicals together with their properties and other related data. The involved scientists must be able to store this data and search it by chemical structure. There are commercial solutions for common needs like chemical registration systems or electronic lab notebooks. However, for specific requirements of in-house databases and processes no such solutions exist. Another issue is that commercial solutions have the risk of vendor lock-in and may require an expensive license of a proprietary relational database management system. To speed up and simplify the development for applications that require chemical structure search capabilities, I have developed Molecule Database Framework. The framework abstracts the storing and searching of chemical structures into method calls. Therefore, software developers do not require extensive knowledge about chemistry and the underlying database cartridge. This decreases application development time. Molecule Database Framework is written in Java and I created it by integrating existing free and open-source tools and frameworks. The core functionality includes: • Support for multi-component compounds (mixtures) • Import and export of SD-files • Optional security (authorization). For chemical structure searching, Molecule Database Framework leverages the capabilities of the Bingo Cartridge for PostgreSQL and provides type-safe searching, caching, transactions and optional method-level security. Molecule Database Framework supports multi-component chemical compounds (mixtures). Furthermore, the design of entity classes and the reasoning behind it are explained. By means of a simple web application I describe how the framework could be used. I then benchmarked this example application to create some basic performance expectations for chemical structure searches and import and export of SD-files. By using a simple web application it was shown that Molecule Database Framework successfully abstracts chemical structure searches and SD-File import and export to simple method calls. The framework offers good search performance on a standard laptop without any database tuning. This is also due to the fact that chemical structure searches are paged and cached. Molecule Database Framework is available for download on the project's web page on bitbucket: https://bitbucket.org/kienerj/moleculedatabaseframework.
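To make the idea of "structure search as method calls" concrete, here is an illustrative Python wrapper in the same spirit (the framework itself is Java). The Bingo substructure operator follows the cartridge's documented pattern but should be treated as an assumption, as should all names here — this is not the Molecule Database Framework API:

```python
import psycopg2  # assumes PostgreSQL with the Bingo cartridge installed

class MoleculeRepository:
    """Hides the database cartridge behind plain method calls."""
    def __init__(self, dsn: str):
        self.conn = psycopg2.connect(dsn)

    def substructure_search(self, query_smiles: str, limit: int = 100):
        # Bingo exposes indexed substructure matching as a SQL operator;
        # the exact operator syntax here is an assumption.
        sql = """SELECT id, smiles FROM compounds
                 WHERE smiles @ (%s, '')::bingo.sub
                 LIMIT %s"""
        with self.conn.cursor() as cur:
            cur.execute(sql, (query_smiles, limit))
            return cur.fetchall()

# repo = MoleculeRepository("dbname=chem user=chemist")
# hits = repo.substructure_search("c1ccccc1")  # benzene as substructure query
```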
The Network Configuration of an Object Relational Database Management System
NASA Technical Reports Server (NTRS)
Diaz, Philip; Harris, W. C.
2000-01-01
The networking and implementation of the Oracle Database Management System (ODBMS) requires developers to have knowledge of the UNIX operating system as well as all the features of the Oracle Server. The server is an object relational database management system (DBMS). By using distributed processing, processes are split up between the database server and client application programs. The DBMS handles all the responsibilities of the server. The workstations running the database application concentrate on the interpretation and display of data.
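The distributed-processing split described above can be seen from the client side in a few lines. This hedged sketch uses the python-oracledb driver with placeholder connection details; interpretation and display of results run on the client workstation while the server process does the database work:

```python
import oracledb  # python-oracledb "thin" driver; credentials are placeholders

conn = oracledb.connect(user="scott", password="tiger",
                        dsn="dbhost.example.com/orclpdb")
with conn.cursor() as cur:
    cur.execute("SELECT table_name FROM user_tables")
    for (name,) in cur:      # display logic runs on the client workstation
        print(name)
conn.close()
```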
Atlantic Hurricane Activity: 1851-1900
NASA Astrophysics Data System (ADS)
Landsea, C. W.
2001-12-01
This presentation reports on the second year's work of a three-year project to re-analyze the North Atlantic hurricane database (or HURDAT). The original database of six-hourly positions and intensities was put together in the 1960s in support of the Apollo space program to help provide statistical track forecast guidance. In the intervening years, this database - which is now freely and easily accessible on the Internet from the National Hurricane Center's (NHC's) Webpage - has been utilized for a wide variety of uses: climatic change studies, seasonal forecasting, risk assessment for county emergency managers, analysis of potential losses for insurance and business interests, intensity forecasting techniques and verification of official and various model predictions of track and intensity. Unfortunately, HURDAT was not designed with all of these uses in mind when it was first put together, and not all of them may be appropriate given its original motivation. One problem with HURDAT is that there are numerous systematic as well as some random errors in the database which need correction. Additionally, analysis techniques have changed over the years at NHC as our understanding of tropical cyclones has developed, leading to biases in the historical database that have not been addressed. Another difficulty in applying the hurricane database to studies concerned with landfalling events is the lack of exact location, time and intensity at hurricane landfall. Finally, recent efforts into uncovering undocumented historical hurricanes in the late 1800s and early 1900s, led by Jose Fernandez-Partagas, have greatly increased our knowledge of these past events, which are not yet incorporated into the HURDAT database. Because of all of these issues, a re-analysis of the Atlantic hurricane database is being attempted that will be completed in three years. As part of the re-analyses, three files will be made available: (1) the revised Atlantic HURDAT (with six-hourly intensities and positions); (2) a HURDAT meta-file, a text file with detailed information about each suggested change proposed in the revised HURDAT; and (3) a "center fix" file, composed of actual observations of tropical cyclone positions and intensity estimates from the following platforms: aircraft, satellite, radar, and synoptic. All changes made to HURDAT will be approved by an NHC Committee, as this database is one that is officially maintained by them. At the conference, results will be shown including a revised climatology of U.S. hurricane strikes back to 1851. http://www.aoml.noaa.gov/hrd/hurdat/index.html
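For readers who want to work with the revised track file, a sketch of parsing one six-hourly fix is shown below. The comma-delimited field order follows the published HURDAT2 layout (date, time, record identifier, status, latitude, longitude, maximum wind); treat that layout, and the sample line, as assumptions rather than a guaranteed interface:

```python
def parse_fix(line: str) -> dict:
    """Parse one HURDAT2-style data line into a small record."""
    f = [s.strip() for s in line.split(",")]
    # hemisphere letters become signs on the decimal coordinates
    lat = float(f[4][:-1]) * (1 if f[4].endswith("N") else -1)
    lon = float(f[5][:-1]) * (1 if f[5].endswith("E") else -1)
    return {"date": f[0], "time": f[1], "status": f[3],
            "lat": lat, "lon": lon, "max_wind_kt": int(f[6])}

print(parse_fix("18510625, 0000,  , HU, 28.0N, 94.8W, 80, -999"))
```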
GeoInt: the first macroseismic intensity database for the Republic of Georgia
NASA Astrophysics Data System (ADS)
Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.
2018-05-01
Our work is intended to present the new macroseismic intensity database for the Republic of Georgia—hereby named GeoInt—which includes earthquakes from the historical (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and 3944 related intensity data points (IDPs) for 1509 different localities, reported on the Medvedev-Sponheuer-Karnik (MSK) scale. Regarding the earthquakes, the surface-wave magnitude (MS) is in the 3.3-7 range and the depth is in the 2-36 km range. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in a N-S direction and from 37.324° E to 48.500° E in an E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised each individual IDP and have reevaluated and homogenized intensity values to the MSK scale. In particular, regarding the whole set of 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 belong to the instrumental era (post-1900). With particular regard to the 3596 IDPs, 105 are brand new (3%), whereas the intensity values for 804 IDPs have been reevaluated (22%); for 2687 IDPs (75%), intensities have been confirmed from previous interpretations. We introduce this database as a key input for further improvements in seismic hazard modeling and seismic risk calculation for this region, based on macroseismic intensity; we report all the 111 earthquakes with available macroseismic information. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.
GeoInt: the first macroseismic intensity database for the Republic of Georgia
NASA Astrophysics Data System (ADS)
Varazanashvili, O.; Tsereteli, N.; Bonali, F. L.; Arabidze, V.; Russo, E.; Pasquaré Mariotto, F.; Gogoladze, Z.; Tibaldi, A.; Kvavadze, N.; Oppizzi, P.
2018-01-01
Our work is intended to present the new macroseismic intensity database for the Republic of Georgia—hereby named GeoInt—which includes earthquakes from the historical (from 1250 B.C. onwards) to the instrumental era. The database is composed of 111 selected earthquakes and 3944 related intensity data points (IDPs) for 1509 different localities, reported on the Medvedev-Sponheuer-Karnik (MSK) scale. Regarding the earthquakes, the surface-wave magnitude (MS) is in the 3.3-7 range and the depth is in the 2-36 km range. The entire set of IDPs is characterized by intensities ranging from 2-3 to 9-10 and covers an area spanning from 39.508° N to 45.043° N in a N-S direction and from 37.324° E to 48.500° E in an E-W direction, with some of the IDPs located outside the Georgian border, in the (i) Republic of Armenia, (ii) Russian Federation, (iii) Republic of Turkey, and (iv) Republic of Azerbaijan. We have revised each individual IDP and have reevaluated and homogenized intensity values to the MSK scale. In particular, regarding the whole set of 3944 IDPs, 348 belong to the historical era (pre-1900) and 3596 belong to the instrumental era (post-1900). With particular regard to the 3596 IDPs, 105 are brand new (3%), whereas the intensity values for 804 IDPs have been reevaluated (22%); for 2687 IDPs (75%), intensities have been confirmed from previous interpretations. We introduce this database as a key input for further improvements in seismic hazard modeling and seismic risk calculation for this region, based on macroseismic intensity; we report all the 111 earthquakes with available macroseismic information. The GeoInt database is also accessible online at http://www.enguriproject.unimib.it and will be kept updated in the future.
Patent Documents as a Resource for Studies and Education in Geophysics - An Approach.
NASA Astrophysics Data System (ADS)
Wollny, K. G.
2016-12-01
Patents are a highly neglected source of information in geophysics, although they supply a wealth of technical and historically relevant data and might be an important asset for researchers and students. The technical drawings and descriptions in patent documents provide insight into the personal work of a researcher or a scientific group, give detailed technical background information, show interdisciplinary solutions for similar problems, help readers learn about inventions too advanced for their time but perhaps useful now, and allow exploration of the historical background and timelines of inventions and their inventors. It will be shown how to get access to patent documents and how to use them for research and education purposes. Exemplary inventions by well-known geoscientists or scientists in related fields will be presented to illustrate the usefulness of patent documents. The data pool used is the International Patent Classification (IPC) class G01V, which the United Nations' World Intellectual Property Organization (WIPO) has set up mainly for inventions with key aspects in geophysics. This class contains approximately 235,000 patent documents (July 2016) for methods, apparatuses or scientific instruments developed during scientific projects or by geophysical companies. The patent documents can be accessed via patent databases. The most important patent databases are free of charge, their search functionality is self-explanatory, and the amount of information that can be extracted is enormous. For example, more than 90 million multilingual patent documents are currently available online (July 2016) in the DEPATIS database of the German Patent and Trade Mark Office or in ESPACENET of the European Patent Office. To summarize, patent documents are a highly useful tool for educational and research purposes to strengthen students' and scientists' knowledge in a practically oriented geophysical field and to widen the horizon to adjacent technical areas. Last but not least, they also provide insight into historical aspects of geophysics and the persons working in that area.
Challenges in Defining Tsunami Wave Height
NASA Astrophysics Data System (ADS)
Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.
2017-12-01
The NOAA National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.
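The two definitions under discussion are easy to state in code. The sketch below computes both from a de-tided water level series; it simplifies by using the global maximum and minimum (operationally the trough adjacent to the highest crest may be used instead), so read it as an illustration of the definitions, not NCEI's processing chain:

```python
import numpy as np

def wave_height_metrics(eta: np.ndarray) -> dict:
    """eta: de-tided water level residuals in metres."""
    max_peak = float(np.max(eta))                  # highest crest
    peak_to_trough = float(np.max(eta) - np.min(eta))
    return {"max_peak_m": max_peak,
            "max_amplitude_m": peak_to_trough / 2.0}  # 1/2 peak-to-trough

signal = np.array([0.00, 0.35, -0.20, 0.48, -0.31, 0.12])
print(wave_height_metrics(signal))
```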
Challenges in Defining Tsunami Wave Heights
NASA Astrophysics Data System (ADS)
Dunbar, Paula; Mungov, George; Sweeney, Aaron; Stroker, Kelly; Arcos, Nicolas
2017-08-01
The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 coastal tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height for each tide gauge and deep-ocean buoy, NCEI will consider adding an additional field for the maximum peak measurement.
Urban Neighborhood Information Systems: Crime Prevention and Control Applications.
ERIC Educational Resources Information Center
Pattavina, April; Pierce, Glenn; Saiz, Alan
2002-01-01
Chronicles the need for and development of an interdisciplinary, integrated neighborhood-level database for Boston, Massachusetts, discussing database content and potential applications of this database to a range of criminal justice problems and initiatives (e.g., neighborhood crime patterns, needs assessment, and program planning and…
2008-12-01
between our current project and the historical projects. Therefore to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique
Analysis and selection of magnitude relations for the Working Group on Utah Earthquake Probabilities
Duross, Christopher; Olig, Susan; Schwartz, David
2015-01-01
Prior to calculating time-independent and -dependent earthquake probabilities for faults in the Wasatch Front region, the Working Group on Utah Earthquake Probabilities (WGUEP) updated a seismic-source model for the region (Wong and others, 2014) and evaluated 19 historical regressions on earthquake magnitude (M). These regressions relate M to fault parameters for historical surface-faulting earthquakes, including linear fault length (e.g., surface-rupture length [SRL] or segment length), average displacement, maximum displacement, rupture area, seismic moment (Mo ), and slip rate. These regressions show that significant epistemic uncertainties complicate the determination of characteristic magnitude for fault sources in the Basin and Range Province (BRP). For example, we found that M estimates (as a function of SRL) span about 0.3–0.4 units (figure 1) owing to differences in the fault parameter used; age, quality, and size of historical earthquake databases; and fault type and region considered.
Design and implementation of a twin-family database for behavior genetics and genomics studies.
Boomsma, Dorret I; Willemsen, Gonneke; Vink, Jacqueline M; Bartels, Meike; Groot, Paul; Hottenga, Jouke Jan; van Beijsterveldt, C E M Toos; Stroet, Therese; van Dijk, Rob; Wertheim, Rien; Visser, Marco; van der Kleij, Frank
2008-06-01
In this article we describe the design and implementation of a database for extended twin families. The database does not focus on probands or on index twins, as this approach becomes problematic when larger multigenerational families are included, when more than one set of multiples is present within a family, or when families turn out to be part of a larger pedigree. Instead, we present an alternative approach that uses a highly flexible notion of persons and relations. The relations among the subjects in the database have a one-to-many structure, are user-definable and extendible and support arbitrarily complicated pedigrees. Some additional characteristics of the database are highlighted, such as the storage of historical data, predefined expressions for advanced queries, output facilities for individuals and relations among individuals and an easy-to-use multi-step wizard for contacting participants. This solution presents a flexible approach to accommodate pedigrees of arbitrary size, multiple biological and nonbiological relationships among participants and dynamic changes in these relations that occur over time, which can be implemented for any type of multigenerational family study.
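A toy version of the person/relation design makes the flexibility concrete: persons are stored once, and relations are one-to-many rows whose kinds are user-definable and carry validity dates for historical changes. All names below are illustrative assumptions, not the authors' schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE person   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE relation (from_id INTEGER REFERENCES person(id),
                       to_id   INTEGER REFERENCES person(id),
                       kind    TEXT,       -- user-definable: 'mother_of', 'twin_of', ...
                       valid_from TEXT,    -- relations can change over time
                       valid_to   TEXT);
""")
con.executemany("INSERT INTO person VALUES (?, ?)",
                [(1, "parent"), (2, "twin A"), (3, "twin B")])
con.executemany("INSERT INTO relation VALUES (?, ?, ?, ?, ?)",
                [(1, 2, "mother_of", "1990-01-01", None),
                 (1, 3, "mother_of", "1990-01-01", None),
                 (2, 3, "twin_of",   "1990-01-01", None)])
query = """SELECT p.name, r.kind, q.name
           FROM relation r
           JOIN person p ON p.id = r.from_id
           JOIN person q ON q.id = r.to_id"""
for row in con.execute(query):
    print(row)
```

Because pedigree structure lives entirely in relation rows, arbitrarily large multigenerational families and multiple sets of multiples need no schema change.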
Database security and encryption technology research and application
NASA Astrophysics Data System (ADS)
Zhu, Li-juan
2013-03-01
The main purpose of this paper is to discuss the current problem of database information leakage and the important role played by message encryption techniques in database security, as well as the principles of MD5 technology and its use in websites and applications. The article is divided into an introduction, an overview of MD5 technology, the use of MD5 technology, and a final summary. In the field of requirements and application, this paper gives readers a more detailed and clear understanding of the principles of MD5 technology, its importance in database security, and its use.
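For concreteness, MD5 as discussed above is a one-way hash (a digest) rather than reversible encryption; a database stores the digest instead of the plaintext. A minimal Python illustration follows (for new systems, a salted slow hash such as bcrypt or scrypt is the usual recommendation):

```python
import hashlib

def md5_hex(text: str) -> str:
    # One-way digest; the database stores this instead of the plaintext
    return hashlib.md5(text.encode("utf-8")).hexdigest()

print(md5_hex("s3cret"))  # 32-character hexadecimal digest
```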
Development and applications of the EntomopathogenID MLSA database for use in agricultural systems
USDA-ARS?s Scientific Manuscript database
The current study reports the development and application of a publicly accessible, curated database of Hypocrealean entomopathogenic fungi sequence data. The goal was to provide a platform for users to easily access sequence data from reference strains. The database can be used to accurately identi...
Assessing natural hazard risk using images and data
NASA Astrophysics Data System (ADS)
Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.
2012-12-01
Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping and historic records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of the event. Visual data are rapidly increasing as portable high-resolution cameras and video recorders become more widely available. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data, such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups, along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.
Documentation and Cultural Heritage Inventories - Case of the Historic City of Ahmadabad
NASA Astrophysics Data System (ADS)
Shah, K.
2015-08-01
Located in the western Indian state of Gujarat, the historic city of Ahmadabad is renowned for the unparalleled richness of its monumental architecture, traditional house form, community based settlement patterns, city structure, crafts and mercantile culture. This paper describes the process followed for documentation and development of comprehensive Heritage Inventories for the historic city with an aim of illustrating the Outstanding Universal Values of its Architectural and Urban Heritage. The exercise undertaken between 2011 & 2014 as part of the preparation of world heritage nomination dossier included thorough archival research, field surveys, mapping and preparation of inventories using a combination of traditional data procurement and presentation tools as well as creation of advanced digital database using GIS. The major challenges encountered were: need to adapt documentation methodology and survey formats to field conditions, changing and ever widening scope of work, corresponding changes in time frame, management of large quantities of data generated during the process along with difficulties in correlating existing databases procured from the local authority in varying formats. While the end result satisfied the primary aim, the full potential of Heritage Inventory as a protection and management tool will only be realised after its acceptance as the statutory list and its integration within the larger urban development plan to guide conservation, development and management strategy for the city. The rather detailed description of evolution of documentation process and the complexities involved is presented to understand the relevance of methods used in Ahmadabad and guide similar future efforts in the field.
The GMOS cyber(e)-infrastructure: advanced services for supporting science and policy.
Cinnirella, S; D'Amore, F; Bencardino, M; Sprovieri, F; Pirrone, N
2014-03-01
The need for coordinated, systematized and catalogued databases on mercury in the environment is of paramount importance, as improved information can help the assessment of the effectiveness of measures established to phase out and ban mercury. Long-term monitoring sites have been established in a number of regions and countries for the measurement of mercury in ambient air and wet deposition. Long-term measurements of mercury concentration in biota have also produced a huge amount of information, but such initiatives are far from being part of a global, systematic and interoperable approach. To address these weaknesses, the on-going Global Mercury Observation System (GMOS) project (www.gmos.eu) established a coordinated global observation system for mercury and also retrieved historical data (www.gmos.eu/sdi). To manage such a large amount of information, a technological infrastructure was planned. This high-performance back-end resource, associated with sophisticated client applications, enables data storage, computing services, telecommunications networks and all services necessary to support the activity. This paper reports the architecture definition of the GMOS Cyber(e)-Infrastructure and the services developed to support science and policy, including the United Nations Environment Programme. It finally describes new possibilities in data analysis and data management through client applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sjaardema, Gregory; Bauer, David; Erik, & Illescas
2017-01-06
The Ioss is a database-independent package providing an object-oriented, abstract interface to IO capabilities for a finite element application, and concrete database interfaces which provide input and/or output to the exodusII, xdmf, generated, and heartbeat database formats. The Ioss provides an object-oriented C++-based IO interface for a finite element application code. The application code can perform all IO operations through the Ioss interface, which is typically at a higher abstraction level than the concrete database formats. The Ioss then performs the needed operations to translate the finite element data to the specific format required by the concrete database implementations. The Ioss currently supports interfaces to exodusII, xdmf, generated, and heartbeat formats, but additional formats can be easily added.
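The pattern Ioss implements — application code talking to an abstract IO interface while concrete backends handle specific formats — can be sketched compactly. The Python below illustrates the pattern only; it is not the actual Ioss C++ API:

```python
from abc import ABC, abstractmethod

class DatabaseIO(ABC):
    """Abstract interface the application writes against."""
    @abstractmethod
    def write_mesh(self, nodes, elements): ...

class ExodusIO(DatabaseIO):
    def write_mesh(self, nodes, elements):
        print(f"exodusII-style output: {len(nodes)} nodes, {len(elements)} elements")

class HeartbeatIO(DatabaseIO):
    def write_mesh(self, nodes, elements):
        print(f"heartbeat-style output: {len(nodes)} nodes")

def dump(db: DatabaseIO, nodes, elements):
    db.write_mesh(nodes, elements)  # application code never names a format

dump(ExodusIO(), nodes=[0, 1, 2], elements=[(0, 1, 2)])
dump(HeartbeatIO(), nodes=[0, 1, 2], elements=[(0, 1, 2)])
```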
SU-F-T-231: Improving the Efficiency of a Radiotherapy Peer-Review System for Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, S; Basavatia, A; Garg, M
Purpose: To improve the efficiency of a radiotherapy peer-review system using a commercially available software application for plan quality evaluation and documentation. Methods: A commercial application, FullAccess (Radialogica LLC, Version 1.4.4), was implemented in a Citrix platform for the peer-review process and patient documentation. This application can display images, isodose lines, and dose-volume histograms and create plan reports for the peer-review process. Dose metrics in the report can also be benchmarked for plan quality evaluation. Site-specific templates were generated based on departmental treatment planning policies and procedures for each disease site, which generally follow RTOG protocols as well as published prospective clinical trial data, including both conventional fractionation and hypo-fractionation schema. Once a plan is ready for review, the planner exports the plan to FullAccess, applies the site-specific template, and presents the report for plan review. The plan is still reviewed in the treatment planning system, as that is the legal record. Upon the physician's approval of a plan, the plan is packaged for peer review with the plan report, and dose metrics are saved to the database. Results: The reports show dose metrics of PTVs and critical organs for the plans and also indicate whether or not the metrics are within tolerance. Graphical results with green, yellow, and red lights display whether planning objectives have been met. In addition, benchmarking statistics are collected to see where the current plan falls compared to all historical plans on each metric. All physicians in peer review can easily verify constraints by these reports. Conclusion: We have demonstrated the improvement of a radiotherapy peer-review system, which allows physicians to easily verify planning constraints for different disease sites and fractionation schema, allows for standardization in the clinic to ensure that departmental policies are maintained, and builds a comprehensive database for potential clinical outcome evaluation.
Application Analysis and Decision with Dynamic Analysis
2014-12-01
pushes the application file and the JSON file containing the metadata from the database. When the 2 files are in place, the consumer thread starts...human analysts and stores it in a database. It would then use some of these data to generate a risk score for the application. However, static analysis...and store them in the primary A2D database for future analysis.
The Game of Life: College Sports and Educational Values.
ERIC Educational Resources Information Center
Shulman, James L.; Bowen, William G.
Drawing on historical research, data on alumni giving, information on budgetary spending on college athletics, and a database of 90,000 students from 30 selective colleges and universities in the 1950s, 1970s, and 1990s, this book demonstrates how athletics influences the class composition and campus ethos of selective schools. The chapters are:…
ERIC Educational Resources Information Center
Smarte, Lynn; Starcher, Heather
This ERIC Annual Report presents both accomplishments and historical perspectives, as 2001 marks 35 years of ERIC service in delivering educational research and information to the public. This annual report describes the developments in the database of educational literature, the growing variety of ERIC Web-based products and user services, and…
ERIC Educational Resources Information Center
Holtzclaw, J. David; Eisen, Arri; Whitney, Erika M.; Penumetcha, Meera; Hoey, J. Joseph; Kimbro, K. Sean
2006-01-01
Many students at minority-serving institutions are underexposed to Internet resources such as the human genome project, PubMed, NCBI databases, and other Web-based technologies because of a lack of financial resources. To change this, we designed and implemented a new bioinformatics component to supplement the undergraduate Genetics course at…
The Past in the Future: Problems and Potentials of Historical Reception Studies.
ERIC Educational Resources Information Center
Jensen, Klaus Bruhn
1993-01-01
Gives examples of how qualitative methodologies have been employed to study media reception in the present. Identifies some forms of evidence that can creatively fill the gaps in knowledge about media reception in the past. Argues that the field must develop databases documenting media reception, which may broaden the scope of audience research in…
Uterine transplantation: Review in human research.
Favre-Inhofer, A; Rafii, A; Carbonnel, M; Revaux, A; Ayoubi, J M
2018-06-01
Uterine transplantation is the solution to treat absolute uterine factor infertility. In this review, we present the historical, medical, technical, psychological and ethical perspectives in human uterine transplantation research. We reviewed the PubMed database following PRISMA guidelines and added data presented by several research teams during the first international congress on uterine transplantation. Copyright © 2018. Published by Elsevier Masson SAS.
Marriage and Family Counseling. Searchlight Plus: Relevant Resources in High Interest Areas. 57+.
ERIC Educational Resources Information Center
Okun, Barbara F.
This information analysis paper is based on a computer search of the ERIC database from November 1966 through March 1984, and on pertinent outside resources related to marriage and family counseling. A brief historical perspective of the field of marriage and family counseling is provided, and the differences and overlaps between family,…
Forest inventory, catastrophic events and historic geospatial assessments in the south
Dennis M. Jacobs
2007-01-01
Catastrophic events are a regular occurrence of disturbance to forestland in the Southern United States. Each major event affects the integrity of the forest inventory database developed and maintained by the Forest Inventory & Analysis Research Work Unit of the U.S. Department of Agriculture, Forest Service. Some of these major disturbances through the years have...
ERIC Educational Resources Information Center
Hsaieh, Hsiao-Chin; Yang, Chia-Ling
2014-01-01
While access to higher education has reached gender parity in Taiwan, the phenomenon of gender segregation and stratification by fields of study and by division of labor persists. In this article, we trace the historical evolution of Taiwan's education system and use data from large-scale educational databases to analyze the association of…
Forsythe, Stephen J; Dickins, Benjamin; Jolley, Keith A
2014-12-16
Following the association of Cronobacter spp. with several publicized fatal outbreaks of meningitis and necrotising enterocolitis in neonatal intensive care units, the World Health Organization (WHO) in 2004 requested the establishment of a molecular typing scheme to enable the international control of the organism. This paper presents the application of Next Generation Sequencing (NGS) to Cronobacter, which has led to the establishment of the Cronobacter PubMLST genome and sequence definition database (http://pubmlst.org/cronobacter/) containing over 1000 isolates with metadata, along with the recognition of specific clonal lineages linked to neonatal meningitis and adult infections. Whole genome sequencing and multilocus sequence typing (MLST) support the formal recognition of the genus Cronobacter, composed of seven species, to replace the former single species Enterobacter sakazakii. Applying the 7-loci MLST scheme to 1007 strains revealed 298 definable sequence types, yet only C. sakazakii clonal complex 4 (CC4) was principally associated with neonatal meningitis. This clonal lineage has been confirmed using ribosomal-MLST (51 loci) and whole genome-MLST (1865 loci) to analyse 107 whole genomes via the Cronobacter PubMLST database. This database has enabled the retrospective analysis of historic cases and outbreaks following re-identification of those strains. The Cronobacter PubMLST database offers a central, open-access, reliable sequence-based repository for researchers. It has the capacity to create new analysis schemes 'on the fly' and to integrate metadata (source, geographic distribution, clinical presentation). It is also expandable and adaptable to changes in taxonomy, and able to support the development of reliable detection methods of use to industry and regulatory authorities. Therefore it meets the WHO (2004) request for the establishment of a typing scheme for this emergent bacterial pathogen. Whole genome sequencing has additionally revealed a range of potential virulence and environmental fitness traits which may account for the association of C. sakazakii CC4 with pathogenicity and its propensity for neonatal CNS infection.
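PubMLST databases, including the Cronobacter one, are also reachable programmatically through the PubMLST RESTful interface. The sketch below follows the general PubMLST API URL pattern; the exact route and response fields are assumptions to verify against the API documentation, not guaranteed endpoints:

```python
import requests

BASE = "https://rest.pubmlst.org"

# Fetch metadata for a typing scheme in the Cronobacter sequence-definition
# database (scheme 1 is assumed here to be the classical 7-locus MLST scheme).
resp = requests.get(f"{BASE}/db/pubmlst_cronobacter_seqdef/schemes/1", timeout=30)
resp.raise_for_status()
scheme = resp.json()
print(scheme.get("description"), "-", scheme.get("locus_count"), "loci")
```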
Database Constraints Applied to Metabolic Pathway Reconstruction Tools
Vilaplana, Jordi; Solsona, Francesc; Teixido, Ivan; Usié, Anabel; Karathia, Hiren; Alves, Rui; Mateo, Jordi
2014-01-01
Our group developed two biological applications, Biblio-MetReS and Homol-MetReS, accessing the same database of organisms with annotated genes. Biblio-MetReS is a data-mining application that facilitates the reconstruction of molecular networks based on automated text-mining analysis of published scientific literature. Homol-MetReS allows functional (re)annotation of proteomes, to properly identify both the individual proteins involved in the process(es) of interest and their function. It also enables the sets of proteins involved in the process(es) in different organisms to be compared directly. The efficiency of these biological applications is directly related to the design of the shared database. We classified and analyzed the different kinds of access to the database. Based on this study, we tried to adjust and tune the configurable parameters of the database server to reach the best performance of the communication data link to/from the database system. Different database technologies were analyzed. We started the study with a public relational SQL database, MySQL. Then, the same database was implemented by a MapReduce-based database named HBase. The results indicated that the standard configuration of MySQL gives an acceptable performance for low or medium size databases. Nevertheless, tuning database parameters can greatly improve the performance and lead to very competitive runtimes. PMID:25202745
Teaching Economics Using Historical Novels: Jonathan Harr's "The Lost Painting"
ERIC Educational Resources Information Center
Cotti, Chad; Johnson, Marianne
2012-01-01
Undergraduate students are often interested in and benefit greatly from applications of economic principles. Historical novels drawn from real-world situations can engage students with economic concepts in new ways and provide a useful tool to help enhance instruction. In this article, the authors discuss the use of historical novels generally in…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-20
... 1501.6. Consultations Under Section 106 of the National Historic Preservation Act In accordance with the Advisory Council on Historic Preservation's implementing regulations for section 106 of the National Historic Preservation Act, we are using this notice to initiate consultation with applicable State...
NASA Astrophysics Data System (ADS)
Wohlers, Annika; Damm, Bodo
2017-04-01
Regional data of the Central German Uplands are extracted from the German landslide database in order to understand the complex interactions between landslide risks and public risk awareness with regard to transportation infrastructure. Most information within the database is gathered by means of archive studies from inventories of emergency agencies, state, press and web archives, company and department records, as well as scientific and (geo)technical literature. The information includes land use practices and repair and mitigation measures, with the resultant costs, for the German road network as well as the railroad and waterway networks. It therefore contains valuable information on historical and current landslide impacts and elements at risk, and provides an overview of spatiotemporal changes in social exposure and vulnerability to landslide hazards over the last 120 years. On a regional scale, the recorded infrastructure damages and consequential repair or mitigation measures were categorized and classified according to relevant landslide types, processes and types of infrastructure. In a further step, the data of recent landslides are compared with historical and modern repair and mitigation measures and are correlated with socioeconomic concepts. As a result, it is possible to identify some complex interactions between landslide hazard, risk perception, and damage impact, including time lags and intensity thresholds. The data reveal distinct concepts of repairing or mitigating landslides on different types of transportation infrastructure, which are not exclusively linked to higher construction efforts (e.g. embankments on railroads and channels), but also to changing levels of economic losses and risk perception. In addition, a shift over time can be noticed from low-cost prevention measures, such as the removal of loose rock and vegetation, rock blasting, and catch barriers, towards expensive mitigation measures such as catch fences, soil anchoring and rock nailing. This temporal shift is associated with a higher public hazard awareness towards landslides, which is at some sites linked to an apparent increase in landslide frequency and magnitude. Damm, B., Klose, M. (2015): The landslide database for Germany: Closing the gap at national level. Geomorphology 249: 82-93. Klose, M., Damm, B., Terhorst, B. (2015): Landslide cost modeling for transportation infrastructures: a methodological approach. Landslides 12: 321-334. Klose, M., Maurischat, P., Damm, B. (2016): Landslide impacts in Germany: A historical and socioeconomic perspective. Landslides 13: 183-199.
Geopan AT@S: a Brokering Based Gateway to Georeferenced Historical Maps for Risk Analysis
NASA Astrophysics Data System (ADS)
Previtali, M.
2017-08-01
The importance of ancient and historical maps is nowadays recognized in many applications (e.g., urban planning, landscape valorisation and preservation, land changes identification, etc.). In the last years a great effort has been made by different institutions, such as Geographical Institutes, Public Administrations, and collaborative communities, to digitize and publish online collections of historical maps. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required. In addition, problems of interconnection between different data sources and their restricted interoperability may arise. This paper describes a new brokering-based gateway developed to assure interoperability between data, in particular georeferenced historical maps and geographic data, gathered from different data providers, with various features and referring to different historical periods. The developed approach is exemplified by a new application named GeoPAN Atl@s that is aimed at linking land changes in the Northern Italy area with risk analysis (local seismicity amplification and flooding risk) by using multi-temporal data sources and historic maps.
Timm, Donna F; Jones, Dee; Woodson, Deidra; Cyrus, John W
2012-01-01
Library faculty members at the Health Sciences Library at the LSU Health Shreveport campus offer a database searching class for third-year medical students during their surgery rotation. For a number of years, students completed "ten-minute clinical challenges," but the instructors decided to replace the clinical challenges with innovative exercises using The Edwin Smith Surgical Papyrus to emphasize concepts learned. The Surgical Papyrus is an online resource that is part of the National Library of Medicine's "Turning the Pages" digital initiative. In addition, vintage surgical instruments and historic books are displayed in the classroom to enhance the learning experience.
MicroUse: The Database on Microcomputer Applications in Libraries and Information Centers.
ERIC Educational Resources Information Center
Chen, Ching-chih; Wang, Xiaochu
1984-01-01
Describes MicroUse, a microcomputer-based database on microcomputer applications in libraries and information centers which was developed using relational database manager dBASE II. The description includes its system configuration, software utilized, the in-house-developed dBASE programs, multifile structure, basic functions, MicroUse records,…
NASA Astrophysics Data System (ADS)
Lee, Sangho; Suh, Jangwon; Park, Hyeong-Dong
2015-03-01
Boring logs are widely used in geological field studies since the data describe various attributes of underground and surface environments. However, it is difficult to manage multiple boring logs in the field, as the conventional management and visualization methods are not suitable for integrating and combining large data sets. We developed an iPad application to enable its users to search boring logs rapidly and visualize them using the augmented reality (AR) technique. For the development of the application, a standard borehole database appropriate for a mobile-based borehole database management system was designed. The application consists of three modules: an AR module, a map module, and a database module. The AR module superimposes borehole data on camera imagery as viewed by the user and provides intuitive visualization of borehole locations. The map module shows the locations of corresponding borehole data on a 2D map with additional map layers. The database module provides data management functions for large borehole databases for the other modules. A field survey was also carried out using more than 100,000 borehole records.
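A minimal sketch of the kind of standard borehole table the three modules could share is given below; the columns and the crude bounding-box lookup are illustrative assumptions, not the authors' schema:

```python
import sqlite3

con = sqlite3.connect("boreholes.db")
con.execute("""CREATE TABLE IF NOT EXISTS borehole (
    id INTEGER PRIMARY KEY,
    name TEXT,
    lat REAL, lon REAL,
    elevation_m REAL,
    depth_m REAL)""")

def nearby(lat: float, lon: float, radius_deg: float = 0.01) -> list:
    """Bounding-box lookup the AR and map modules could both call."""
    return con.execute("""SELECT name, lat, lon FROM borehole
                          WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?""",
                       (lat - radius_deg, lat + radius_deg,
                        lon - radius_deg, lon + radius_deg)).fetchall()
```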
49 CFR 1572.107 - Other analyses.
Code of Federal Regulations, 2011 CFR
2011-10-01
... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...
49 CFR 1572.107 - Other analyses.
Code of Federal Regulations, 2010 CFR
2010-10-01
... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...
49 CFR 1572.107 - Other analyses.
Code of Federal Regulations, 2014 CFR
2014-10-01
... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...
49 CFR 1572.107 - Other analyses.
Code of Federal Regulations, 2012 CFR
2012-10-01
... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...
49 CFR 1572.107 - Other analyses.
Code of Federal Regulations, 2013 CFR
2013-10-01
... applicant poses a security threat based on a search of the following databases: (1) Interpol and other international databases, as appropriate. (2) Terrorist watchlists and related databases. (3) Any other databases...
1987-12-01
[Figure residue from the original report: Figure 2, "Intelligent Disk Controller," and Figure 5, "Processor-Per-Head," block diagrams relating the application programs, database management system, operating system, disk controller, and host.] ... However, these additional properties have been proven in classical set and relation theory [75]. These additional properties are described here
Historical and social aspects of halitosis.
Elias, Marina Sá; Ferriani, Maria das Graças Carvalho
2006-01-01
Buccal odors have always been a factor of concern for society. This study aims to investigate the historical and social bases of halitosis through systematized research in the BVS database (Biblioteca Virtual em Saúde, Virtual Health Library) and in books. Lack of knowledge on how to prevent halitosis allows for its occurrence, limiting quality of life. As social relationships are one of the pillars of the quality-of-life concept, halitosis needs to be considered a factor of negative interference. Education in health should be accomplished with a view to a dynamic balance, involving human beings' physical and psychological aspects, as well as their social interactions, so that individuals do not become jigsaw puzzles of sick parts.
NASA Technical Reports Server (NTRS)
Moore, Patrick K.
2002-01-01
The 2002 NASA/ASEE KSC History Project focused on a series of seven history initiatives designed to acquire, preserve, and interpret the history of Kennedy Space Center. These projects included the co-authoring of Voices From the Cape, historical work with NASA historian Roger Launius, the completion of a series of oral histories with key KSC personnel, a monograph on Public Affairs, the development of a Historical Concept Map (CMap) for history knowledge preservation, advice on KSC history database and web interface capabilities, the development of a KSC oral history program and guidelines for training and collection, and the development of collaborative relationships between Kennedy Space Center, the University of West Florida, and the University of Central Florida.
Functionally Graded Materials Database
NASA Astrophysics Data System (ADS)
Kisara, Katsuto; Konno, Tomomi; Niino, Masayuki
2008-02-01
The Functionally Graded Materials Database (hereinafter referred to as the FGMs Database) was opened to the public via the Internet in October 2002, and since then it has been managed by the Japan Aerospace Exploration Agency (JAXA). As of October 2006, the database includes 1,703 research information entries, along with data on 2,429 researchers, 509 institutions, and so on. Reading materials such as "Applicability of FGMs Technology to Space Plane" and "FGMs Application to Space Solar Power System (SSPS)" were prepared in FY 2004 and 2005, respectively. The English version of "FGMs Application to Space Solar Power System (SSPS)" is now under preparation. The present paper explains the FGMs Database, describing the research information data, the sitemap, and how to use it. Based on an access analysis, user access results and users' interests are discussed.
Heterogeneity of long-history migration predicts emotion recognition accuracy.
Wood, Adrienne; Rychlowska, Magdalena; Niedenthal, Paula M
2016-06-01
Recent work (Rychlowska et al., 2015) demonstrated the power of a relatively new cultural dimension, historical heterogeneity, in predicting cultural differences in the endorsement of emotion expression norms. Historical heterogeneity describes the number of source countries that have contributed to a country's present-day population over the last 500 years. People in cultures originating from a large number of source countries may have historically benefited from greater and clearer emotional expressivity, because they lacked a common language and well-established social norms. We therefore hypothesized that in addition to endorsing more expressive display rules, individuals from heterogeneous cultures will also produce facial expressions that are easier to recognize by people from other cultures. By reanalyzing cross-cultural emotion recognition data from 92 papers and 82 cultures, we show that emotion expressions of people from heterogeneous cultures are more easily recognized by observers from other cultures than are the expressions produced in homogeneous cultures. Heterogeneity influences expression recognition rates alongside the individualism-collectivism of the perceivers' culture, as more individualistic cultures were more accurate in emotion judgments than collectivistic cultures. This work reveals the present-day behavioral consequences of long-term historical migration patterns and demonstrates the predictive power of historical heterogeneity. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Vacca, G.; Pili, D.; Fiorino, D. R.; Pintus, V.
2017-05-01
The presented work is part of the research project titled "Tecniche murarie tradizionali: conoscenza per la conservazione ed il miglioramento prestazionale" (Traditional building techniques: from knowledge to conservation and performance improvement), whose purpose is to study the building techniques of the 13th-18th centuries in the Sardinia Region (Italy) for their documentation, conservation, and promotion. The end purpose of the entire study is to improve the performance of the examined structures. In particular, the task of the authors within the research project was to build a WebGIS to manage the data collected during the examination and study phases. This infrastructure was entirely built using Open Source software. The work consisted of designing a database built in PostgreSQL with its spatial extension PostGIS, which allows feature geometries and spatial data to be stored and managed. Data input is performed via a form built in HTML and PHP. The HTML part is based on Bootstrap, an open-source toolkit for websites and web applications. The implementation of this template used both PHP and Javascript code. The PHP code manages the reading and writing of data to the database, using embedded SQL queries. As of today, we have surveyed and archived more than 300 buildings, belonging to three main macro categories: fortification architectures, religious architectures, and residential architectures. More than 150 masonry samples have been investigated in relation to the construction techniques. The database is published on the Internet as a WebGIS built using the Leaflet Javascript open-source libraries, which allow map sites to be created with background maps and navigation, input, and query tools. This, too, uses an interaction of HTML, Javascript, PHP, and SQL code.
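The project issues its spatial SQL from PHP; purely as an illustration of the same PostGIS query pattern, here is a Python/psycopg2 sketch. The table, columns, category value, and credentials are all invented, not the project's actual schema.

```python
# Illustrative PostGIS query against a hypothetical "buildings" table.
import psycopg2

conn = psycopg2.connect(dbname="masonry", user="webgis")  # assumed credentials
with conn, conn.cursor() as cur:
    # Find surveyed buildings of one macro category inside a bounding box,
    # using the PostGIS geometry column (SRID 4326 assumed).
    cur.execute(
        """
        SELECT name, category, construction_period
        FROM buildings
        WHERE category = %s
          AND ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
        """,
        ("fortification", 8.5, 38.8, 9.9, 41.3),
    )
    for name, category, period in cur.fetchall():
        print(name, category, period)
```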
A Communication Framework for Collaborative Defense
2009-02-28
...been able to provide sufficient automation to build up the most extensive application signature database in the world with a fraction of... techniques that are well understood in the context of databases. These techniques allow users to quickly scan for the existence of a key in a database.
Web application for detailed real-time database transaction monitoring for CMS condition data
NASA Astrophysics Data System (ADS)
de Gruttola, Michele; Di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio
2012-12-01
In the LHC era, databases have become an essential part of the experiments collecting data from the LHC, in order to safely store, and consistently retrieve, the wide amount of data produced by different sources. In the CMS experiment at CERN, all this information is stored in ORACLE databases hosted on several servers, both inside and outside the CERN network. In this scenario, the task of monitoring the different databases is a crucial database administration issue, since different information may be required depending on users' tasks, such as data transfer, inspection, planning, and security. We present here a web application based on a Python web framework and Python modules for data mining. To customize the GUI, we record traces of user interactions that are used to build use-case models. In addition, the application detects errors in database transactions (for example, identifying mistakes made by users, application failures, unexpected network shutdowns, or Structured Query Language (SQL) statement errors) and provides warning messages from the different users' perspectives. Finally, in order to fulfill the requirements of the CMS experiment community and to keep pace with new developments in Web client tools, the application was further developed and new features were deployed.
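A minimal sketch of the transaction-scanning idea described above, assuming a simplified record layout rather than the real CMS/ORACLE schema: failed statements are reported per user, and repeated failures are escalated.

```python
# Hypothetical transaction records; "sql_error" holds the failure text, if any.
from collections import Counter

transactions = [
    {"user": "popcon_writer", "sql_error": None},
    {"user": "popcon_writer", "sql_error": "ORA-00942: table or view does not exist"},
    {"user": "popcon_writer", "sql_error": "ORA-03113: end-of-file on communication channel"},
    {"user": "reader", "sql_error": None},
]

def scan(txns, max_errors_per_user=1):
    """Collect per-transaction warnings and escalate users with repeated failures."""
    warnings = [f"{t['user']}: {t['sql_error']}" for t in txns if t["sql_error"]]
    errors = Counter(t["user"] for t in txns if t["sql_error"])
    for user, n in errors.items():
        if n > max_errors_per_user:  # threshold is an invented example value
            warnings.append(f"ALERT: {user} exceeded {max_errors_per_user} failed transactions")
    return warnings

print("\n".join(scan(transactions)))
```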
Code of Federal Regulations, 2010 CFR
2010-10-01
... is determined. (c) Application. If a provider has inadequate historical cost records for pre-1966... depreciable-type assets, allowance in lieu of specific recognition of other costs, or return on equity capital... either has no historical cost records or has incomplete records, the determination of historical cost may...
Implementation of Quality Assurance and Quality Control Measures in the National Phenology Database
NASA Astrophysics Data System (ADS)
Gerst, K.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Barnett, L.
2015-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database has over 5.5 million observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation, and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, and vetted, documented methodologies and protocols. The USA-NPN has implemented a number of measures to ensure both quality assurance and quality control. Here we describe the resources that have been developed so that incoming data submitted by both citizen and professional scientists are reliable; these include training materials, such as a botanical primer and species profiles. We also describe a number of automated quality control processes applied to incoming data streams to optimize the quality of the data output. Existing and planned quality control measures for output of raw and derived data include: (1) Validation of site locations, including latitude, longitude, and elevation; (2) Flagging of records that conflict for a given date for an individual plant; (3) Flagging where species occur outside known ranges; (4) Flagging of records when phenophases occur outside of the plausible order for a species; (5) Flagging of records when intensity measures do not follow a plausible progression for a phenophase; (6) Flagging of records when a phenophase occurs outside of the plausible season; and (7) Quantification of precision and uncertainty for estimation of phenological metrics. Finally, we will describe preliminary work to develop methods for outlier detection that will inform plausibility checks. Ultimately we aim to maximize the quality of USA-NPN data and data products to ensure that this database can continue to be reliably applied to science and decision-making at multiple scales and for multiple applications.
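A hedged sketch of checks (1), (4), and (6) from the list above; the field names, elevation bounds, phenophase sequence, and season cutoff are illustrative assumptions, not USA-NPN's actual rules.

```python
# Toy QC flagging for a single phenology observation record.
PHENOPHASE_ORDER = ["breaking leaf buds", "leaves", "colored leaves", "falling leaves"]

def qc_flags(record, previous_phenophase=None):
    flags = []
    # (1) validate site location
    if not (-90 <= record["lat"] <= 90 and -180 <= record["lon"] <= 180):
        flags.append("bad_location")
    if not (-100 <= record["elev_m"] <= 9000):        # assumed plausible range
        flags.append("bad_elevation")
    # (4) phenophase outside the plausible order for the species
    if previous_phenophase is not None:
        if PHENOPHASE_ORDER.index(record["phenophase"]) < PHENOPHASE_ORDER.index(previous_phenophase):
            flags.append("phenophase_out_of_order")
    # (6) phenophase outside the plausible season (day-of-year cutoff is invented)
    if record["phenophase"] == "breaking leaf buds" and record["doy"] > 240:
        flags.append("out_of_season")
    return flags

rec = {"lat": 42.3, "lon": -71.1, "elev_m": 40, "phenophase": "breaking leaf buds", "doy": 300}
print(qc_flags(rec, previous_phenophase="leaves"))
```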
Impacts of air pollution and climate on materials in Athens, Greece
NASA Astrophysics Data System (ADS)
Christodoulakis, John; Tzanis, Chris G.; Varotsos, Costas A.; Ferm, Martin; Tidblad, Johan
2017-01-01
For more than 10 years now the National and Kapodistrian University of Athens, Greece, has contributed to the UNECE (United Nations Economic Commission for Europe) ICP Materials (International Co-operative Programme on Effects on Materials including Historic and Cultural Monuments) programme for monitoring the corrosion/soiling levels of different kinds of materials due to environmental air-quality parameters. In this paper we present the results obtained from the analysis of observational data that were collected in Athens during the period 2003-2012. According to these results, the corrosion/soiling of the particular exposed materials tends to decrease over the years, except for the case of copper. Based on this long experimental database that is applicable to the multi-pollutant situation in the Athens basin, we present dose-response functions (DRFs), where "dose" stands for the air pollutant concentration, "response" for the material mass loss (normally per annum), and "function" for the relationship derived by the best statistical fit to the data.
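As a toy illustration of the DRF concept (not an ICP Materials function, which combines several pollutants and climate parameters), here is a one-pollutant linear fit by least squares; all numbers are invented.

```python
# Fit a simple linear dose-response function ML = a + b * [SO2].
import numpy as np

so2 = np.array([2.0, 4.5, 7.1, 9.8, 13.2])        # dose: ug/m3 (invented)
mass_loss = np.array([5.1, 6.0, 7.4, 8.2, 9.9])   # response: g/m2 per year (invented)

b, a = np.polyfit(so2, mass_loss, 1)              # "function": best statistical fit
print(f"ML = {a:.2f} + {b:.2f} * [SO2]")
```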
Zuppa, Athena; Vijayakumar, Sundararajan; Jayaraman, Bhuvana; Patel, Dimple; Narayan, Mahesh; Vijayakumar, Kalpana; Mondick, John T; Barrett, Jeffrey S
2007-09-01
Drug utilization in the inpatient setting can provide a mechanism to assess drug prescribing trends, efficiency, and cost-effectiveness of hospital formularies and examine subpopulations for which prescribing habits may be different. Such data can be used to correlate trends with time-dependent or seasonal changes in clinical event rates or the introduction of new pharmaceuticals. It is now possible to provide a robust, dynamic analysis of drug utilization in a large pediatric inpatient setting through the creation of a Web-based hospital drug utilization system that retrieves source data from our accounting database. The production implementation provides a dynamic and historical account of drug utilization at the authors' institution. The existing application can easily be extended to accommodate a multi-institution environment. The creation of a national or even global drug utilization network would facilitate the examination of geographical and/or socioeconomic influences in drug utilization and prescribing practices in general.
Real Time Computation of Kinetic Constraints to Support Equilibrium Reconstruction
NASA Astrophysics Data System (ADS)
Eggert, W. J.; Kolemen, E.; Eldon, D.
2016-10-01
A new method for quickly and automatically applying kinetic constraints to EFIT equilibrium reconstructions using readily available data is presented. The ultimate goal is to produce kinetic equilibrium reconstructions in real time and use them to constrain the DCON stability code as part of a disruption avoidance scheme. A first effort presented here replaces CPU-time expensive modules, such as the fast ion pressure profile calculation, with a simplified model. We show with a DIII-D database analysis that we can achieve reasonable predictions for selected applications by modeling the fast ion pressure profile and determining the fit parameters as functions of easily measured quantities including neutron rate and electron temperature on axis. Secondly, we present a strategy for treating Thomson scattering and Charge Exchange Recombination data to automatically form constraints for a kinetic equilibrium reconstruction, a process that historically was performed by hand. Work supported by US DOE DE-AC02-09CH11466 and DE-FC02-04ER54698.
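A rough sketch of the substitution described, under an assumed linear form and invented numbers: regress the fast-ion pressure amplitude on neutron rate and on-axis electron temperature, then evaluate the cheap stand-in in place of the expensive module.

```python
# Hypothetical regression replacing an expensive fast-ion pressure calculation.
import numpy as np

# Columns: neutron rate [1e15 /s], on-axis Te [keV]; targets: fitted amplitudes.
X = np.array([[1.2, 2.1], [2.4, 3.0], [3.1, 3.6], [4.0, 4.2]])   # invented data
amp = np.array([0.8, 1.5, 2.0, 2.6])

A = np.column_stack([np.ones(len(X)), X])         # design matrix [1, Sn, Te0]
coef, *_ = np.linalg.lstsq(A, amp, rcond=None)

def fast_ion_amplitude(neutron_rate, te0):
    """Cheap model usable inside a real-time reconstruction loop."""
    return coef @ np.array([1.0, neutron_rate, te0])

print(fast_ion_amplitude(2.8, 3.3))
```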
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Chini, Marco; Matgen, Patrick; Giustarini, Laura
2013-04-01
There is a clear need for developing innovative processing chains based on earth observation (EO) data to generate products supporting emergency response and flood management at a global scale. Here an automatic flood mapping application is introduced. The latter is currently hosted on the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas using both recent and historical acquisitions of SAR data in an operational framework. It is worth mentioning that the method can be applied to both medium- and high-resolution SAR images. The flood mapping application consists of two main blocks: 1) A set of query tools for selecting the "crisis image" and the optimal corresponding pre-flood "reference image" from the G-POD archive. 2) An algorithm for extracting flooded areas using the previously selected "crisis image" and "reference image". The proposed method is a hybrid methodology, which combines histogram thresholding, region growing, and change detection as an approach enabling the automatic, objective, and reliable flood extent extraction from SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. Change detection with respect to a pre-flood reference image helps reduce over-detection of inundated areas. The algorithms are computationally efficient and operate with minimum data requirements, considering as input data a flood image and a reference image. Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate pre-flood reference image. Potential users will also be able to apply the implemented flood delineation algorithm. Case studies of several recent high-magnitude flooding events (e.g. the July 2007 Severn River flood, UK, and the March 2010 Red River flood, US) observed by high-resolution SAR sensors as well as airborne photography highlight advantages and limitations of the online application. A mid-term target is the exploitation of ESA SENTINEL 1 SAR data streams. In the long term, an extension of the application is foreseen for systematically extracting flooded areas from all SAR images acquired on a daily, weekly, or monthly basis. On-going research activities investigate the usefulness of the method for mapping flood hazard at global scale using databases of historic SAR remote sensing-derived flood inundation maps.
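The hybrid scheme (histogram thresholding, region growing, change detection) can be sketched on toy arrays as below; the calibrated "open water" statistics and SAR preprocessing of the operational chain are omitted, and all thresholds are invented assumptions.

```python
# Toy hybrid flood mapping: threshold -> region growing -> change detection.
import numpy as np
from scipy import ndimage

def flood_map(crisis_db, reference_db, water_thresh=-15.0, grow_thresh=-12.0, change_db=-3.0):
    """crisis_db/reference_db: backscatter in dB. Returns a boolean flood mask."""
    seeds = crisis_db < water_thresh                  # confident open-water pixels
    candidate = crisis_db < grow_thresh               # region-growing tolerance
    labels, _ = ndimage.label(candidate)              # connected candidate regions
    grown = np.isin(labels, np.unique(labels[seeds & (labels > 0)]))
    changed = (crisis_db - reference_db) < change_db  # darker than pre-flood image
    return grown & changed                            # suppresses permanent water / over-detection

crisis = np.random.default_rng(0).normal(-10, 3, (64, 64))   # synthetic scene
reference = np.full((64, 64), -8.0)
print(flood_map(crisis, reference).sum(), "flooded pixels")
```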
Neuromyelitis optica: Application of computer diagnostics to historical case reports.
Garcia Reitboeck, Pablo; Garrard, Peter; Peters, Timothy
2017-01-01
The retrospective diagnosis of illnesses by medical historians can often be difficult and prone to bias, although knowledge of the medical disorders of historical figures is key to the understanding of their behavior and reactions. The recent application of computer diagnostics to historical figures allows an objective differential diagnosis to be accomplished. Taking an example from clinical neurology, we analyzed the earliest reported cases of Devic's disease (neuromyelitis optica) that commonly affects the optic nerve and spinal cord and was previously often confused with multiple sclerosis. We conclude that in most identified cases the software concurred with the contemporary physicians' interpretation, but some claimed cases either had insufficient data to provide a diagnosis or other possible diagnoses were suggested that had not been considered. Computational methods may, therefore, help historians to diagnose the ailments of historical figures with greater objectivity.
Functional brain imaging in neuropsychology over the past 25 years.
Roalf, David R; Gur, Ruben C
2017-11-01
Outline effects of functional neuroimaging on neuropsychology over the past 25 years. Functional neuroimaging methods and studies will be described that provide a historical context, offer examples of the utility of neuroimaging in specific domains, and discuss the limitations and future directions of neuroimaging in neuropsychology. Tracking the history of publications on functional neuroimaging related to neuropsychology indicates early involvement of neuropsychologists in the development of these methodologies. Initial progress in neuropsychological application of functional neuroimaging has been hampered by costs and the exposure to ionizing radiation. With rapid evolution of functional methods-in particular functional MRI (fMRI)-neuroimaging has profoundly transformed our knowledge of the brain. Its current applications span the spectrum of normative development to clinical applications. The field is moving toward applying sophisticated statistical approaches that will help elucidate distinct neural activation networks associated with specific behavioral domains. The impact of functional neuroimaging on clinical neuropsychology is more circumscribed, but the prospects remain enticing. The theoretical insights and empirical findings of functional neuroimaging have been led by many neuropsychologists and have transformed the field of behavioral neuroscience. Thus far they have had limited effects on the clinical practices of neuropsychologists. Perhaps it is time to add training in functional neuroimaging to the clinical neuropsychologist's toolkit and from there to the clinic or bedside. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Integrated knowledge-based tools for documenting and monitoring damages to built heritage
NASA Astrophysics Data System (ADS)
Cacciotti, R.
2015-08-01
The advancements of information technologies as applied to the most diverse fields of science define a breakthrough in the accessibility and processing of data for both expert and non-expert users. Nowadays there is an increasingly relevant research effort in domains, such as cultural heritage protection, in which knowledge mapping and sharing constitute critical prerequisites for accomplishing complex professional tasks. The aim of this paper is to outline the main results and outputs of the MONDIS research project. This project focuses on the development of integrated knowledge-based tools grounded in an ontological representation of the field of heritage conservation. The scope is to overcome the limitations of earlier databases through the application of modern semantic technologies able to integrate, organize, and process useful information concerning damages to built heritage objects. In particular, MONDIS addresses the need to support a diverse range of stakeholders (e.g., administrators, owners, and professionals) in documenting and monitoring damages to historical constructions and in finding related remedies. The paper concentrates on the presentation of the following integrated knowledge-based components developed within the project: (I) the MONDIS mobile application (plus a desktop version), (II) the MONDIS record explorer, (III) Ontomind profiles, (IV) the knowledge matrix, and (V) the terminology editor. An example of the practical application of the MONDIS integrated system is also provided and discussed.
PhOD - The Global Drifter Program
Database holdings through February 2018 are available for download, in full or as subsets (as of May 2018; e.g., buoydata_15001_feb18.dat-gz, dirfl_15001_feb18.dat). Many historical drogue-off dates have been reevaluated. Milestone reached: on Tuesday, May 1, 2018, NOAA's…
ERIC Educational Resources Information Center
Luxford, Cynthia J.; Linenberger, Kimberly J.; Raker, Jeffrey R.; Baluyut, John Y.; Reed, Jessica J.; De Silva, Chamila; Holme, Thomas A.
2015-01-01
As a discipline, chemistry enjoys a unique position. While many academic areas prepared "cooperative examinations" in the 1930s, only chemistry maintained the activity within what has become the ACS Examinations Institute. As a result, the long-term existence of community-built, norm-referenced, standardized exams provides a historical…
ERIC Educational Resources Information Center
McCarron, Mary; Carroll, Rachael; Kelly, Caraiosa; McCallion, Philip
2015-01-01
Background: Historically, there has been higher and earlier mortality among people with intellectual disability as compared to the general population, but there have also been methodological problems and differences in the available studies. Method: Data were drawn from the 2012 National Intellectual Disability Database and the Census in Ireland. A…
Site characteristics of red spruce witness tree locations in the uplands of West Virginia, USA
Melissa Thomas-Van Gundy; Michael Strager; James Rentch
2012-01-01
Knowledge, both of the historical range of spruce-dominated forests and associated site conditions, is needed by land managers to help define restoration goals and potential sites for restoration. We used an existing digital database of witness trees listed in deeds from 1752 to 1899 to compare characteristics of red spruce (Picea rubens Sarg.) sites...
2008-03-01
software-development environment. Frank W. Bentrem, Ph.D., John T. Sample, Ph.D., and Michael M. Harris. The Naval Research Laboratory (NRL) is the… sonars (Through-the-Sensor technology), supercomputer-generated numerical models, and historical/climatological databases. It uses a variety of
Use of the Geographic Information System (GIS) in nurseries
Brent Olson; Chad Loreth
2002-01-01
The use of GIS in nursery operations provides a variety of opportunities. All planning activities can be incorporated into an accessible database. GIS can be used to create ways for employees to access and analyze data. The program can be used for historical record keeping. Use of GIS in planning can improve the efficiency of nursery operations. GIS can easily be used...
Nonlinear analysis of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea
NASA Astrophysics Data System (ADS)
Rojo-Garibaldi, Berenice; Salas-de-León, David Alberto; Adela Monreal-Gómez, María; Sánchez-Santillán, Norma Leticia; Salas-Monreal, David
2018-04-01
Hurricanes are complex systems that carry large amounts of energy. Their impact often produces natural disasters involving the loss of human lives and of material assets, such as infrastructure, valued at billions of US dollars. However, not everything about hurricanes is negative, as hurricanes are the main source of rainwater for the regions where they develop. This study shows a nonlinear analysis of the time series of the occurrence of hurricanes in the Gulf of Mexico and the Caribbean Sea from 1749 to 2012. The hurricane time series was constructed from the North Atlantic basin hurricane database (HURDAT) and published historical information. The hurricane time series provides a unique historical record of ocean-atmosphere interactions. The Lyapunov exponent indicated that the system presents chaotic dynamics, and the spectral and nonlinear analyses of the hurricane time series showed chaotic-edge behavior. One possible explanation for this chaotic edge is the individual chaotic behavior of hurricanes, either by category or individually regardless of their category, and their behavior on a regular basis.
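For readers unfamiliar with the diagnostic mentioned, here is a rough Rosenstein-style sketch of estimating the largest Lyapunov exponent from a scalar series; it is not the authors' pipeline, and the embedding and horizon parameters are arbitrary.

```python
# Rosenstein-style estimate of the largest Lyapunov exponent (toy version).
import numpy as np

def largest_lyapunov(x, m=3, tau=1, min_sep=10, horizon=20):
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])  # delay embedding
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)      # pairwise distances
    for i in range(n):                                                 # exclude temporal neighbours
        d[i, max(0, i - min_sep):i + min_sep + 1] = np.inf
    nn = d.argmin(axis=1)                                              # nearest neighbour of each point
    div = []
    for k in range(1, horizon):                                        # mean log separation k steps ahead
        seps = [np.linalg.norm(emb[i + k] - emb[nn[i] + k])
                for i in range(n) if i + k < n and nn[i] + k < n]
        seps = [s for s in seps if s > 0]
        div.append(np.log(seps).mean())
    # slope of mean log-divergence vs. time ~ largest Lyapunov exponent
    return np.polyfit(np.arange(1, horizon), div, 1)[0]

x = [0.4]
for _ in range(500):                       # chaotic logistic map as a test series
    x.append(3.9 * x[-1] * (1 - x[-1]))
print(largest_lyapunov(x))                 # positive value suggests chaotic dynamics
```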
NASA Astrophysics Data System (ADS)
Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-05-01
A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
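A hedged sketch of the multi-thread grid search idea on a toy one-tissue model: candidate rate constants are scored against a noisy target curve, with a small penalty pulling toward values from a hypothetical historical reference. The model form, grid, and weights are all assumptions, not the authors' method.

```python
# Toy parallel grid search for kinetic model parameters.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

t = np.linspace(0.1, 60, 30)                       # minutes (invented sampling)
truth = 0.4 * np.exp(-0.08 * t)                    # toy tissue time-activity curve
target = truth + np.random.default_rng(1).normal(0, 0.02, t.size)
reference = {"K1": 0.35, "k2": 0.10}               # values from a hypothetical reference database

def score(params):
    K1, k2 = params
    model = K1 * np.exp(-k2 * t)                   # toy one-tissue response
    fit_err = np.mean((model - target) ** 2)
    prior = 0.001 * ((K1 - reference["K1"]) ** 2 + (k2 - reference["k2"]) ** 2)
    return fit_err + prior, params                 # historical prior tempers overfitting

grid = [(K1, k2) for K1 in np.linspace(0.1, 0.8, 30) for k2 in np.linspace(0.01, 0.3, 30)]
with ThreadPoolExecutor() as pool:                 # multi-thread grid parameter search
    best = min(pool.map(score, grid))
print("best (K1, k2):", best[1])
```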
Using Proxy Records to Document Gulf of Mexico Tropical Cyclones from 1820-1915
Rohli, Robert V.; DeLong, Kristine L.; Harley, Grant L.; Trepanier, Jill C.
2016-01-01
Observations of pre-1950 tropical cyclones are sparse due to observational limitations; therefore, the hurricane database HURDAT2 (1851–present) maintained by the National Oceanic and Atmospheric Administration may be incomplete. Here we provide additional documentation for HURDAT2 from historical United States Army fort records (1820–1915) and other archived documents for 28 landfalling tropical cyclones, 20 of which are included in HURDAT2, along the northern Gulf of Mexico coast. One event that occurred in May 1863 is not currently documented in the HURDAT2 database but has been noted in other studies. We identify seven tropical cyclones that occurred before 1851, three of which are potential tropical cyclones. We corroborate the pre-HURDAT2 storms with a tree-ring reconstruction of hurricane impacts from the Florida Keys (1707–2009). Using this information, we suggest landfall locations for the July 1822 hurricane just west of Mobile, Alabama, and for the 1831 hurricane near Last Island, Louisiana, on 18 August. Furthermore, we model the probable track of the August 1831 hurricane using the weighted average distance grid method, which incorporates historical tropical cyclone tracks to supplement report locations. PMID:27898726
Recent advances on terrain database correlation testing
NASA Astrophysics Data System (ADS)
Sakude, Milton T.; Schiavone, Guy A.; Morelos-Borja, Hector; Martin, Glenn; Cortes, Art
1998-08-01
Terrain database correlation is a major requirement for interoperability in distributed simulation. There are numerous situations in which terrain database correlation problems can occur that, in turn, lead to a lack of interoperability in distributed training simulations. Examples are the use of different run-time terrain databases derived from inconsistent source data, the use of different resolutions, and the use of different data models between databases for both terrain and culture data. IST has been developing a suite of software tools, named ZCAP, to address terrain database interoperability issues. In this paper we discuss recent enhancements made to this suite, including improved algorithms for sampling and calculating line-of-sight, an improved method for measuring terrain roughness, and the application of a sparse matrix method to the terrain remediation solution developed at the Visual Systems Lab of the Institute for Simulation and Training. We review the application of some of these new algorithms to the terrain correlation measurement processes. These new algorithms improve our support for very large terrain databases and provide the capability to perform test replications to estimate the sampling error of the tests. With this set of tools, a user can quantitatively assess the degree of correlation between large terrain databases.
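One of the correlation tests mentioned, line-of-sight comparison between two databases, can be sketched as follows; the sampling and interpolation are simplified relative to the ZCAP tools, and the terrain grids are synthetic.

```python
# Toy line-of-sight agreement test between two terrain elevation grids.
import numpy as np

def line_of_sight(grid, a, b, samples=50):
    """True if the straight segment from cell a to cell b clears the terrain."""
    (r0, c0), (r1, c1) = a, b
    ts = np.linspace(0, 1, samples)
    rows = np.round(r0 + ts * (r1 - r0)).astype(int)   # nearest-cell sampling
    cols = np.round(c0 + ts * (c1 - c0)).astype(int)
    ray = grid[r0, c0] + ts * (grid[r1, c1] - grid[r0, c0])  # sight-line elevation
    return bool(np.all(ray + 1e-6 >= grid[rows, cols]))

rng = np.random.default_rng(2)
db_a = rng.normal(100, 5, (64, 64))                    # first terrain database
db_b = db_a + rng.normal(0, 2, (64, 64))               # second, slightly different database
pairs = [((5, 5), (60, 60)), ((10, 50), (50, 10))]     # observer/target cell pairs
agree = sum(line_of_sight(db_a, *p) == line_of_sight(db_b, *p) for p in pairs)
print(f"LOS agreement: {agree}/{len(pairs)} observer/target pairs")
```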