Evaluation of Acoustic Propagation Paths into the Human Head
2005-07-25
...air-borne sound pressure amplitude) via the alternate propagation paths. A 3D finite-element solid mesh was constructed using a digital image database of an adult male head. Coupled acoustic-mechanical finite-element analysis (FEA) was used to model the wave propagation through the fluid-solid...
Operative record using intraoperative digital data in neurosurgery.
Houkin, K; Kuroda, S; Abe, H
2000-01-01
The purpose of this study was to develop a new method for more efficient and accurate operative records using intra-operative digital data in neurosurgery, covering both macroscopic procedures and microscopic procedures under an operating microscope. Macroscopic procedures were recorded using a digital camera, and microscopic procedures were recorded using a microdigital camera attached to the operating microscope. Operative records were then compiled digitally and filed in a computer using image-retouching software and database software. The time necessary for editing the digital data and completing the record was less than 30 minutes. Once these operative records are digitally filed, they are easily transferred and used as a database. Using digital operative records along with digital photography, neurosurgeons can document their procedures more accurately and efficiently than by the conventional method (handwriting). A complete digital operative record is not only accurate but also time saving. Construction of a database, data transfer, and desktop publishing can all be achieved using the intra-operative data, including intra-operative photographs.
ERIC Educational Resources Information Center
Kahle, Brewster; Prelinger, Rick; Jackson, Mary E.; Boyack, Kevin W.; Wylie, Brian N.; Davidson, George S.; Witten, Ian H.; Bainbridge, David; Boddie, Stefan J.; Garrison, William A.; Cunningham, Sally Jo; Borgman, Christine L.; Hessel, Heather
2001-01-01
These six articles discuss various issues relating to digital libraries. Highlights include public access to digital materials; intellectual property concerns; the need for collaboration across disciplines; Greenstone software for construction and presentation of digital information collections; the Colorado Digitization Project; and conferences…
Algorithms and methodology used in constructing high-resolution terrain databases
NASA Astrophysics Data System (ADS)
Williams, Bryan L.; Wilkosz, Aaron
1998-07-01
This paper presents a top-level description of methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of digital ground plane elevation map, vegetation height elevation map, material classification map, object data (tanks, buildings, etc.), and temperature radiance map. Steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation map development; and methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification which consists of validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest, and are fully compatible with the targeted digital simulators.
Digital Dental X-ray Database for Caries Screening
NASA Astrophysics Data System (ADS)
Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila
2016-06-01
A standard database is an essential requirement for comparing the performance of image analysis techniques; a main obstacle in dental image analysis has been the lack of an available image database, which this paper provides. Periapical dental X-ray images that are suitable for analysis and approved by many dental experts were collected. This type of dental radiograph is common and inexpensive and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 periapical X-ray images covering the upper and lower jaws. This digital dental database was constructed to provide a source that researchers can use to apply and compare image analysis techniques and to improve the performance of each technique.
Keyless Entry: Building a Text Database Using OCR Technology.
ERIC Educational Resources Information Center
Grotophorst, Clyde W.
1989-01-01
Discusses the use of optical character recognition (OCR) technology to produce an ASCII text database. A tutorial on digital scanning and OCR is provided, and a systems integration project which used the Calera CDP-3000XF scanner and text retrieval software to construct a database of dissertations at George Mason University is described. (four…
Large-scale feature searches of collections of medical imagery
NASA Astrophysics Data System (ADS)
Hedgcock, Marcus W.; Karshat, Walter B.; Levitt, Tod S.; Vosky, D. N.
1993-09-01
Large scale feature searches of accumulated collections of medical imagery are required for multiple purposes, including clinical studies, administrative planning, epidemiology, teaching, quality improvement, and research. To perform a feature search of large collections of medical imagery, one can either search text descriptors of the imagery in the collection (usually the interpretation), or (if the imagery is in digital format) the imagery itself. At our institution, text interpretations of medical imagery are all available in our VA Hospital Information System. These are downloaded daily into an off-line computer. The text descriptors of most medical imagery are usually formatted as free text, and so require a user friendly database search tool to make searches quick and easy for any user to design and execute. We are tailoring such a database search tool (Liveview), developed by one of the authors (Karshat). To further facilitate search construction, we are constructing (from our accumulated interpretation data) a dictionary of medical and radiological terms and synonyms. If the imagery database is digital, the imagery which the search discovers is easily retrieved from the computer archive. We describe our database search user interface, with examples, and compare the efficacy of computer assisted imagery searches from a clinical text database with manual searches. Our initial work on direct feature searches of digital medical imagery is outlined.
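Where the abstract describes synonym-assisted searches of free-text interpretations, a minimal sketch may help; this is not the Liveview tool itself, and the synonym table, report texts, and function names below are illustrative assumptions.

```python
# Minimal sketch: expand a query term with dictionary synonyms before searching
# free-text interpretations. All data and names here are illustrative.
SYNONYMS = {
    "pneumothorax": ["ptx", "collapsed lung"],
    "nodule": ["mass", "opacity"],
}

def expand_terms(term):
    """Return the query term plus any dictionary synonyms, lower-cased."""
    return [term.lower()] + [s.lower() for s in SYNONYMS.get(term.lower(), [])]

def search_reports(reports, term):
    """Return report ids whose free text contains the term or a synonym."""
    wanted = expand_terms(term)
    return [rid for rid, text in reports.items()
            if any(w in text.lower() for w in wanted)]

reports = {
    101: "Small right apical pneumothorax, otherwise clear.",
    102: "No acute findings. Stable pulmonary nodule in left upper lobe.",
}
print(search_reports(reports, "Nodule"))   # -> [102]
```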
Overview of Historical Earthquake Document Database in Japan and Future Development
NASA Astrophysics Data System (ADS)
Nishiyama, A.; Satake, K.
2014-12-01
In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not used effectively by researchers because low-reliability historical records are mixed in and because searching by keyword, character, or date is difficult. To overcome these problems and to promote historical earthquake studies in Japan, construction of text databases started in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its compilers examined the source books or original texts of the historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another effort compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast of Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century) and constructed a text database and a seismic intensity database. These are now available on the web (in Japanese only). However, only about 9% of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents through a research program that started in 2014. The specification of the new database will be similar to the previous ones. We also plan to combine this database with a liquefaction-trace database, to be constructed by another research program, by adding the location information described in the historical documents. The constructed database could be used to estimate the distributions of seismic intensities and tsunami heights.
NOAA Photo Library. Skip Theberge (NOAA Central Library) -- collection development, site content, image digitization, and database construction. Kristin Ward (NOAA Central Library) -- HTML page construction.
Fast Multiclass Segmentation using Diffuse Interface Methods on Graphs
2013-02-01
...000 28 × 28 images of handwritten digits 0 through 9. Examples of entries can be found in Figure 6. The task is to classify each of the images into the class of the corresponding digit. The images include digits from 0 to 9; thus, this is a 10 class segmentation problem. To construct the weight matrix, we used N... [Excerpt also cites "The MNIST database of handwritten digits." [Online]. Available: http://yann.lecun.com/exdb/mnist/ and [36] J. Lellmann, J. H. Kappes, J. Yuan, F. Becker, and C...]
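The excerpt breaks off at the weight-matrix construction, so the paper's exact recipe is not shown here. As a hedged illustration only, a common choice for graph-based segmentation of digit images is a Gaussian similarity restricted to the k nearest neighbours:

```python
# Hedged sketch of one common graph weight-matrix construction for digit images
# (the paper's actual construction may differ, e.g. it may use local scaling).
import numpy as np

def knn_weight_matrix(X, k=10, sigma=None):
    """X: (n_samples, n_features) flattened images. Returns a dense (n, n) weight matrix."""
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)  # squared distances
    if sigma is None:
        sigma = np.sqrt(np.median(d2))        # simple global scale; local scaling is also common
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]     # k nearest neighbours, skipping self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / sigma**2)
    return np.maximum(W, W.T)                 # symmetrize

X = np.random.rand(100, 28 * 28)              # stand-in for flattened digit images
W = knn_weight_matrix(X, k=10)
print(W.shape)                                # (100, 100)
```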
Durack, Jeremy C.; Chao, Chih-Chien; Stevenson, Derek; Andriole, Katherine P.; Dev, Parvati
2002-01-01
Medical media collections are growing at a pace that exceeds the value they currently provide as research and educational resources. To address this issue, the Stanford MediaServer was designed to promote innovative multimedia-based application development. The nucleus of the MediaServer platform is a digital media database strategically designed to meet the information needs of many biomedical disciplines. Key features include an intuitive web-based interface for collaboratively populating the media database, flexible creation of media collections for diverse and specialized purposes, and the ability to construct a variety of end-user applications from the same database to support biomedical education and research. PMID:12463820
Research and development of a digital design system for hull structures
NASA Astrophysics Data System (ADS)
Zhan, Yi-Ting; Ji, Zhuo-Shang; Liu, Yin-Dong
2007-06-01
Methods used for digital ship design were studied and formed the basis of a proposed frame model suitable for ship construction modeling. Based on 3-D modeling software, a digital design system for hull structures was developed. Basic software systems for modeling, modifying, and assembly simulation were developed. The system has good compatibility, and models created by it can be saved in different 3-D file formats, and 2D engineering drawings can be output directly. The model can be modified dynamically, overcoming the necessity of repeated modifications during hull structural design. Through operations such as model construction, intervention inspection, and collision detection, problems can be identified and modified during the hull structural design stage. Technologies for centralized control of the system, database management, and 3-D digital design are integrated into this digital model in the preliminary design stage of shipbuilding.
Digital Earth system based river basin data integration
NASA Astrophysics Data System (ADS)
Zhang, Xin; Li, Wanqing; Lin, Chao
2014-12-01
Digital Earth is an integrated approach to building scientific infrastructure. Digital Earth systems provide a three-dimensional visualization and integration platform for river basin data, including management data, in situ observation data, remote sensing observation data, and model output data. This paper studies river basin data integration technology based on a Digital Earth system. First, the construction of a Digital Earth based three-dimensional river basin data integration environment is discussed. Second, the river basin management data integration technology is presented, realized through a general database access interface, web services, and an ActiveX control. Third, in situ data stored as records in database tables are integrated with the three-dimensional models of the corresponding observation apparatus displayed in the Digital Earth system, linked by a shared ID code. The next two parts discuss the remote sensing data and model output data integration technologies in detail. The application in the Digital Zhang River Basin System of China shows that the method can effectively improve the efficiency of data use and the quality of data visualization.
Prototype of web-based database of surface wave investigation results for site classification
NASA Astrophysics Data System (ADS)
Hayashi, K.; Cakir, R.; Martin, A. J.; Craig, M. S.; Lorenzo, J. M.
2016-12-01
As active and passive surface wave methods become popular for evaluating the site response of earthquake ground motion, demand for a database of investigation results is also increasing. Seismic ground motion depends not only on 1D velocity structure but also on 2D and 3D structures, so spatial information on S-wave velocity must be considered in ground motion prediction; such a database can support the construction of 2D and 3D underground models. Inversion in surface wave processing is essentially non-unique, so other information must be incorporated into the processing, and a database of existing geophysical, geological, and geotechnical investigation results can provide indispensable information for improving the accuracy and reliability of investigations. Most investigations, however, are carried out by individual organizations, and investigation results are rarely stored in a unified, organized database. To study and discuss an appropriate database and a digital standard format for surface wave investigations, we developed a prototype of a web-based database to store the observed data and processing results of surface wave investigations that we have performed at more than 400 sites in the U.S. and Japan. The database was built on a web server using MySQL and PHP, so users can access it through the internet from anywhere with any device. All data are registered in the database with location, and users can search geophysical data through Google Maps. The database stores dispersion curves, horizontal-to-vertical spectral ratios, and S-wave velocity profiles at each site, saved as digital data in XML files so that users can review and reuse them. The database also stores a published 3D deep basin and crustal structure that users can refer to while processing surface wave data.
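A minimal sketch of the storage and location-search idea described above; the actual system uses MySQL and PHP, so the sqlite3 stand-in, table name, and columns below are assumptions for illustration only.

```python
# Illustrative sketch: one row per site with coordinates and XML payloads,
# plus a crude bounding-box search of the kind a web-map front end might issue.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sites (
        id INTEGER PRIMARY KEY,
        name TEXT,
        lat REAL, lon REAL,
        dispersion_xml TEXT,      -- dispersion curve saved as XML
        vs_profile_xml TEXT       -- S-wave velocity profile saved as XML
    )
""")
conn.execute("INSERT INTO sites (name, lat, lon, dispersion_xml, vs_profile_xml) "
             "VALUES (?, ?, ?, ?, ?)",
             ("Site A", 47.6, -122.3, "<curve>...</curve>", "<profile>...</profile>"))

lat0, lon0, d = 47.5, -122.5, 0.5   # search centre and half-width in degrees
rows = conn.execute(
    "SELECT name, lat, lon FROM sites WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
    (lat0 - d, lat0 + d, lon0 - d, lon0 + d)).fetchall()
print(rows)
```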
Intrusive Rock Database for the Digital Geologic Map of Utah
Nutt, C.J.; Ludington, Steve
2003-01-01
Digital geologic maps offer the promise of rapid and powerful answers to geologic questions using Geographic Information System software (GIS). Using modern GIS and database methods, a specialized derivative map can be easily prepared. An important limitation can be shortcomings in the information provided in the database associated with the digital map, a database which is often based on the legend of the original map. The purpose of this report is to show how the compilation of additional information can, when prepared as a database that can be used with the digital map, be used to create some types of derivative maps that are not possible with the original digital map and database. This Open-File Report consists of computer files with information about intrusive rocks in Utah that can be linked to the Digital Geologic Map of Utah (Hintze and others, 2000), an explanation of how to link the databases and map, and a list of references for the databases. The digital map, which represents the 1:500,000-scale Geologic Map of Utah (Hintze, 1980), can be obtained from the Utah Geological Survey (Map 179DM). Each polygon in the map has a unique identification number. We selected the polygons identified on the geologic map as intrusive rock, and constructed a database (UT_PLUT.xls) that classifies the polygons into plutonic map units (see tables). These plutonic map units are the key information that is used to relate the compiled information to the polygons on the map. The map includes a few polygons that were coded as intrusive on the state map but are largely volcanic rock; in these cases we note the volcanic rock names (rhyolite and latite) as used in the original sources. Some polygons identified on the digital state map as intrusive rock were misidentified; these polygons are noted in a separate table of the database, along with some information about their true character. Fields may be empty because of lack of information from references used or difficulty in finding information. The information in the database is from a variety of sources, including geologic maps at scales ranging from 1:500,000 to 1:24,000, and thesis monographs. The references are shown twice: alphabetically and by region. The digital geologic map of Utah (Hintze and others, 2000) classifies intrusive rocks into only 3 categories, distinguished by age. They are: Ti, Tertiary intrusive rock; Ji, Upper to Middle Jurassic granite to quartz monzonite; and pCi, Early Proterozoic to Late Archean intrusive rock. Use of the tables provided in this report will permit selection and classification of those rocks by lithology and age. This database is a pilot study by the Survey and Analysis Project of the U.S. Geological Survey to characterize igneous rocks and link them to a digital map. The database, and others like it, will evolve as the project continues and other states are completed. We release this version now as an example, as a reference, and for those interested in Utah plutonic rocks.
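The linking mechanism described above (each polygon carries a unique id and a plutonic map unit, and the compiled attribute table is joined on that unit) can be illustrated with a small, hypothetical join; the column layout and example values below are not the report's actual schema.

```python
# Sketch of joining compiled plutonic attributes back to map polygons.
# Example plutons, lithologies, and ages are placeholders, not data from the report.
import pandas as pd

polygons = pd.DataFrame({
    "poly_id": [101, 102, 103],                 # unique polygon identification numbers
    "map_unit": ["Ti", "Ti", "Ji"],
    "pluton":   ["Pluton X", "Pluton X", "Pluton Y"],
})
plutons = pd.DataFrame({
    "pluton":    ["Pluton X", "Pluton Y"],
    "lithology": ["granite", "quartz monzonite"],
    "age_ma":    [25.0, 165.0],
})

# one attribute record per polygon, ready to join back to the GIS layer
linked = polygons.merge(plutons, on="pluton", how="left")
print(linked[["poly_id", "lithology", "age_ma"]])
```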
Unconsolidated Aquifers in Tompkins County, New York
Miller, Todd S.
2000-01-01
Unconsolidated aquifers consisting of saturated sand and gravel are capable of supplying large quantities of good-quality water to wells in Tompkins County, but little published geohydrologic information on such aquifers is available. In 1986, the U.S. Geological Survey (USGS) began collecting geohydrologic information and well data to construct an aquifer map showing the extent of unconsolidated aquifers in Tompkins County. Data sources included (1) water-well drillers' logs; (2) highway and other construction test-boring logs; (3) well data gathered by the Tompkins County Department of Health; (4) test-well logs from geohydrologic consultants that conducted projects for site-specific studies; and (5) well data that had been collected during past investigations by the USGS and entered into the National Water Information System (NWIS) database. In 1999, the USGS, in cooperation with the Tompkins County Department of Planning, compiled these data to construct this map. More than 600 well records were entered into the NWIS database in 1999 to supplement the 350 well records already in the database; this provided a total of 950 well records. The data were digitized and imported into a geographic information system (GIS) coverage so that well locations could be plotted on a map, and well data could be tabulated in a digital database through ARC/INFO software. Data on the surficial geology were used with geohydrologic data from well records and previous studies to delineate the extent of aquifers on this map. This map depicts (1) the extent of unconsolidated aquifers in Tompkins County, and (2) locations of wells whose records were entered into the USGS NWIS database and made into a GIS digital coverage. The hydrologic information presented here is generalized and is not intended for detailed site evaluations. Precise locations of geohydrologic-unit boundaries, and a description of the hydrologic conditions within the units, would require additional detailed, site-specific information.
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts toward commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g., user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
Alaska digital aeromagnetic database description
Connard, G.G.; Saltus, R.W.; Hill, P.L.; Carlson, L.; Milicevic, B.
1999-01-01
Northwest Geophysical Associates, Inc. (NGA) was contracted by the U.S. Geological Survey (USGS) to construct a database containing original aeromagnetic data (in digital form) from surveys, maps, and grids for the State of Alaska from existing public-domain magnetic data. This database facilitates the detailed study and interpretation of aeromagnetic data along flightline profiles and allows construction of custom grids for selected regions of Alaska. The database is linked to and reflects the work from the statewide gridded compilation completed under a prior contract. The statewide gridded compilation is also described in Saltus and Simmons (1997) and in Saltus and others (1999). The database area generally covers the on-shore portion of the State of Alaska and the northern Gulf of Alaska, excluding the Aleutian Islands. The area extends from 54°N to 72°N latitude and 129°W to 169°W longitude. The database includes the 85 surveys that were included in the previous statewide gridded compilation. Figure 1 shows the extents of the 85 individual data sets included in the statewide grids. NGA subcontracted a significant portion of the work described in this report to Paterson, Grant, and Watson Limited (PGW). Prior work by PGW (described in Meyer and Saltus, 1995, and Meyer and others, 1998) for the interior portion of Alaska (INTAK) is included in this present study. The previous PGW project compiled 25 of the 85 surveys included in the statewide grids. PGW also contributed 10 additional data sets that were not included in either of the prior contracts or the statewide grids. These additional data sets are included in the current project in the interest of making the database as complete as possible. Figure 2 shows the location of the additional data sets.
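Building a custom grid for a selected region from flightline profiles, as the database is meant to enable, might look roughly like the sketch below; the synthetic data, 500 m cell size, and simple linear interpolation are assumptions, and the actual compilation involved much more careful levelling and merging of surveys.

```python
# Hedged sketch: grid scattered flightline samples (x, y, anomaly) onto a regular grid.
import numpy as np
from scipy.interpolate import griddata

# stand-in flightline samples: easting (m), northing (m), magnetic anomaly (nT)
rng = np.random.default_rng(0)
x = rng.uniform(0, 50_000, 5000)
y = rng.uniform(0, 50_000, 5000)
anom = 100.0 * np.sin(x / 8000.0) + 50.0 * np.cos(y / 6000.0)

# 500 m grid over the selected region
xi = np.arange(0, 50_000, 500.0)
yi = np.arange(0, 50_000, 500.0)
XI, YI = np.meshgrid(xi, yi)
grid = griddata((x, y), anom, (XI, YI), method="linear")
print(grid.shape)   # (100, 100)
```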
Database for the geologic map of Upper Geyser Basin, Yellowstone National Park, Wyoming
Abendini, Atosa A.; Robinson, Joel E.; Muffler, L. J. Patrick; White, D. E.; Beeson, Melvin H.; Truesdell, A. H.
2015-01-01
This dataset contains contacts, geologic units, and map boundaries from Miscellaneous Investigations Series Map I-1371, "Geologic Map of Upper Geyser Basin, Yellowstone National Park, Wyoming." This dataset was constructed to produce a digital geologic map as a basis for ongoing studies of hydrothermal processes.
NASA Astrophysics Data System (ADS)
Sakano, Toshikazu; Yamaguchi, Takahiro; Fujii, Tatsuya; Okumura, Akira; Furukawa, Isao; Ono, Sadayasu; Suzuki, Junji; Ando, Yutaka; Kohda, Ehiichi; Sugino, Yoshinori; Okada, Yoshiyuki; Amaki, Sachi
2000-05-01
We constructed a high-speed medical information network testbed, which is one of the largest testbeds in Japan, and applied it to practical medical checkups for the first time. The constructed testbed, which we call IMPACT, consists of a Super-High Definition Imaging system, a video conferencing system, a remote database system, and a 6-135 Mbps ATM network. The interconnected facilities include the School of Medicine of Keio University, a company's clinic, and an NTT R&D center, all in and around Tokyo. We applied IMPACT to the mass screening of the upper gastrointestinal (UGI) tract at the clinic. All 5419 radiographic images acquired at the clinic for 523 employees were digitized (2048 × 1698 × 12 bits) and transferred to a remote database at NTT. We then picked up about 50 images from five patients and sent them to nine radiological specialists at Keio University. The processing, which includes film digitization, image data transfer, and database registration, took 574 seconds per patient on average. The average reading time at Keio University was 207 seconds. The overall processing time was estimated to be 781 seconds per patient. From these experimental results, we conclude that quasi-real time tele-medical checkups are possible with our prototype system.
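The per-patient timing quoted above is simple arithmetic; the two figures below are taken directly from the abstract, and the throughput estimate is only an illustration.

```python
# Quick check of the per-patient timing figures quoted in the abstract.
digitize_transfer_register = 574   # seconds: film digitization, transfer, database registration
remote_reading = 207               # seconds: average reading time at Keio University
total = digitize_transfer_register + remote_reading
print(total)                       # 781 seconds, i.e. about 13 minutes per patient
print(round(3600 / total, 1))      # roughly 4.6 patients per hour under these assumptions
```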
Evaluation of Acoustic Propagation Paths into the Human Head
2005-04-01
...pressure amplitude) via the alternate propagation paths. A 3D finite-element solid mesh was constructed using a digital image database of an adult... optics, rays are used to depict the path or paths taken as a light wave travels through a lens. However, in optics, the eikonal equation can be solved...
Forsell, M; Häggström, M; Johansson, O; Sjögren, P
2008-11-08
To develop a personal digital assistant (PDA) application for oral health assessment fieldwork, including back-office and database systems (MobilDent). System design, construction and implementation of PDA, back-office and database systems. System requirements for MobilDent were collected, analysed and translated into system functions. User interfaces were implemented and the system architecture was outlined. MobilDent was based on a platform with .NET (Microsoft) components, using SQL Server 2005 (Microsoft) for data storage and the Windows Mobile (Microsoft) operating system. The PDA devices were Dell Axims. System functions and user interfaces were specified for MobilDent. User interfaces for the PDA, back-office and database systems were based on .NET programming. The PDA user interface was based on Windows and sized for a PDA display, whereas the back-office interface was designed for a normal-sized computer screen. A synchronisation module (MS ActiveSync, Microsoft) was used to enable download of field data from the PDA to the database. MobilDent is a feasible application for oral health assessment fieldwork, and the oral health assessment database may prove a valuable source for care planning, educational and research purposes. Further development of the MobilDent system will include wireless connectivity with download-on-demand technology.
Buckets: Smart Objects for Digital Libraries
NASA Technical Reports Server (NTRS)
Nelson, Michael L.
2001-01-01
Current discussion of digital libraries (DLs) is often dominated by the merits of the respective storage, search and retrieval functionality of archives, repositories, search engines, search interfaces and database systems. While these technologies are necessary for information management, the information content is more important than the systems used for its storage and retrieval. Digital information should have the same long-term survivability prospects as traditional hardcopy information and should be protected to the extent possible from evolving search engine technologies and vendor vagaries in database management systems. Information content and information retrieval systems should progress on independent paths and make limited assumptions about the status or capabilities of the other. Digital information can achieve independence from archives and DL systems through the use of buckets. Buckets are an aggregative, intelligent construct for publishing in DLs. Buckets allow the decoupling of information content from information storage and retrieval. Buckets exist within the Smart Objects and Dumb Archives model for DLs in that many of the functionalities and responsibilities traditionally associated with archives are pushed down (making the archives dumber) into the buckets (making them smarter). Some of the responsibilities imbued to buckets are the enforcement of their terms and conditions, and maintenance and display of their contents.
Design of a diagnostic encyclopaedia using AIDA.
van Ginneken, A M; Smeulders, A W; Jansen, W
1987-01-01
Diagnostic Encyclopaedia Workstation (DEW) is the name of a digital encyclopaedia constructed to contain reference knowledge with respect to the pathology of the ovary. Comparing DEW with the common sources of reference knowledge (i.e. books) leads to the following advantages of DEW: it contains more verbal knowledge, pictures and case histories, and it offers information adjusted to the needs of the user. Based on an analysis of the structure of this reference knowledge we have chosen AIDA to develop a relational database and we use a video-disc player to contain the pictorial part of the database. The system consists of a database input version and a read-only run version. The design of the database input version is discussed. Reference knowledge for ovary pathology requires 1-3 Mbytes of memory. At present 15% of this amount is available. The design of the run version is based on an analysis of which information must necessarily be specified to the system by the user to access a desired item of information. Finally, the use of AIDA in constructing DEW is evaluated.
Peng, Jinye; Babaguchi, Noboru; Luo, Hangzai; Gao, Yuli; Fan, Jianping
2010-07-01
Digital video now plays an important role in supporting more profitable online patient training and counseling, and integration of patient training videos from multiple competitive organizations in the health care network will result in better offerings for patients. However, privacy concerns often prevent multiple competitive organizations from sharing and integrating their patient training videos. In addition, patients with infectious or chronic diseases may not want the online patient training organizations to identify who they are or even which video clips they are interested in. Thus, there is an urgent need to develop more effective techniques to protect both video content privacy and access privacy. In this paper, we have developed a new approach to construct a distributed Hippocratic video database system for supporting more profitable online patient training and counseling. First, a new database modeling approach is developed to support concept-oriented video database organization and assign a degree of privacy of the video content for each database level automatically. Second, a new algorithm is developed to protect the video content privacy at the level of individual video clip by filtering out the privacy-sensitive human objects automatically. In order to integrate the patient training videos from multiple competitive organizations for constructing a centralized video database indexing structure, a privacy-preserving video sharing scheme is developed to support privacy-preserving distributed classifier training and prevent the statistical inferences from the videos that are shared for cross-validation of video classifiers. Our experiments on large-scale video databases have also provided very convincing results.
Keller, Gordon R.; Hildenbrand, T.G.; Kucks, R.; Webring, M.; Briesacher, A.; Rujawitz, K.; Hittleman, A.M.; Roman, D.R.; Winester, D.; Aldouri, R.; Seeley, J.; Rasillo, J.; Torres, R.; Hinze, W. J.; Gates, A.; Kreinovich, V.; Salayandia, L.
2006-01-01
Potential field data (gravity and magnetic measurements) are both useful and cost-effective tools for many geologic investigations. Significant amounts of these data are traditionally in the public domain. A new magnetic database for North America was released in 2002, and as a result, a cooperative effort between government agencies, industry, and universities to compile an upgraded digital gravity anomaly database, grid, and map for the conterminous United States was initiated and is the subject of this paper. This database is being crafted into a data system that is accessible through a Web portal. This data system features the database, software tools, and convenient access. The Web portal will enhance the quality and quantity of data contributed to the gravity database, which will be a shared community resource. The system's totally digital nature ensures that it will be flexible so that it can grow and evolve as new data, processing procedures, and modeling and visualization tools become available. Another goal of this Web-based data system is facilitation of the efforts of researchers and students who wish to collect data from regions currently not represented adequately in the database. The primary goal of upgrading the United States gravity database and this data system is to provide more reliable data that support societal and scientific investigations of national importance. An additional motivation is the international intent to compile an enhanced North American gravity database, which is critical to understanding regional geologic features, the tectonic evolution of the continent, and other issues that cross national boundaries. © 2006 Geological Society of America. All rights reserved.
NASA Astrophysics Data System (ADS)
Grindlay, J.; Tang, S.; Simcoe, R.; Laycock, S.; Los, E.; Mink, D.; Doane, A.; Champine, G.
2009-08-01
It is now possible to study the temporal Universe on previously inaccessible timescales of days to decades, over a full century, with the planned full digitization of the Harvard plate collection. The Digital Access to a Sky Century @ Harvard (DASCH) project has developed the world's highest-speed precision plate scanner and the software required to digitize the ~500,000 glass photographic plates (mostly 20 × 25 cm) that record images of the full sky taken by some 20 telescopes in both hemispheres over the period 1880-1985. These provide ~500-1000 measures of any object brighter than the plate limit (typically B ~14-17), with photometric accuracy from the digital image of typically Δm ~0.10-0.15 mag with the presently developed photometry pipeline and spatially dependent calibration (using the Hubble Guide Star Catalog) for each plate. We provide an overview of DASCH, the processing, and example light curves that illustrate the power of this unique dataset and resource. Production scanning and serving the entire ~1 PB database (both images and derived light curves) online on spinning disk could be completed within ~3-5 years after funding (for scanner operations and database construction) is obtained.
Interhospital network system using the worldwide web and the common gateway interface.
Oka, A; Harima, Y; Nakano, Y; Tanaka, Y; Watanabe, A; Kihara, H; Sawada, S
1999-05-01
We constructed an interhospital network system using the World Wide Web (WWW) and the Common Gateway Interface (CGI). Original clinical images are digitized and stored as a database for educational and research purposes. Personal computers (PCs) are available for data handling and browsing. Our system is simple: digitized images are stored on a Unix server machine. Images of important and interesting clinical cases are selected and registered into the image database using CGI. The main image format is 8- or 12-bit Joint Photographic Experts Group (JPEG). Original clinical images are finally stored on CD-ROM using a CD recorder. The image viewer can browse all of the images for one case at once as thumbnail pictures, and image quality can be selected depending on the user's purpose. Using the network system, clinical images of interesting cases can be rapidly transmitted to and discussed with other related hospitals. Data transmission from affiliated hospitals takes 1 to 2 minutes per 500 kbytes of data; more distant hospitals (e.g., Rakusai Hospital, Kyoto) take about 1 minute more. The mean number of accesses to our image database in a recent 3-month period was 470. There are a total of about 200 cases in our image database, acquired over the past 2 years. Our system is useful for communication and image handling between hospitals; we describe the elements of our system and the image database.
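A registration step of the kind described (a CGI script adding a selected case image to the database) could be sketched as follows; the original scripts' language and layout are not specified, so the paths, form field names, and use of Python's legacy cgi module (available through Python 3.12) are assumptions for illustration.

```python
#!/usr/bin/env python
# Minimal CGI sketch: save an uploaded JPEG under a case id and record it in an index file.
# All paths and field names are assumed; this is not the original system's script.
import cgi, os, json

UPLOAD_DIR = "/var/www/imagedb/cases"                 # assumed storage location
INDEX_FILE = os.path.join(UPLOAD_DIR, "index.json")

form = cgi.FieldStorage()
case_id = form.getfirst("case_id", "unknown")
fileitem = form["image"]                              # <input type="file" name="image">

os.makedirs(os.path.join(UPLOAD_DIR, case_id), exist_ok=True)
dest = os.path.join(UPLOAD_DIR, case_id, os.path.basename(fileitem.filename))
with open(dest, "wb") as out:
    out.write(fileitem.file.read())

index = json.load(open(INDEX_FILE)) if os.path.exists(INDEX_FILE) else {}
index.setdefault(case_id, []).append(dest)
json.dump(index, open(INDEX_FILE, "w"), indent=2)

print("Content-Type: text/html\n")
print(f"<html><body>Registered {dest} for case {case_id}</body></html>")
```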
Digital release of the Alaska Quaternary fault and fold database
NASA Astrophysics Data System (ADS)
Koehler, R. D.; Farrell, R.; Burns, P.; Combellick, R. A.; Weakland, J. R.
2011-12-01
The Alaska Division of Geological & Geophysical Surveys (DGGS) has designed a Quaternary fault and fold database for Alaska in conformance with standards defined by the U.S. Geological Survey for the National Quaternary fault and fold database. Alaska is the most seismically active region of the United States; however, little information exists on the location, style of deformation, and slip rates of Quaternary faults. Thus, to provide an accurate, user-friendly, reference-based fault inventory to the public, we are producing a digital GIS shapefile of Quaternary fault traces and compiling summary information on each fault. Here, we present relevant information pertaining to the digital GIS shapefile and online access and availability of the Alaska database. This database will be useful for engineering geologic studies, geologic, geodetic, and seismic research, and policy planning. The data will also contribute to the fault source database being constructed by the Global Earthquake Model (GEM), Faulted Earth project, which is developing tools to better assess earthquake risk. We derived the initial list of Quaternary active structures from The Neotectonic Map of Alaska (Plafker et al., 1994) and supplemented it with more recent data where available. Due to the limited level of knowledge on Quaternary faults in Alaska, pre-Quaternary fault traces from the Plafker map are shown as a layer in our digital database so that users may view a more accurate distribution of mapped faults and to suggest the possibility that some older traces may be active but as yet unstudied. The database will be updated as new information is developed. We selected each fault by reviewing the literature and georegistered the faults from 1:250,000-scale paper maps contained in 1970's vintage and earlier bedrock maps. However, paper map scales range from 1:20,000 to 1:500,000. Fault parameters in our GIS fault attribute tables include fault name, age, slip rate, slip sense, dip direction, fault line type (i.e., well constrained, moderately constrained, or inferred), and mapped scale. Each fault is assigned a three-integer CODE, based upon age, slip rate, and how well the fault is located. This CODE dictates the line-type for the GIS files. To host the database, we are developing an interactive web-map application with ArcGIS for Server and the ArcGIS API for JavaScript from Environmental Systems Research Institute, Inc. (Esri). The web-map application will present the database through a visible scale range with each fault displayed at the resolution of the original map. Application functionality includes: search by name or location, identification of fault by manual selection, and choice of base map. Base map options include topographic, satellite imagery, and digital elevation maps available from ArcGIS Online. We anticipate that the database will be publicly accessible from a portal embedded on the DGGS website by the end of 2011.
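The three-integer CODE could be computed along the following lines; the actual class boundaries used by DGGS are not given in this summary, so the age and slip-rate bins below are placeholders.

```python
# Hedged sketch of assigning a three-integer CODE (age class, slip-rate class,
# location class) to a fault trace. Bin edges are placeholders, not DGGS values.
def fault_code(age_ka, slip_rate_mm_yr, location_quality):
    """Return (age_class, slip_rate_class, location_class) for a fault trace."""
    age_class = 1 if age_ka <= 15 else 2 if age_ka <= 130 else 3          # placeholder bins
    slip_class = 1 if slip_rate_mm_yr >= 5 else 2 if slip_rate_mm_yr >= 1 else 3
    loc_class = {"well constrained": 1,
                 "moderately constrained": 2,
                 "inferred": 3}[location_quality]
    return (age_class, slip_class, loc_class)

# e.g. a young, fast-slipping, well-located trace would plot with the boldest line type
print(fault_code(10, 6.0, "well constrained"))   # -> (1, 1, 1)
```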
Rhinoplasty perioperative database using a personal digital assistant.
Kotler, Howard S
2004-01-01
To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld operating with 8 megabytes of memory. The handheld and desktop databases were maintained secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database of either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database was able to facilitate data collection and accurately record, transfer, and reliably maintain perioperative rhinoplasty data. Query capability allowed rapid search using a multitude of keyword search terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting the retrospective review of one's own results and enhancement of surgical skills.
NASA Technical Reports Server (NTRS)
Shearrow, Charles A.
1999-01-01
One of the identified goals of EM3 is to implement virtual manufacturing by the end of the year 2000. To realize this goal of a true virtual manufacturing enterprise, the initial development of a machinability database and the supporting infrastructure must be completed. This will consist of containing the existing EM-NET problems and developing machine, tooling, and common materials databases. To integrate the virtual manufacturing enterprise with normal day-to-day operations, a parallel virtual manufacturing machinability database, a virtual manufacturing database, a virtual manufacturing paradigm, an implementation/integration procedure, and testable verification models must be constructed. Common and virtual machinability databases will include the four distinct areas of machine tools, available tooling, common machine tool loads, and materials. The machine tools database will include the machine envelope, special machine attachments, tooling capacity, location within NASA-JSC or with a contractor, and availability/scheduling. The tooling database will include available standard tooling, custom in-house tooling, tool properties, and availability. The common materials database will include material thickness ranges, strengths, types, and their availability. The virtual manufacturing databases will consist of virtual machines and virtual tooling directly related to the common and machinability databases. The items to be completed are the design and construction of the machinability databases, the virtual manufacturing paradigm for NASA-JSC, an implementation timeline, a VNC model of one bridge mill, and troubleshooting of existing software and hardware problems with EN4NET. The final step of this virtual manufacturing project will be to integrate other production sites into the databases, bringing JSC's EM3 into position to become a clearing house for NASA's digital manufacturing needs and creating a true virtual manufacturing enterprise.
Gao, Jun-Xue; Pei, Qiu-Yan; Li, Yun-Tao; Yang, Zhen-Juan
2015-06-01
The aim of this study was to create a database of anatomical ultrathin cross-sectional images of fetal hearts with different congenital heart diseases (CHDs) and preliminarily to investigate its clinical application. Forty Chinese fetal heart samples from induced labor due to different CHDs were cut transversely at 60-μm thickness. All thoracic organs were removed from the thoracic cavity after formalin fixation, embedded in optimum cutting temperature compound, and then frozen at -25°C for 2 hours. Subsequently, macro shots of the frozen serial sections were obtained using a digital camera in order to build a database of anatomical ultrathin cross-sectional images. Images in the database clearly displayed the fetal heart structures. After importing the images into three-dimensional software, the following functions could be realized: (1) based on the original database of transverse sections, databases of sagittal and coronal sections could be constructed; and (2) the original and constructed databases could be displayed continuously and dynamically, and rotated at arbitrary angles. They could also be displayed synchronously. The aforementioned functions of the database allowed for the retrieval of images and three-dimensional anatomy characteristics of the different fetal CHDs, and virtualization of fetal echocardiography findings. A database of 40 different cross-sectional fetal CHDs was established. An extensive database library of fetal CHDs, from which sonographers and students can study the anatomical features of fetal CHDs and virtualize fetal echocardiography findings via either centralized training or distance education, can be established in the future by accumulating further cases. Copyright © 2015. Published by Elsevier B.V.
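Deriving sagittal and coronal section databases from the original transverse stack amounts to reslicing a 3D volume; a minimal sketch, assuming the macro shots have been aligned and stacked into a single array (the shapes are illustrative):

```python
# Sketch of reslicing an aligned stack of transverse sections into sagittal and coronal views.
import numpy as np

# volume built from serial transverse sections: (n_sections, rows, cols)
volume = np.random.rand(300, 512, 512)

transverse = volume[150, :, :]        # one original transverse section
sagittal   = volume[:, :, 256]        # fixed column index across all sections
coronal    = volume[:, 256, :]        # fixed row index across all sections

print(transverse.shape, sagittal.shape, coronal.shape)
```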
Digital Video of Live-Scan Fingerprint Data
National Institute of Standards and Technology Data Gateway
NIST Digital Video of Live-Scan Fingerprint Data (PC database for purchase) NIST Special Database 24 contains MPEG-2 (Moving Picture Experts Group) compressed digital video of live-scan fingerprint data. The database is being distributed for use in developing and testing of fingerprint verification systems.
Black, J A; Waggamon, K A
1992-01-01
An isoelectric focusing method using thin-layer agarose gel has been developed for wheat gliadin. Using flat-bed units with a third electrode, up to 72 samples per gel may be analyzed. Advantages over traditional acid polyacrylamide gel electrophoresis methodology include: faster run times, nontoxic media, and greater sample capacity. The method is suitable for fingerprinting or purity testing of wheat varieties. Using digital images captured by a flat-bed scanner, a 4-band reference system using isoelectric points was devised. Software enables separated bands to be assigned pI values based upon reference tracks. Precision of assigned isoelectric points is shown to be on the order of 0.02 pH units. Captured images may be stored in a computer database and compared to unknown patterns to enable an identification. Parameters for a match with a stored pattern may be adjusted for pI interval required for a match, and number of best matches.
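Matching an unknown pattern against stored patterns within an adjustable pI interval, as described above, might be sketched as follows; the tolerance, pI values, and variety names are made up for illustration.

```python
# Sketch of matching an unknown gliadin band pattern to stored patterns by pI tolerance.
def match_score(unknown, reference, tol=0.05):
    """Fraction of reference bands that have an unknown band within tol pH units."""
    hits = sum(any(abs(u - r) <= tol for u in unknown) for r in reference)
    return hits / len(reference)

database = {
    "Variety A": [5.12, 5.48, 6.03, 6.87],   # stored pI patterns (illustrative values)
    "Variety B": [5.20, 5.77, 6.30, 7.02],
}
unknown = [5.11, 5.50, 6.05, 6.85]

best = sorted(database.items(),
              key=lambda kv: match_score(unknown, kv[1], tol=0.05),
              reverse=True)
print([(name, round(match_score(unknown, bands), 2)) for name, bands in best])
# -> [('Variety A', 1.0), ('Variety B', 0.0)]
```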
Geologic Map of the Tucson and Nogales Quadrangles, Arizona (Scale 1:250,000): A Digital Database
Peterson, J.A.; Berquist, J.R.; Reynolds, S.J.; Page-Nedell, S. S.; Digital database by Oland, Gustav P.; Hirschberg, Douglas M.
2001-01-01
The geologic map of the Tucson-Nogales 1:250,000 scale quadrangle (Peterson and others, 1990) was digitized by U.S. Geological Survey staff and University of Arizona contractors at the Southwest Field Office, Tucson, Arizona, in 2000 for input into a geographic information system (GIS). The database was created for use as a basemap in a decision support system designed by the National Industrial Minerals and Surface Processes project. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included; they may be obtained from a variety of commercial and government sources. Additionally, point features, such as strike and dip, were not captured from the original paper map and are not included in the database. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.
Forecasting of construction and demolition waste in Brazil.
Paz, Diogo Hf; Lafayette, Kalinny Pv
2016-08-01
The objective of this article is to develop a computerised tool (software) that facilitates the analysis of waste management strategies on construction sites through the use of indicators of construction and demolition waste generation. The development involved the following steps: knowledge acquisition, structuring of the system, coding, and system evaluation. The knowledge acquisition step provides the basis for representing that knowledge through models. The structuring step formalises the knowledge for system development and has two stages: construction of the conceptual model and its subsequent instantiation. The coding step implements the conceptual model as a computer (digital) model. The results showed that the system is useful and applicable on construction sites, helping to improve the quality of waste management and creating a database that will support further research. © The Author(s) 2016.
Architectural Heritage Visualization Using Interactive Technologies
NASA Astrophysics Data System (ADS)
Albourae, A. T.; Armenakis, C.; Kyan, M.
2017-08-01
With the increased exposure to tourists, historical monuments are at an ever-growing risk of disappearing. Building Information Modelling (BIM) offers a process of digitally documenting of all the features that are made or incorporated into the building over its life-span, thus affords unique opportunities for information preservation. BIM of historical buildings are called Historical Building Information Models (HBIM). This involves documenting a building in detail throughout its history. Geomatics professionals have the potential to play a major role in this area as they are often the first professionals involved on construction development sites for many Architectural, Engineering, and Construction (AEC) projects. In this work, we discuss how to establish an architectural database of a heritage site, digitally reconstruct, preserve and then interact with it through an immersive environment that leverages BIM for exploring historic buildings. The reconstructed heritage site under investigation was constructed in the early 15th century. In our proposed approach, the site selection was based on many factors such as architectural value, size, and accessibility. The 3D model is extracted from the original collected and integrated data (Image-based, range-based, CAD modelling, and land survey methods), after which the elements of the 3D objects are identified by creating a database using the BIM software platform (Autodesk Revit). The use of modern and widely accessible game engine technology (Unity3D) is explored, allowing the user to fully embed and interact with the scene using handheld devices. The details of implementing an integrated pipeline between HBIM, GIS and augmented and virtual reality (AVR) tools and the findings of the work are presented.
New data sources and derived products for the SRER digital spatial database
Craig Wissler; Deborah Angell
2003-01-01
The Santa Rita Experimental Range (SRER) digital database was developed to automate and preserve ecological data and increase their accessibility. The digital data holdings include a spatial database that is used to integrate ecological data in a known reference system and to support spatial analyses. Recently, the Advanced Resource Technology (ART) facility has added...
Mineral resources management based on GIS and RS: a case study of the Laozhaiwan Gold Mine
NASA Astrophysics Data System (ADS)
Wu, Hao; Hua, Xianghong; Wang, Xinzhou; Ma, Liguang; Yuan, Yanbin
2005-10-01
With the development of digital information technology in the mining industry, the concepts of DM (Digital Mining) and MGIS (Mining Geographical Information System) have become a research focus, but they are not yet mature. How to effectively manage geological, surveying, and mineral-grade datasets is a key issue for sustainable development and standardized management in the mining industry. Based on combined GIS and remote sensing technology, we propose a model named DMMIS (Digital Mining Management Information System), which is composed of a database layer, an ActiveX layer, and a user interface layer. The system is applied in the Laozhaiwan Gold Mine, Yunnan Province, China, to demonstrate the feasibility of the research and development work described in this paper. Finally, some conclusions and constructive suggestions for future research are given.
47 CFR 74.788 - Digital construction period.
Code of Federal Regulations, 2011 CFR
2011-10-01
Title 47 (Telecommunication), Part 74, Low Power TV, TV Translator, and TV Booster Stations, § 74.788 Digital construction period. (a) Each original construction permit for the construction of a new digital low power television or television translator station shall...
Strabo: An App and Database for Structural Geology and Tectonics Data
NASA Astrophysics Data System (ADS)
Newman, J.; Williams, R. T.; Tikoff, B.; Walker, J. D.; Good, J.; Michels, Z. D.; Ash, J.
2016-12-01
Strabo is a data system designed to facilitate digital storage and sharing of structural geology and tectonics data. The data system allows researchers to store and share field and laboratory data as well as construct new multi-disciplinary data sets. Strabo is built on graph database technology, as opposed to a relational database, which provides the flexibility to define relationships between objects of any type. This framework allows observations to be linked in a complex and hierarchical manner that is not possible in traditional database topologies. Thus, the advantage of the Strabo data structure is the ability of graph databases to link objects in both numerous and complex ways, in a manner that more accurately reflects the realities of the collecting and organizing of geological data sets. The data system is accessible via a mobile interface (iOS and Android devices) that allows these data to be stored, visualized, and shared during primary collection in the field or the laboratory. The Strabo Data System is underlain by the concept of a "Spot," which we define as any observation that characterizes a specific area. This can be anything from a strike and dip measurement of bedding to cross-cutting relationships between faults in complex dissected terrains. Each of these Spots can then contain other Spots and/or measurements (e.g., lithology, slickenlines, displacement magnitude). Hence, the Spot concept is applicable to all relationships and observation sets. Strabo is therefore capable of quantifying and digitally storing large spatial variations and complex geometries of naturally deformed rocks within hierarchically related maps and images. These approaches provide an observational fidelity comparable to a traditional field book, but with the added benefits of digital data storage, processing, and ease of sharing. This approach allows Strabo to integrate seamlessly into the workflow of most geologists. Future efforts will focus on extending Strabo to other sub-disciplines as well as developing a desktop system for the enhanced collection and organization of microstructural data.
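The Spot concept (any observation can contain, or be linked to, any other) can be illustrated with a toy graph; this plain-Python sketch is not the actual Strabo schema, and the names, attributes, and relationship labels are assumptions.

```python
# Toy graph sketch of "Spots": nodes with attributes, edges carrying arbitrary relationship types.
class Spot:
    def __init__(self, name, **attrs):
        self.name, self.attrs, self.edges = name, attrs, []   # edges: (relation, other_spot)

    def link(self, relation, other):
        self.edges.append((relation, other))

outcrop = Spot("Outcrop 12", lithology="granite")
bedding = Spot("Bedding 12-1", strike=245, dip=32)
fault_a = Spot("Fault A", slip_sense="normal")
fault_b = Spot("Fault B", slip_sense="normal")

outcrop.link("contains", bedding)
outcrop.link("contains", fault_a)
fault_a.link("cross_cuts", fault_b)       # relationships of any type between any Spots

for rel, other in outcrop.edges:
    print(outcrop.name, rel, other.name)
```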
Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki
2016-01-01
The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful chest X-ray cases and has been used in much state-of-the-art research. However, the pixel values of all the images were simply digitized as relative density values using a scanned-film digitizer. As a result, the pixel values are completely different from the standardized display system input value of Digital Imaging and Communications in Medicine (DICOM), called the presentation value (P-value), which maintains visual consistency when images are observed on displays of different luminance. We therefore converted all the images in the JSRT standard digital image database to DICOM format and then converted the pixel values to P-values using an original program developed by ourselves. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of the images is maintained across displays of different luminance.
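As a rough illustration of the kind of conversion described (not the authors' actual program), the sketch below treats the original pixel values as relative optical density, converts them to luminance for an assumed viewbox, and maps luminance to P-values by interpolating a monotonic calibration table; the placeholder table stands in for one that a real implementation would derive from the DICOM Part 14 Grayscale Standard Display Function.

```python
import numpy as np

def density_to_pvalue(density, lightbox_cd_m2=2000.0,
                      calib_luminance=None, calib_pvalue=None):
    """Map relative optical density to DICOM-style presentation values.

    calib_luminance/calib_pvalue form a monotonically increasing lookup table
    (a crude placeholder here; in practice derived from the DICOM GSDF)."""
    if calib_luminance is None:
        # Placeholder calibration: log-spaced luminances over a typical display range.
        calib_luminance = np.logspace(np.log10(0.05), np.log10(4000.0), 4096)
        calib_pvalue = np.arange(4096)
    # Transmitted luminance of film on a lightbox: L = L0 * 10**(-D)
    luminance = lightbox_cd_m2 * np.power(10.0, -np.asarray(density, dtype=float))
    return np.interp(luminance, calib_luminance, calib_pvalue).astype(np.uint16)

# Example: a tiny "image" of relative densities
densities = np.array([[0.2, 1.0], [2.0, 3.0]])
print(density_to_pvalue(densities))
```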
Storage and retrieval of digital images in dermatology.
Bittorf, A; Krejci-Papa, N C; Diepgen, T L
1995-11-01
Differential diagnosis in dermatology relies on the interpretation of visual information in the form of clinical and histopathological images. Up until now, reference images have had to be retrieved from textbooks and/or appropriate journals. To overcome the inherent limitations of those storage media with respect to the number of images stored, display, and available search parameters, we designed a computer-based database of digitized dermatologic images. Images were taken from the photo archive of the Dermatological Clinic of the University of Erlangen. A database was designed using the Entity-Relationship approach. It was implemented on a PC-Windows platform using MS Access and MS Visual Basic. A Sparc 10 workstation running the CERN Hypertext Transfer Protocol Daemon (httpd) 3.0 pre 6 software was used as the WWW server. For compressed storage on a hard drive, a quality factor of 60 allowed on-screen differential diagnosis and corresponded to a compression factor of 1:35 for clinical images and 1:40 for histopathological images. Hierarchical keys of clinical or histopathological criteria permitted multi-criteria searches. A script using the Common Gateway Interface (CGI) enabled remote search and image retrieval via the World-Wide-Web (W3). A dermatologic image database featuring clinical and histopathological images was constructed which allows multi-parameter searches and world-wide remote access.
Spatial digital database for the tectonic map of Southeast Arizona
map by Drewes, Harald; digital database by Fields, Robert A.; Hirschberg, Douglas M.; Bolm, Karen S.
2002-01-01
A spatial database was created for Drewes' (1980) tectonic map of southeast Arizona: this database supersedes Drewes and others (2001, ver. 1.0). Staff and a contractor at the U.S. Geological Survey in Tucson, Arizona completed an interim digital geologic map database for the east part of the map in 2001, made revisions to the previously released digital data for the west part of the map (Drewes and others, 2001, ver. 1.0), merged data files for the east and west parts, and added additional data not previously captured. Digital base map data files (such as topography, roads, towns, rivers and lakes) are not included: they may be obtained from a variety of commercial and government sources. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Because Drewes' (1980) map sheets include additional text and graphics that were not included in this report, scanned images of his maps (i1109_e.jpg, i1109_w.jpg) are included as a courtesy to the reader. This database should not be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files (i1109_e.pdf and i1109_w.pdf) that are provided herein are representations of the database (see Appendix A). The map area is located in southeastern Arizona (fig. 1). This report describes the map units (from Drewes, 1980), the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Helen Kayser (Information Systems Support, Inc.) is greatly appreciated.
Preliminary geologic map of the Piru 7.5' quadrangle, southern California: a digital database
Yerkes, R.F.; Campbell, Russell H.
1995-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1995). More specific information about the units may be available in the original sources.
Knudsen, Keith L.; Noller, Jay S.; Sowers, Janet M.; Lettis, William R.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There are no paper maps included in the Open-File report. The report does include, however, PostScript plot files containing the images of the geologic map sheets with explanations, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously unpublished data, and new mapping by the authors, represents the general distribution of surficial deposits in the San Francisco bay region. Together with the accompanying text file (sf_geo.txt or sf_geo.pdf), it provides current information on Quaternary geology and liquefaction susceptibility of the San Francisco, California, 1:100,000 quadrangle. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:100,000 or smaller. The content and character of the database, as well as three methods of obtaining the database, are described below.
Digital images in the map revision process
NASA Astrophysics Data System (ADS)
Newby, P. R. T.
Progress towards the adoption of digital (or softcopy) photogrammetric techniques for database and map revision is reviewed. Particular attention is given to the Ordnance Survey of Great Britain, the author's former employer, where digital processes are under investigation but have not yet been introduced for routine production. Developments which may lead to increasing automation of database update processes appear promising, but because of the cost and practical problems associated with managing as well as updating large digital databases, caution is advised when considering the transition to softcopy photogrammetry for revision tasks.
Digital data storage systems, computers, and data verification methods
Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.
2005-12-27
Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
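A minimal sketch of the idea the abstract describes, assuming the "portion" of the dynamic database can be serialized to bytes: hash it at an initial moment in time, hash it again later, and compare; the table and row layout below are hypothetical.

```python
import hashlib
import sqlite3

def hash_table(conn, table):
    """Hash the current contents of one table (the 'portion' of the database)."""
    h = hashlib.sha256()
    for row in conn.execute(f"SELECT * FROM {table} ORDER BY rowid"):
        h.update(repr(row).encode("utf-8"))
    return h.hexdigest()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("INSERT INTO records (payload) VALUES ('alpha')")

first_hash = hash_table(conn, "records")          # initial moment in time
conn.execute("INSERT INTO records (payload) VALUES ('beta')")
second_hash = hash_table(conn, "records")         # subsequent moment in time

print("unchanged" if first_hash == second_hash else "portion has changed")
```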
Publications - MP 141 | Alaska Division of Geological & Geophysical Surveys
DGGS MP 141: Combellick, R.A., 2012, Quaternary faults and folds in Alaska: A digital database, in Koehler, R.D..., Quaternary faults, scale 1:3,700,000; accompanied by digital geospatial data.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Adams, David L.; Trinidad, P. Paul
1997-01-01
NASA Langley Technical Library has been involved in developing systems for full-text information delivery of NACA/NASA technical reports since 1991. This paper describes the two prototypes it has developed and the present production system configuration. The prototype systems are a NACA CD-ROM of thirty-three classic paper NACA reports and a network-based Full-text Electronic Reports Documents System (FEDS) constructed from both paper and electronic formats of NACA and NASA reports. The production system is the DigiDoc System (DIGItal Documents), presently being developed based on the experience gained from the two prototypes. The DigiDoc configuration integrates the on-line catalog database, a World Wide Web interface, and PDF technology to provide a powerful and flexible search and retrieval system. The paper describes in detail significant achievements and lessons learned in terms of data conversion, storage technologies, full-text searching and retrieval, and image databases. Conclusions from the experience of digitization and full-text access and future plans for DigiDoc system implementation are discussed.
A digital future for the history of psychology?
Green, Christopher D
2016-08-01
This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
ERIC Educational Resources Information Center
Painter, Derrick
1996-01-01
Discussion of dictionaries as databases focuses on the digitizing of The Oxford English dictionary (OED) and the use of Standard Generalized Mark-Up Language (SGML). Topics include the creation of a consortium to digitize the OED, document structure, relational databases, text forms, sequence, and discourse. (LRW)
A Digital 3D-Reconstruction of the Younger Dryas Baltic Ice Lake
NASA Astrophysics Data System (ADS)
Jakobsson, M.; Alm, G.; Bjorck, S.; Lindeberg, G.; Svensson, N.
2005-12-01
A digital 3D-reconstruction of the final stage of the ice-dammed Baltic Ice Lake (BIL), dated to the very end of the Younger Dryas cold period (ca. 11,600 cal. yr BP), has been compiled using a combined bathymetric-topographic Digital Terrain Model (DTM), Scandinavian ice sheet limits, Baltic Sea Holocene bottom sediment thickness information, and a paleoshoreline database maintained at Lund University. The combined bathymetric-topographic DTM used to reconstruct the ice-dammed lake was compiled specifically for this study from publicly available data sets. The final DTM is a digital grid on a Lambert Equal Area projection with a resolution of 500 x 500 m, which permits a much more detailed reconstruction of the BIL than previously made. The lake was reconstructed through a series of experiments in which mathematical algorithms were applied to fit the paleolake's surface through the shoreline database. The accumulated Holocene bottom sediments in the Baltic Sea were subsequently subtracted from the present bathymetry in our reconstruction. This allows us to estimate the Baltic Ice Lake's paleobathymetry, area, volume, and hypsometry, which will comprise key input data to lake/climate modeling exercises following this study. The Scandinavian ice sheet margin eventually retreated north of Mount Billingen, which was the high point of the terrain in south-central Sweden, bordering lower terrain farther to the north. As a consequence, the BIL was catastrophically drained through this area, resulting in a 25 m drop of the lake level. With our digital BIL model we estimate that approximately 7,800 km³ of water drained during this event and that the ice-dammed lake area was reduced by ca. 18 percent. The digital BIL reconstruction is analyzed using 3D-visualization techniques that provide new detailed information on the paleogeography of the area, both before and after the lake drainage, with implications for interpretations of geological records concerning the post-glacial environmental development of southern Scandinavia.
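The paleobathymetry, area, and volume computation summarized above can be sketched in a few lines of numpy; the grid values, cell size, and lake level below are invented for illustration and are not taken from the study.

```python
import numpy as np

cell_area_km2 = 0.5 * 0.5                      # 500 m x 500 m grid cells
present_topo = np.array([[-40., -25.,   5.],
                         [-60., -10.,  20.],
                         [-30.,  -5.,  35.]])  # m, negative = below present sea level
holocene_sediment = np.array([[ 8., 5., 0.],
                              [12., 3., 0.],
                              [ 6., 1., 0.]])  # m of post-glacial sediment

# Strip Holocene sediments to approximate the Younger Dryas surface.
paleo_surface = present_topo - holocene_sediment

lake_level = 0.0                               # paleolake surface elevation (illustrative)
depth = np.where(paleo_surface < lake_level, lake_level - paleo_surface, 0.0)

area_km2 = np.count_nonzero(depth) * cell_area_km2
volume_km3 = depth.sum() * cell_area_km2 / 1000.0   # m * km^2 -> km^3
print(f"lake area ≈ {area_km2} km², volume ≈ {volume_km3:.4f} km³")
```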
Huber, Lara
2011-06-01
In the neurosciences, digital databases are increasingly becoming important tools for rendering and distributing data. This development is due to the growing impact of imaging-based trial design in cognitive neuroscience, including morphological as well as functional imaging technologies. As the case of the 'Laboratory of Neuro Imaging' (LONI) shows, databases are attributed a specific epistemological power: since the 1990s, databasing has been seen to foster the integration of neuroscientific data, although local regimes of data production, manipulation, and interpretation are also challenging this development. Databasing in the neurosciences goes along with the introduction of new structures for integrating local data, hence establishing digital spaces of knowledge (epistemic spaces): at this stage, the inherent norms of digital databases are affecting regimes of imaging-based trial design, for example in clinical research into Alzheimer's disease.
47 CFR 74.788 - Digital construction period.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Translator, and TV Booster Stations § 74.788 Digital construction period. (a) Each original construction permit for the construction of a new digital low power television or television translator station shall...
Dickinson, William R.; digital database by Hirschberg, Douglas M.; Pitts, G. Stephen; Bolm, Karen S.
2002-01-01
The geologic map of Catalina Core Complex and San Pedro Trough by Dickinson (1992) was digitized for input into a geographic information system (GIS) by U.S. Geological Survey staff and contractors in 2000-2001. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. The resulting digital geologic map database can be queried in many ways to produce a variety of geologic maps and derivative products. Digital base map data (topography, roads, towns, rivers, lakes, and so forth) are not included; they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:125,000 (for example, 1:100,000 or 1:24,000). The digital geologic map plot files that are provided herein are representations of the database. The map area is located in southern Arizona. This report lists the geologic map units, describes the methods used to convert the geologic map data into a digital format and the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. The manuscript and digital data review by Lorre Moyer (USGS) is greatly appreciated.
Update of the Diatom EST Database: a new tool for digital transcriptomics
Maheswari, Uma; Mock, Thomas; Armbrust, E. Virginia; Bowler, Chris
2009-01-01
The Diatom Expressed Sequence Tag (EST) Database was constructed to provide integrated access to ESTs from these ecologically and evolutionarily interesting microalgae. It has now been updated with 130,000 Phaeodactylum tricornutum ESTs from 16 cDNA libraries and 77,000 Thalassiosira pseudonana ESTs from seven libraries, derived from cells grown in different nutrient and stress regimes. The updated relational database incorporates results from statistical analyses such as log-likelihood ratios and hierarchical clustering, which help to identify differentially expressed genes under different conditions and allow similarities in gene expression between libraries to be investigated in a functional context. The database also incorporates links to the recently sequenced genomes of P. tricornutum and T. pseudonana, enabling easy cross-referencing between the expression patterns of diatom orthologs and the genome browsers. These improvements will facilitate exploration of diatom responses to conditions of ecological relevance and will aid gene function identification of diatom-specific genes and in silico gene prediction in this largely unexplored class of eukaryotes. The updated Diatom EST Database is available at http://www.biologie.ens.fr/diatomics/EST3. PMID:19029140
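One common way to score differential EST representation between two libraries is a log-likelihood ratio (G) statistic; the sketch below shows that general calculation for a single gene with made-up counts, and is not necessarily the exact statistic implemented in the database.

```python
from math import log

def g_statistic(x1, n1, x2, n2):
    """Log-likelihood ratio for a tag counted x1/x2 times in libraries of size n1/n2."""
    g = 0.0
    total_x, total_n = x1 + x2, n1 + n2
    for x, n in ((x1, n1), (x2, n2)):
        expected = total_x * n / total_n
        if x > 0:
            g += 2.0 * x * log(x / expected)
        # the complementary (non-tag) counts in the same library
        rest, rest_expected = n - x, n - expected
        if rest > 0:
            g += 2.0 * rest * log(rest / rest_expected)
    return g

# ESTs for one gene: 42 of 130000 in a stress library vs 7 of 77000 in a control library
print(round(g_statistic(42, 130000, 7, 77000), 2))
```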
The Golosiiv on-line plate archive database, management and maintenance
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Sergeeva, T.
2007-08-01
We intend to create an online version of the database of the MAO NASU plate archive as VO-compatible structures, in accordance with the principles developed by the International Virtual Observatory Alliance, in order to make it available to the world astronomical community. The online version of the log-book database is built with MySQL and PHP. The data management system provides a user interface, supports detailed form-filling radial searches of plates, auxiliary sampling, and the listing of each collection, and permits browsing of detailed collection descriptions. The administrative tool allows the database administrator to correct data, add new data sets, and control the integrity and consistency of the database as a whole. The VO-compatible database is currently being constructed according to the requirements and principles of international data archives; it must be strongly generalized to enable data mining through standard interfaces and to best fit the requirements of the WFPDB Group for plate catalogue databases. Ongoing enhancements of the database toward the WFPDB bring data verification to the forefront, as it demands a high degree of data reliability. Data verification is practically endless and inseparable from data management, owing to the diverse nature of data errors and hence to the variety of approaches needed to identify and fix them. The current status of the MAO NASU glass archive forces activity in both directions simultaneously: enhancement of the log-book database with new sets of observational data, generalized database creation, and cross-identification between them. The VO-compatible version of the database is being supplied with digitized data of plates obtained with a MicroTek ScanMaker 9800 XL TMA. Scanning is not exhaustive but is conducted selectively within the framework of special projects.
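The "radial search of plates" mentioned above is essentially a cone search around a sky position; a minimal sketch follows, assuming plate centres are stored as RA/Dec in degrees (the record list and field names are hypothetical).

```python
from math import radians, degrees, sin, cos, acos

def angular_separation_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation of two sky positions, all angles in degrees."""
    ra1, dec1, ra2, dec2 = map(radians, (ra1, dec1, ra2, dec2))
    cos_sep = sin(dec1) * sin(dec2) + cos(dec1) * cos(dec2) * cos(ra1 - ra2)
    return degrees(acos(min(1.0, max(-1.0, cos_sep))))

def radial_search(plates, ra_deg, dec_deg, radius_deg):
    """Return plates whose centre lies within radius_deg of the requested position."""
    return [p for p in plates
            if angular_separation_deg(p["ra"], p["dec"], ra_deg, dec_deg) <= radius_deg]

plates = [
    {"plate_id": "GUA040-000123", "ra": 83.6, "dec": 22.0},
    {"plate_id": "GUA040-000456", "ra": 201.3, "dec": -11.2},
]
print(radial_search(plates, ra_deg=84.0, dec_deg=21.5, radius_deg=2.0))
```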
Rember, William C.; Bennett, Earl H.
2001-01-01
The paper geologic map of the east part of the Pullman 1° x 2° quadrangle, Idaho (Rember and Bennett, 1979) was scanned and initially attributed by Optronics Specialty Co., Inc. (Northridge, CA) and remitted to the U.S. Geological Survey for further attribution and publication of the geospatial digital files. The resulting digital geologic map GIS can be queried in many ways to produce a variety of geologic maps. This digital geospatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information in a geographic information system (GIS) for use in spatial analysis. Digital base map data files (topography, roads, towns, rivers and lakes, and others) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (for example, 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (pull250k.gra/.hp/.eps) that are provided in the digital package are representations of the digital database.
Unveiling the geography of historical patents in the United States from 1836 to 1975
Petralia, Sergio; Balland, Pierre-Alexandre; Rigby, David L.
2016-01-01
It is clear that technology is a key driver of economic growth. Much less clear is where new technologies are produced and how the geography of U.S. invention has changed over the last two hundred years. Patent data report the geography, history, and technological characteristics of invention. However, those data have only recently become available in digital form, and at present there exists no comprehensive dataset on the geography of knowledge production in the United States prior to 1975. The database presented in this paper unveils the geography of historical patents granted by the United States Patent and Trademark Office (USPTO) from 1836 to 1975. This historical dataset, HistPat, is constructed using digitized records of the original patent documents that are publicly available. We describe a methodological procedure that allows geographical information on patents to be recovered from the digital records. HistPat can be used in different disciplines ranging from geography, economics, and history to network science and science and technology studies. Additionally, it is easily merged with post-1975 USPTO digital patent data to extend it to the present. PMID:27576103
Geology of Point Reyes National Seashore and vicinity, California: a digital database
Clark, Joseph C.; Brabb, Earl E.
1997-01-01
This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, a PostScript plot file containing an image of the geologic map sheet with explanation, as well as the accompanying text describing the geology of the area. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled 'For Those Who Aren't Familiar With Digital Geologic Map Databases' below. This digital map database, compiled from previously published and unpublished data and new mapping by the authors, represents the general distribution of surficial deposits and rock units in Point Reyes and surrounding areas. Together with the accompanying text file (pr-geo.txt or pr-geo.ps), it provides current information on the stratigraphy and structural geology of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:48,000 or smaller.
Geocoding and stereo display of tropical forest multisensor datasets
NASA Technical Reports Server (NTRS)
Welch, R.; Jordan, T. R.; Luvall, J. C.
1990-01-01
Concern about the future of tropical forests has led to a demand for geocoded multisensor databases that can be used to assess forest structure, deforestation, thermal response, evapotranspiration, and other parameters linked to climate change. In response to studies being conducted at the Braulio Carrillo National Park, Costa Rica, digital satellite and aircraft images recorded by Landsat TM, SPOT HRV, Thermal Infrared Multispectral Scanner, and Calibrated Airborne Multispectral Scanner sensors were placed in register using the Landsat TM image as the reference map. Despite problems caused by relief, multitemporal datasets, and geometric distortions in the aircraft images, registration was accomplished to within ±20 m (±1 data pixel). A digital elevation model constructed from a multisensor Landsat TM/SPOT stereopair proved useful for generating perspective views of the rugged, forested terrain.
Cannon, Debra M.; Bellino, Jason C.; Williams, Lester J.
2012-01-01
A digital dataset of hydrogeologic data for Mesozoic through early Tertiary rocks in the Southeastern Coastal Plain was developed using data from five U.S. Geological Survey (USGS) reports published between 1951 and 1996. These reports contain maps and data depicting the extent and elevation of the Southeast Coastal Plain stratigraphic and hydrogeologic units in Florida and parts of Mississippi, Alabama, Georgia, and South Carolina. The reports are: Professional Paper 1410-B (Renken, 1996), Professional Paper 1088 (Brown and others, 1979), Professional Paper 524-G (Applin and Applin, 1967), Professional Paper 447 (Applin and Applin, 1965), and Circular 91 (Applin, 1951). The digital dataset provides hydrogeologic data for the USGS Energy Resources Program assessment of potential reservoirs for carbon sequestration and for the USGS Groundwater Resource Program assessment of saline aquifers in the southeastern United States. A Geographic Information System (ArcGIS 9.3.1) was used to construct 33 digital (raster) surfaces representing the top or base of key stratigraphic and hydrogeologic units. In addition, the Geographic Information System was used to generate 102 geo-referenced scanned maps from the five reports and a geo-database containing structural and thickness contours, faults, extent polygons, and common features. The dataset also includes point data of well construction and stratigraphic elevations and scanned images of two geologic cross sections and a nomenclature chart.
The development of digital library system for drug research information.
Kim, H J; Kim, S R; Yoo, D S; Lee, S H; Suh, O K; Cho, J H; Shin, H T; Yoon, J P
1998-01-01
The sophistication of computer technology and information transmission on the internet has made various cyber information repositories available to information consumers. In the era of the information super-highway, the digital library, which can be accessed from remote sites at any time, is considered the prototype of the information repository. Using an object-oriented DBMS, the first model of a digital library for pharmaceutical researchers and related professionals in Korea has been developed. Published research papers and researchers' personal information were included in the database. For the research-paper database, 13 domestic journals were abstracted and scanned into full-text image files that can be viewed with Internet web browsers. A database of researchers' personal information was also developed and interlinked with the research-paper database. These databases will be continuously updated and will be combined with worldwide information as a unique digital library in the field of pharmacy.
Dupree, Jean A.; Crowfoot, Richard M.
2012-01-01
The drainage basin is a fundamental hydrologic entity used for studies of surface-water resources and during planning of water-related projects. Numeric drainage areas published by the U.S. Geological Survey water science centers in Annual Water Data Reports and on the National Water Information Systems (NWIS) Web site are still primarily derived from hard-copy sources and by manual delineation of polygonal basin areas on paper topographic map sheets. To expedite numeric drainage area determinations, the Colorado Water Science Center developed a digital database structure and a delineation methodology based on the hydrologic unit boundaries in the National Watershed Boundary Dataset. This report describes the digital database architecture and delineation methodology and also presents the results of a comparison of the numeric drainage areas derived using this digital methodology with those derived using traditional, non-digital methods. (Please see report for full Abstract)
ERIC Educational Resources Information Center
Williamson, Ben
2015-01-01
This article examines the emergence of "digital governance" in public education in England. Drawing on and combining concepts from software studies, policy and political studies, it identifies some specific approaches to digital governance facilitated by network-based communications and database-driven information processing software…
Integrating Digital Images into the Art and Art History Curriculum.
ERIC Educational Resources Information Center
Pitt, Sharon P.; Updike, Christina B.; Guthrie, Miriam E.
2002-01-01
Describes an Internet-based image database system connected to a flexible, in-class teaching and learning tool (the Madison Digital Image Database) developed at James Madison University to bring digital images to the arts and humanities classroom. Discusses content, copyright issues, ensuring system effectiveness, instructional impact, sharing the…
ERIC Educational Resources Information Center
Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.
1999-01-01
Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…
Construction and application research of Three-dimensional digital power grid in Southwest China
NASA Astrophysics Data System (ADS)
Zhou, Yang; Zhou, Hong; You, Chuan; Jiang, Li; Xin, Weidong
2018-01-01
With the rapid development of three-dimensional (3D) digital design technology in the field of power grid construction, the data foundation and technical means for building a 3D digital power grid are approaching maturity. The 3D digital power grid has gradually become an important part of power grid construction and management. In view of the complicated geological conditions in Southwest China and the difficulty of power grid construction and management there, this paper builds on the data assets of the Southwest power grid and aims at establishing a 3D digital power grid in Southwest China to provide effective support for power grid construction and operation management. The paper discusses the data architecture, technical architecture, and system design and implementation of 3D digital power grid construction by examining its key technologies. Applications of the 3D digital power grid to power grid data asset management, transmission line corridor planning, geological hazard risk assessment, and environmental impact assessment are also discussed and analysed.
Spatial Digital Database for the Geologic Map of Oregon
Walker, George W.; MacLeod, Norman S.; Miller, Robert J.; Raines, Gary L.; Connors, Katherine A.
2003-01-01
Introduction This report describes and makes available a geologic digital spatial database (orgeo) representing the geologic map of Oregon (Walker and MacLeod, 1991). The original paper publication was printed as a single map sheet at a scale of 1:500,000, accompanied by a second sheet containing map unit descriptions and ancillary data. A digital version of the Walker and MacLeod (1991) map was included in Raines and others (1996). The dataset provided by this open-file report supersedes the earlier published digital version (Raines and others, 1996). This digital spatial database is one of many being created by the U.S. Geological Survey as an ongoing effort to provide geologic information for use in spatial analysis in a geographic information system (GIS). This database can be queried in many ways to produce a variety of geologic maps. This database is not meant to be used or displayed at any scale larger than 1:500,000 (for example, 1:100,000). This report describes the methods used to convert the geologic map data into a digital format, describes the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. Scanned images of the printed map (Walker and MacLeod, 1991), their correlation of map units, and their explanation of map symbols are also available for download.
Digital food photography: Dietary surveillance and beyond
USDA-ARS?s Scientific Manuscript database
The method used for creating a database of approximately 20,000 digital images of multiple portion sizes of foods linked to the USDA's Food and Nutrient Database for Dietary Studies (FNDDS) is presented. The creation of this database began in 2002, and its development has spanned 10 years. Initially...
Systems and methods for automatically identifying and linking names in digital resources
Parker, Charles T.; Lyons, Catherine M.; Roston, Gerald P.; Garrity, George M.
2017-06-06
The present invention provides systems and methods for automatically identifying name-like strings in digital resources, matching these name-like strings against a set of names held in an expertly curated database, and, for those name-like strings found in said database, enhancing the content by associating additional matter with the name, wherein said matter includes information about the names that is held within said database and pointers to other digital resources which include the same name and its synonyms.
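A minimal sketch of the idea, assuming the curated names are organism binomials held in a dictionary mapping each name (and its synonyms) to an identifier; the regular expression, names, and link format are illustrative only and are not taken from the patent.

```python
import re

# Expertly curated names and synonyms -> a stable identifier (illustrative data).
CURATED = {
    "Escherichia coli": "names-db:1234",
    "Bacillus coli": "names-db:1234",            # historical synonym of E. coli
    "Thalassiosira pseudonana": "names-db:5678",
}

NAME_LIKE = re.compile(r"\b[A-Z][a-z]+ [a-z]{3,}\b")  # crude genus-species pattern

def link_names(text):
    """Find name-like strings and, when they match the curated set, attach links."""
    def replace(match):
        name = match.group(0)
        ident = CURATED.get(name)
        return f"[{name}]({ident})" if ident else name
    return NAME_LIKE.sub(replace, text)

print(link_names("Growth of Escherichia coli differed from Thalassiosira pseudonana."))
```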
Automated Bulk Uploading of Images and Metadata to Flickr
ERIC Educational Resources Information Center
Michel, Jason Paul; Tzoc, Elias
2010-01-01
The Digital Initiatives department at Miami University, like most digital initiatives and special collections departments, has a large number of rich digital image collections, stored primarily in a third-party database. Typically, these databases are not findable to the average Web user. From a desire to expose these collections to the wider Web…
Digital database of the geologic map of the island of Hawai'i [Hawaii
Trusdell, Frank A.; Wolfe, Edward W.; Morris, Jean
2006-01-01
This online publication (DS 144) provides the digital database for the printed map by Edward W. Wolfe and Jean Morris (I-2524-A; 1996). This digital database contains all the information used to publish U.S. Geological Survey Geologic Investigations Series I-2524-A (available only in paper form; see http://pubs.er.usgs.gov/pubs/i/i2524A). The database contains the distribution and relationships of volcanic and surficial-sedimentary deposits on the island of Hawai‘i. This dataset represents the geologic history for the five volcanoes that comprise the Island of Hawai‘i. The volcanoes are Kohala, Mauna Kea, Hualalai, Mauna Loa, and Kīlauea. This database of the geologic map contributes to understanding the geologic history of the Island of Hawai‘i and provides the basis for understanding long-term volcanic processes in an intra-plate ocean island volcanic system. In addition, the database also serves as a basis for producing volcanic hazards assessments for the island of Hawai‘i. Furthermore, it serves as a base layer to be used for interdisciplinary research. This online publication consists of a digital database of the geologic map, an explanatory pamphlet, a description of map units, a correlation of map units diagram, and images for plotting. Geologic mapping was compiled at a scale of 1:100,000 for the entire mapping area. The geologic mapping was compiled as a digital geologic database in ArcInfo GIS format.
Baune, Bernhard T; Brignone, Mélanie; Larsen, Klaus Groes
2018-02-01
Major depressive disorder is a common condition that often includes cognitive dysfunction. A systematic literature review of studies and a network meta-analysis were carried out to assess the relative effect of antidepressants on cognitive dysfunction in major depressive disorder. MEDLINE, Embase, Cochrane, CDSR, and PsychINFO databases; clinical trial registries; and relevant conference abstracts were searched for randomized controlled trials assessing the effects of antidepressants/placebo on cognition. A network meta-analysis comparing antidepressants was conducted using a random effects model. The database search retrieved 11,337 citations, of which 72 randomized controlled trials from 103 publications met the inclusion criteria. The review identified 86 cognitive tests assessing the effect of antidepressants on cognitive functioning. However, the Digit Symbol Substitution Test, which targets multiple domains of cognition and is recognized as being sensitive to change, was the only test that was used across 12 of the included randomized controlled trials and that allowed the construction of a stable network suitable for the network meta-analysis. The interventions assessed included selective serotonin reuptake inhibitors, serotonin-norepinephrine reuptake inhibitors, and other non-selective serotonin reuptake inhibitors/serotonin-norepinephrine reuptake inhibitors. The network meta-analysis using the Digit Symbol Substitution Test showed that vortioxetine was the only antidepressant that improved cognitive dysfunction on the Digit Symbol Substitution Test vs placebo (standardized mean difference: 0.325; 95% CI: 0.120, 0.529; P = .009). Compared with other antidepressants, vortioxetine was statistically more efficacious on the Digit Symbol Substitution Test vs escitalopram, nortriptyline, and the selective serotonin reuptake inhibitor and tricyclic antidepressant classes. This study highlighted the large variability in measures used to assess cognitive functioning. The findings on the Digit Symbol Substitution Test indicate differential effects of various antidepressants on improving cognitive function in patients with major depressive disorder. © The Author 2017. Published by Oxford University Press on behalf of CINP.
[Design and development of an online system of parasite's images for training and evaluation].
Yuan-Chun, Mao; Sui, Xu; Jie, Wang; Hua-Yun, Zhou; Jun, Cao
2017-08-08
To design and develop an online training and evaluation system for parasitic pathogen recognition. The system was based on the Parasitic Diseases Specimen Image Digitization Construction Database, using MySQL 5.0 as the database development software and PHP 5 as the interface development language. It was mainly used for online training and evaluation of parasitic pathology diagnostic techniques. The system interface was designed to be simple, flexible, and easy for medical staff to operate. It enabled 24-hour access to online training and evaluation, thereby breaking the time and space constraints of traditional training models. The system provides a shared platform for professional training on parasitic diseases and a reference for other training tasks.
NASA Astrophysics Data System (ADS)
Esponda, M.; Piraino, F.; Stanga, C.; Mezzino, D.
2017-08-01
This paper presents an integrated approach between digital documentation workflows and historical research in order to document log houses, outstanding examples of vernacular architecture in Quebec, focusing on their geometrical-dimensional characteristics as well as on the intangible elements associated with these historical structures. The 18 log houses selected in the Laurentians represent the material culture of how settlers adapted to the harsh Quebec environment at the end of the nineteenth century. The essay describes results obtained by Professor Mariana Esponda (Carleton University) in 2015; the digital documentation was carried out through the grant New Paradigm/New Tools for Architectural Heritage in Canada, supported by an SSHRC Training Program (May-August 2016). The workflow of the research started with the digital documentation, accomplished with laser scanning techniques, followed by onsite observations and archival research. This led to the creation of an 'abacus', a first step in the development of a territorial-historical database of the log houses, potentially updatable by other researchers. Another important part of the documentation of these buildings has been the development of Historic Building Information Models, fundamental for analyzing the geometry of the logs and understanding how these constructions were built. The realization of HBIMs was a first step in the modeling of irregular shapes such as those of the logs; different Levels of Detail were adopted in order to show how the models can be used for different purposes. In the future, they can potentially be used for the creation of a virtual tour app for the storytelling of these buildings.
Building information models for astronomy projects
NASA Astrophysics Data System (ADS)
Ariño, Javier; Murga, Gaizka; Campo, Ramón; Eletxigerra, Iñigo; Ampuero, Pedro
2012-09-01
A Building Information Model (BIM) is a digital representation of the physical and functional characteristics of a building. BIMs represent the geometrical characteristics of the building, but also properties such as bills of quantities, definitions of COTS components, the status of material at different stages of the project, project economic data, etc. The BIM methodology, which is well established in the Architecture, Engineering and Construction (AEC) domain for conventional buildings, has been brought one step forward in its application to astronomical/scientific facilities. In these facilities, steel/concrete structures have demanding dynamic and seismic requirements, M&E installations are complex, and a large amount of special equipment and mechanisms is involved as a fundamental part of the facility. The detailed design definition is typically implemented by different design teams in specialized design software packages. To allow the coordinated work of different engineering teams, the overall model, and its associated engineering database, is progressively integrated using coordination and roaming software that can be used before the construction phase starts for checking interferences, planning the construction sequence, studying maintenance operations, reporting to the project office, etc. This integrated design and construction approach allows the construction sequence to be planned efficiently (4D) and is a powerful tool for studying and analyzing alternative construction sequences in detail and for coordinating the work of different construction teams. In addition, the engineering, construction, and operational databases can be linked to the virtual model (6D), which gives end users an invaluable tool for lifecycle management, as all the facility information can be easily accessed, added, or replaced. This paper presents the BIM methodology as implemented by IDOM, with the E-ELT and ATST enclosures as application examples.
Some thoughts on cartographic and geographic information systems for the 1980's
Starr, L.E.; Anderson, Kirk E.
1981-01-01
The U.S. Geological Survey is adopting computer techniques to meet the expanding need for cartographic base category data. Digital methods are becoming increasingly important in the mapmaking process, and the demand is growing for physical, social, and economic data. Recognizing these emerging needs, the National Mapping Division began, several years ago, an active program to develop advanced digital methods to support cartographic and geographic data processing. An integrated digital cartographic database would meet the anticipated needs. Such a database would contain data from various sources, and could provide a variety of standard and customized map and digital data file products. This cartographic database soon will be technologically feasible. The present trends in the economics of cartographic and geographic data handling and the growing needs for integrated physical, social, and economic data make such a database virtually mandatory.
[Review of digital ground object spectral library].
Zhou, Xiao-Hu; Zhou, Ding-Wu
2009-06-01
Higher spectral resolution is the main direction of development in remote sensing technology, and establishing digital ground object reflectance spectral libraries is an important fundamental research field for remote sensing applications. Remote sensing applications increasingly rely on ground object spectral characteristics, and quantitative analysis has advanced to a new stage. The present article summarizes and systematically introduces the research status and development trends of digital ground object reflectance spectral libraries in China and abroad in recent years. The spectral libraries that have been established include libraries for desertification, plants, geology, soils, minerals, clouds, snow, the atmosphere, rocks, water, meteorites, moon rocks, man-made materials, mixtures, volatile compounds, and liquids. In the process of establishing these spectral libraries, there have been problems such as the lack of a uniform national spectral database standard, the lack of uniform standards for ground object features, limited comparability between different databases, and the absence of a working data-sharing mechanism. This article also puts forward some suggestions concerning those problems.
Construction of a century solar chromosphere data set for solar activity related research
NASA Astrophysics Data System (ADS)
Lin, Ganghua; Wang, Xiao Fan; Yang, Xiao; Liu, Suo; Zhang, Mei; Wang, Haimin; Liu, Chang; Xu, Yan; Tlatov, Andrey; Demidov, Mihail; Borovik, Aleksandr; Golovko, Aleksey
2017-06-01
This article introduces our ongoing project "Construction of a Century Solar Chromosphere Data Set for Solar Activity Related Research". Solar activities are the major sources of space weather that affects human lives. Serious space weather consequences include, for instance, interruption of space communication and navigation, compromised safety of astronauts and satellites, and damage to power grids. Therefore, solar activity research has both scientific and social impacts. The major database is built up from digitized and standardized film data obtained by several observatories around the world and covers a time span of more than 100 years. After careful calibration, we will develop feature extraction and data mining tools and provide them, together with the comprehensive database, to the astronomical community. Our final goal is to address several physical issues: filament behavior in solar cycles, abnormal behavior of solar cycle 24, large-scale solar eruptions, and sympathetic remote brightenings. Significant progress is expected in data mining algorithms and software development, which will benefit the scientific analysis and eventually advance our understanding of solar cycles.
From experimental imaging techniques to virtual embryology.
Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis
2004-01-01
Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for highly detailed in vivo imaging, as well as for the generation of high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, have recently been developed or are under development. These methods profoundly change the scientific applicability, appearance, and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise, and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.
DIGITAL CARTOGRAPHY OF THE PLANETS: NEW METHODS, ITS STATUS, AND ITS FUTURE.
Batson, R.M.
1987-01-01
A system has been developed that establishes a standardized cartographic database for each of the 19 planets and major satellites that have been explored to date. Compilation of the databases involves both traditional and newly developed digital image processing and mosaicking techniques, including radiometric and geometric corrections of the images. Each database, or digital image model (DIM), is a digital mosaic of spacecraft images that have been radiometrically and geometrically corrected and photometrically modeled. During compilation, ancillary data files such as radiometric calibrations and refined photometric values for all camera lens and filter combinations and refined camera-orientation matrices for all images used in the mapping are produced.
The Design and Product of National 1:1000000 Cartographic Data of Topographic Map
NASA Astrophysics Data System (ADS)
Wang, Guizhi
2016-06-01
The National Administration of Surveying, Mapping and Geoinformation launched the project of dynamic updating of the national fundamental geographic information database in 2012. Within this project, the 1:50,000 database is updated once a year, and the 1:250,000 database is generalized and linkage-updated on that basis. In 2014, using the latest 1:250,000 database results, the 1:1,000,000 digital line graph database was comprehensively updated. At the same time, topographic map cartographic data and digital elevation model data were generated. This article mainly introduces the national 1:1,000,000 topographic map cartographic data, including feature content, database structure, database-driven mapping technology, workflow, and so on.
Yerkes, R.F.; Campbell, Russell H.
1995-01-01
This database, identified as "Preliminary Geologic Map of the Oat Mountain 7.5' Quadrangle, southern California: A Digital Database," has been approved for release and publication by the Director of the USGS. Although this database has been reviewed and is substantially complete, the USGS reserves the right to revise the data pursuant to further analysis and review. This database is released on condition that neither the USGS nor the U. S. Government may be held liable for any damages resulting from its use. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U. S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations and sources of geologic mapping consult Yerkes and Campbell (1993). More specific information about the units may be available in the original sources.
Access to digital library databases in higher education: design problems and infrastructural gaps.
Oswal, Sushil K
2014-01-01
After defining accessibility and usability, the author offers a broad survey of the research studies on digital content databases which have thus far primarily depended on data drawn from studies conducted by sighted researchers with non-disabled users employing screen readers and low vision devices. This article aims at producing a detailed description of the difficulties confronted by blind screen reader users with online library databases which now hold most of the academic, peer-reviewed journal and periodical content essential for research and teaching in higher education. The approach taken here is borrowed from descriptive ethnography which allows the author to create a complete picture of the accessibility and usability problems faced by an experienced academic user of digital library databases and screen readers. The author provides a detailed analysis of the different aspects of accessibility issues in digital databases under several headers with a special focus on full-text PDF files. The author emphasizes that long-term studies with actual, blind screen reader users employing both qualitative and computerized research tools can yield meaningful data for the designers and developers to improve these databases to a level that they begin to provide an equal access to the blind.
Mulrane, Laoighse; Rexhepaj, Elton; Smart, Valerie; Callanan, John J; Orhan, Diclehan; Eldem, Türkan; Mally, Angela; Schroeder, Susanne; Meyer, Kirstin; Wendt, Maria; O'Shea, Donal; Gallagher, William M
2008-08-01
The widespread use of digital slides has only recently come to the fore with the development of high-throughput scanners and high performance viewing software. This development, along with the optimisation of compression standards and image transfer techniques, has allowed the technology to be used in wide reaching applications including integration of images into hospital information systems and histopathological training, as well as the development of automated image analysis algorithms for prediction of histological aberrations and quantification of immunohistochemical stains. Here, the use of this technology in the creation of a comprehensive library of images of preclinical toxicological relevance is demonstrated. The images, acquired using the Aperio ScanScope CS and XT slide acquisition systems, form part of the ongoing EU FP6 Integrated Project, Innovative Medicines for Europe (InnoMed). In more detail, PredTox (abbreviation for Predictive Toxicology) is a subproject of InnoMed and comprises a consortium of 15 industrial (13 large pharma, 1 technology provider and 1 SME) and three academic partners. The primary aim of this consortium is to assess the value of combining data generated from 'omics technologies (proteomics, transcriptomics, metabolomics) with the results from more conventional toxicology methods, to facilitate further informed decision making in preclinical safety evaluation. A library of 1709 scanned images was created of full-face sections of liver and kidney tissue specimens from male Wistar rats treated with 16 proprietary and reference compounds of known toxicity; additional biological materials from these treated animals were separately used to create 'omics data, that will ultimately be used to populate an integrated toxicological database. In respect to assessment of the digital slides, a web-enabled digital slide management system, Digital SlideServer (DSS), was employed to enable integration of the digital slide content into the 'omics database and to facilitate remote viewing by pathologists connected with the project. DSS also facilitated manual annotation of digital slides by the pathologists, specifically in relation to marking particular lesions of interest. Tissue microarrays (TMAs) were constructed from the specimens for the purpose of creating a repository of tissue from animals used in the study with a view to later-stage biomarker assessment. As the PredTox consortium itself aims to identify new biomarkers of toxicity, these TMAs will be a valuable means of validation. In summary, a large repository of histological images was created enabling the subsequent pathological analysis of samples through remote viewing and, along with the utilisation of TMA technology, will allow the validation of biomarkers identified by the PredTox consortium. The population of the PredTox database with these digitised images represents the creation of the first toxicological database integrating 'omics and preclinical data with histological images.
Concierge: Personal Database Software for Managing Digital Research Resources
Sakai, Hiroyuki; Aoyama, Toshihiro; Yamaji, Kazutsuna; Usui, Shiro
2007-01-01
This article introduces a desktop application, named Concierge, for managing personal digital research resources. Using simple operations, it enables storage of various types of files and indexes them based on content descriptions. A key feature of the software is a high level of extensibility. By installing optional plug-ins, users can customize and extend the usability of the software based on their needs. In this paper, we also introduce a few optional plug-ins: literature management, electronic laboratory notebook, and XooNIps client plug-ins. XooNIps is a content management system developed to share digital research resources among neuroscience communities. It has been adopted as the standard database system in Japanese neuroinformatics projects. Concierge, therefore, offers comprehensive support from management of personal digital research resources to their sharing in open-access neuroinformatics databases such as XooNIps. This interaction between personal and open-access neuroinformatics databases is expected to enhance the dissemination of digital research resources. Concierge is developed as an open-source project; Mac OS X and Windows XP versions have been released at the official site (http://concierge.sourceforge.jp). PMID:18974800
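The abstract describes the plug-in extensibility only at a high level; purely as an illustration, the following minimal Python sketch shows one way a resource manager could register optional plug-ins. The names (Plugin, LiteratureManagerPlugin, ResourceManager, register_plugin) are hypothetical and are not taken from the Concierge code base.

```python
# Minimal sketch of a plug-in registry, assuming a hypothetical API;
# Concierge's actual implementation is not described in the abstract.

class Plugin:
    """Base class every optional plug-in derives from."""
    name = "base"

    def handle(self, resource: dict) -> dict:
        raise NotImplementedError


class LiteratureManagerPlugin(Plugin):
    name = "literature"

    def handle(self, resource: dict) -> dict:
        # Index a reference entry by a simple content description.
        resource["index"] = resource.get("title", "").lower().split()
        return resource


class ResourceManager:
    """Stores resources and dispatches them to installed plug-ins."""

    def __init__(self):
        self._plugins = {}
        self._store = []

    def register_plugin(self, plugin: Plugin) -> None:
        self._plugins[plugin.name] = plugin

    def add_resource(self, resource: dict, plugin_name: str) -> None:
        self._store.append(self._plugins[plugin_name].handle(resource))


if __name__ == "__main__":
    manager = ResourceManager()
    manager.register_plugin(LiteratureManagerPlugin())
    manager.add_resource({"title": "Digital Research Resources"}, "literature")
    print(manager._store)
```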
The comparative effectiveness of conventional and digital image libraries.
McColl, R I; Johnson, A
2001-03-01
Before introducing a hospital-wide image database to improve access, navigation and retrieval speed, a comparative study between a conventional slide library and a matching image database was undertaken to assess its relative benefits. Paired time trials and personal questionnaires revealed faster retrieval rates, higher image quality, and easier viewing for the pilot digital image database. Analysis of confidentiality, copyright and data protection exposed similar issues for both systems, thus concluding that the digital image database is a more effective library system. The authors suggest that in the future, medical images will be stored on large, professionally administered, centrally located file servers, allowing specialist image libraries to be tailored locally for individual users. The further integration of the database with web technology will enable cheap and efficient remote access for a wide range of users.
Development of Elevation and Relief Databases for ICESat-2/ATLAS Receiver Algorithms
NASA Astrophysics Data System (ADS)
Leigh, H. W.; Magruder, L. A.; Carabajal, C. C.; Saba, J. L.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.
2013-12-01
The Advanced Topographic Laser Altimeter System (ATLAS) is planned to launch onboard NASA's ICESat-2 spacecraft in 2016. ATLAS operates at a wavelength of 532 nm with a laser repeat rate of 10 kHz and 6 individual laser footprints. The satellite will be in a 500 km, 91-day repeat ground track orbit at an inclination of 92°. A set of onboard Receiver Algorithms has been developed to reduce the data volume and data rate to acceptable levels while still transmitting the relevant ranging data. The onboard algorithms limit the data volume by distinguishing between surface returns and background noise and selecting a small vertical region around the surface return to be included in telemetry. The algorithms make use of signal processing techniques, along with three databases, the Digital Elevation Model (DEM), the Digital Relief Map (DRM), and the Surface Reference Mask (SRM), to find the signal and determine the appropriate dynamic range of vertical data surrounding the surface for downlink. The DEM provides software-based range gating for ATLAS. This approach allows the algorithm to limit the surface signal search to the vertical region between minimum and maximum elevations provided by the DEM (plus some margin to account for uncertainties). The DEM is constructed in a nested, three-tiered grid to account for a hardware constraint limiting the maximum vertical range to 6 km. The DRM is used to select the vertical width of the telemetry band around the surface return. The DRM contains global values of relief calculated along 140 m and 700 m ground track segments consistent with a 92° orbit. The DRM must contain the maximum value of relief seen in any given area, but must be as close to truth as possible as the DRM directly affects data volume. The SRM, which has been developed independently from the DEM and DRM, is used to set parameters within the algorithm and select telemetry bands for downlink. Both the DEM and DRM are constructed from publicly available digital elevation models. No elevation models currently exist that provide global coverage at a sufficient resolution, so several regional models have been mosaicked together to produce global databases. In locations where multiple data sets are available, evaluations have been made to determine the optimal source for the databases, primarily based on resolution and accuracy. Separate procedures for calculating relief were developed for high latitude (>60N/S) regions in order to take advantage of polar stereographic projections. An additional method for generating the databases was developed for use over Antarctica, such that high resolution, regional elevation models can be easily incorporated as they become available in the future. The SRM is used to facilitate DEM and DRM production by defining those regions that are ocean and sea ice. Ocean and sea ice elevation values are defined by the geoid, while relief is set to a constant value. Results presented will include the details of data source selection, the methodologies used to create the databases, and the final versions of both the DEM and DRM databases. Companion presentations by McGarry, et al. and Carabajal, et al. describe the ATLAS onboard Receiver Algorithms and the database verification, respectively.
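As a rough illustration of the range-gating idea described above, the sketch below derives a vertical search window from DEM minimum and maximum elevations plus a margin, and sizes the telemetry band from a DRM relief value. The function names, margin, and sample values are assumptions made for illustration; this is not the ATLAS flight software.

```python
# Illustrative sketch of DEM-based range gating and DRM-based band sizing,
# using hypothetical tile values; not the actual ATLAS onboard algorithm.

def vertical_search_window(dem_min_m, dem_max_m, margin_m=100.0):
    """Limit the surface-signal search to DEM min/max plus a margin."""
    return dem_min_m - margin_m, dem_max_m + margin_m

def telemetry_band_width(drm_relief_m, padding_m=20.0):
    """Size the downlinked band around the surface from local relief."""
    return drm_relief_m + 2.0 * padding_m

if __name__ == "__main__":
    lo, hi = vertical_search_window(dem_min_m=350.0, dem_max_m=1210.0)
    width = telemetry_band_width(drm_relief_m=85.0)
    print(f"search window: {lo:.0f}-{hi:.0f} m, telemetry band: {width:.0f} m")
```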
Design and application of BIM based digital sand table for construction management
NASA Astrophysics Data System (ADS)
Fuquan, JI; Jianqiang, LI; Weijia, LIU
2018-05-01
This paper explores the design and application of a BIM-based digital sand table for construction management. Considering the demands and features of construction management planning for bridge and tunnel engineering, the key functional features of the digital sand table should include three-dimensional GIS, model navigation, virtual simulation, information layers, and data exchange. These functions draw on BIM technologies for 3D visualization and 4D virtual simulation, breakdown structures for the BIM model and project data, multi-dimensional information layers, and multi-source data acquisition and interaction. Overall, the digital sand table is a visual, virtual terminal that integrates engineering information under a unified data standard system. Its applications include visualization of the construction scheme, virtual simulation of the construction schedule, and construction monitoring. Finally, the applicability of several basic software packages to the digital sand table is analyzed.
Construction of a Digital Learning Environment Based on Cloud Computing
ERIC Educational Resources Information Center
Ding, Jihong; Xiong, Caiping; Liu, Huazhong
2015-01-01
Constructing digital learning environments for ubiquitous learning and asynchronous distributed learning has prompted an immense amount of concrete research. However, current digital learning environments do not fully meet expectations for supporting interactive group learning, shared understanding, and the social construction of knowledge.…
Nicholson, Suzanne W.; Dicken, Connie L.; Horton, John D.; Foose, Michael P.; Mueller, Julia A.L.; Hon, Rudi
2006-01-01
The rapid growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national scale digital geologic maps that have standardized information about geologic age and lithology. Such maps can be conveniently used to generate derivative maps for manifold special purposes such as mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. Although two digital geologic maps (Schruben and others, 1994; Reed and Bush, 2004) of the United States currently exist, their scales (1:2,500,000 and 1:5,000,000) are too general for many regional applications. Most states have digital geologic maps at scales of about 1:500,000, but the databases are not comparably structured and, thus, it is difficult to use the digital database for more than one state at a time. This report describes the result for a seven state region of an effort by the U.S. Geological Survey to produce a series of integrated and standardized state geologic map databases that cover the entire United States. In 1997, the United States Geological Survey's Mineral Resources Program initiated the National Surveys and Analysis (NSA) Project to develop national digital databases. One primary activity of this project was to compile a national digital geologic map database, utilizing state geologic maps, to support studies in the range of 1:250,000- to 1:1,000,000-scale. To accomplish this, state databases were prepared using a common standard for the database structure, fields, attribution, and data dictionaries. For Alaska and Hawaii new state maps are being prepared and the preliminary work for Alaska is being released as a series of 1:250,000 scale quadrangle reports. This document provides background information and documentation for the integrated geologic map databases of this report. This report is one of a series of such reports releasing preliminary standardized geologic map databases for the United States. The data products of the project consist of two main parts, the spatial databases and a set of supplemental tables relating to geologic map units. The datasets serve as a data resource to generate a variety of stratigraphic, age, and lithologic maps. This documentation is divided into four main sections: (1) description of the set of data files provided in this report, (2) specifications of the spatial databases, (3) specifications of the supplemental tables, and (4) an appendix containing the data dictionaries used to populate some fields of the spatial database and supplemental tables.
New digital magnetic anomaly database for North America
Finn, C.A.; Pilkington, M.; Cuevas, A.; Hernandez, I.; Urrutia, J.
2001-01-01
The Geological Survey of Canada (GSC), U.S. Geological Survey (USGS), and Consejo de Recursos Minerales of Mexico (CRM) are compiling an upgraded digital magnetic anomaly database and map for North America. This trinational project is expected to be completed by late 2002.
Digital Equipment Corporation's CDROM Software and Database Publications.
ERIC Educational Resources Information Center
Adams, Michael Q.
1986-01-01
Acquaints information professionals with Digital Equipment Corporation's compact optical disk read-only-memory (CDROM) search and retrieval software and growing library of CDROM database publications (COMPENDEX, Chemical Abstracts Services). Highlights include MicroBASIS, boolean operators, range operators, word and phrase searching, proximity…
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
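The abstract notes that each pipeline is wrapped in scripts covering preparation, submission, checking, and quality control. The sketch below shows that chaining pattern in schematic Python; the phase names follow the abstract, but run_phase and its body are placeholders rather than the actual SDSS factory scripts.

```python
# Schematic sketch of chaining pipeline phases in wrapper scripts; the
# phase names follow the abstract, but run_phase is a stand-in for the
# real pipeline executables.

PHASES = ["preparation", "submission", "checking", "quality_control"]

def run_phase(pipeline: str, phase: str) -> bool:
    """Stand-in for invoking one phase of a pipeline executable."""
    print(f"running {pipeline} phase: {phase}")
    return True  # a real wrapper would inspect the pipeline's exit status

def run_pipeline(pipeline: str) -> None:
    """Run all phases in order, stopping the chain on the first failure."""
    for phase in PHASES:
        if not run_phase(pipeline, phase):
            raise RuntimeError(f"{pipeline} failed during {phase}")

if __name__ == "__main__":
    for pipeline in ["imaging", "spectro"]:
        run_pipeline(pipeline)
```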
Prioritising sewerage maintenance using inferred sewer age: a case study for Edinburgh.
Arthur, S; Burkhard, R
2010-01-01
The reported research project focuses on using a database containing customer contact details and CCTV data for a key Scottish catchment to construct a GIS-based sewer condition model. Given the nature of the asset registry, a key research challenge was estimating the age of individual lengths of pipe. Within this context, asset age was inferred from the estimated age of the surface developments above the pipes; this involved overlaying the network with historical digital maps in a GIS. The paper illustrates that inferred asset age can reliably be used to highlight assets that are more likely to fail.
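The age-inference step amounts to a spatial overlay of the pipe network with dated development polygons. A minimal sketch, assuming geopandas and hypothetical layer and column names (the paper does not state which GIS software or schema was used), could look like this.

```python
# Minimal sketch of inferring pipe age from dated development polygons,
# assuming geopandas and hypothetical file/column names; not the authors'
# actual workflow.
import geopandas as gpd

def infer_pipe_age(pipes_path: str, developments_path: str) -> gpd.GeoDataFrame:
    pipes = gpd.read_file(pipes_path)                # sewer network lines
    developments = gpd.read_file(developments_path)  # polygons with 'built_year'
    # Attach the construction year of the overlying development to each pipe.
    joined = gpd.sjoin(pipes, developments[["built_year", "geometry"]],
                       how="left", predicate="intersects")
    # Where a pipe crosses several developments, keep the oldest year.
    oldest = joined.groupby(joined.index).agg({"built_year": "min"})
    return pipes.join(oldest)

# Usage (hypothetical files):
# aged_pipes = infer_pipe_age("sewers.shp", "historical_developments.shp")
```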
NASA Technical Reports Server (NTRS)
Carrere, Veronique
1990-01-01
Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.
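Technique (3), directional filtering in the Fourier domain, can be illustrated with a short numpy sketch that suppresses all spatial frequencies whose orientation lies outside a wedge around a chosen azimuth. The wedge half-width and the test image are arbitrary choices for illustration, not the parameters used in the study.

```python
# Illustrative directional filtering via FFT with numpy: keep only spatial
# frequencies whose orientation lies within a wedge around a target azimuth.
import numpy as np

def directional_filter(image, azimuth_deg=45.0, half_width_deg=10.0):
    rows, cols = image.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    angle = np.degrees(np.arctan2(fy, fx)) % 180.0   # orientation of each frequency
    target = azimuth_deg % 180.0
    diff = np.minimum(np.abs(angle - target), 180.0 - np.abs(angle - target))
    mask = diff <= half_width_deg
    mask[0, 0] = True                                 # keep the DC component
    spectrum = np.fft.fft2(image) * mask
    return np.real(np.fft.ifft2(spectrum))

if __name__ == "__main__":
    test = np.random.default_rng(0).random((128, 128))
    filtered = directional_filter(test, azimuth_deg=60.0)
    print(filtered.shape)
```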
The integration of digital orthophotographs with GISs in a microcomputer environment
NASA Technical Reports Server (NTRS)
Steiner, David R.
1992-01-01
The issues involved in the use of orthoimages as a data source for GIS databases are examined. The integration of digital photographs into a GIS is discussed. A prototype PC-based program for the production of GIS databases using orthoimages is described.
NASA Astrophysics Data System (ADS)
Othmanli, Hussein; Zhao, Chengyi; Stahr, Karl
2017-04-01
The Tarim River Basin is the largest continental basin in China. The region has an extremely continental desert climate characterized by little rainfall (<50 mm/a) and high potential evaporation (>3000 mm/a). Climate change is severely affecting the basin, causing soil salinization, water shortage, and declining crop production. Therefore, a Soil and Land Resources Information System (SLISYS-Tarim) for the regional simulation of crop yield in the basin was developed. SLISYS-Tarim consists of a database and the agro-ecological simulation model EPIC (Environmental Policy Integrated Climate). The database comprises relational tables with information about soils, terrain conditions, land use, and climate. The soil data include information from 50 soil profiles that were dug, analyzed, described, and classified in order to characterize the soils of the region. DEM data were integrated with geological maps to build a digital terrain structure. Landsat remote sensing data were used for soil mapping and for land use and land cover classification. An additional database of climate data, land management, and crop information was also linked to the system. Construction of the SLISYS-Tarim database was accomplished by integrating and overlaying the recommended thematic maps within a geographic information system (GIS) to meet the data standard of the global and national SOTER digital database. This database provides appropriate input and output data for crop modelling with the EPIC model at various scales in the Tarim Basin. The EPIC model was run to simulate cotton production under a constructed scenario characterizing current management practices, soil properties, and climate conditions. For calibration of the EPIC model, some parameters were adjusted so that the modeled cotton yield fits the measured yield at the field scale. Validation of the modeling results was carried out in a later step based on remote sensing data. The simulated cotton yield varied according to field management, soil type, and salinity level, with soil salinity as the main limiting factor. Furthermore, the calibrated and validated EPIC model was run under several scenarios of climate conditions and land management practices to estimate the effect of climate change on cotton production and the sustainability of agricultural systems in the basin. The application of SLISYS-Tarim showed that the database is a suitable framework for storage and retrieval of soil and terrain data at various scales, and that simulation with the EPIC model can assess the impact of climate change and management strategies. SLISYS-Tarim can therefore be a good tool for regional planning and can serve decision support at regional and national scales.
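The calibration step described above (adjusting model parameters until the simulated cotton yield matches the measured field yield) can be sketched, in a greatly simplified form, as a one-parameter fit. The toy yield function, the parameter names, and the measured value below are illustrative assumptions, not the EPIC model or the study's data.

```python
# Greatly simplified sketch of calibrating a single crop-model parameter so
# that simulated yield matches a measured yield; the yield function is a toy
# stand-in for EPIC, and all values are hypothetical.
from scipy.optimize import minimize_scalar

MEASURED_YIELD_T_HA = 4.2  # hypothetical measured cotton yield (t/ha)

def simulated_yield(harvest_index, potential_biomass_t_ha=12.0, salinity_factor=0.8):
    """Toy stand-in: yield = biomass * harvest index * salinity reduction."""
    return potential_biomass_t_ha * harvest_index * salinity_factor

def calibration_error(harvest_index):
    return (simulated_yield(harvest_index) - MEASURED_YIELD_T_HA) ** 2

result = minimize_scalar(calibration_error, bounds=(0.2, 0.6), method="bounded")
print(f"calibrated harvest index: {result.x:.3f}")
```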
[Constructing 3-dimensional colorized digital dental model assisted by digital photography].
Ye, Hong-qiang; Liu, Yu-shu; Liu, Yun-song; Ning, Jing; Zhao, Yi-jiao; Zhou, Yong-sheng
2016-02-18
To explore a method of constructing a universal 3-dimensional (3D) colorized digital dental model that can be displayed and edited in common 3D software (such as the Geomagic series), in order to improve the visual effect of digital dental models in 3D software. The morphological data of teeth and gingivae were obtained with an intra-oral scanning system (3Shape TRIOS) to construct 3D digital dental models, which were exported as STL files. Meanwhile, following the accredited photography guide of the American Academy of Cosmetic Dentistry (AACD), five selected digital photographs of the patients' teeth and gingivae were taken with a digital single-lens reflex camera (DSLR) using the same exposure parameters (except for occlusal views) to capture the color data. In Geomagic Studio 2013, after the STL file of the 3D digital dental model was imported, the digital photographs were projected onto the model at the corresponding positions and angles. The junctions between photographs were carefully trimmed to obtain continuous, natural color transitions. The resulting 3D colorized digital dental model was exported as an OBJ file or a WRP file, the latter being a format specific to the Geomagic series software. To evaluate the visual effect of the 3D colorized digital model, a rating scale of the color simulation effect based on patients' evaluations was used. Sixteen patients were recruited, and their scores for the colored and non-colored digital dental models were recorded. The data were analyzed using the McNemar-Bowker test in SPSS 20. A universal 3D colorized digital dental model with better color simulation was constructed based on intra-oral scanning and digital photography. For clinical application, the 3D colorized digital dental models, combined with 3D face images, were introduced into 3D smile design for aesthetic rehabilitation, which could improve patients' understanding of the esthetic digital design and the virtual prosthetic effect. A universal 3D colorized digital dental model with better color simulation can be constructed with the assistance of a 3D dental scanning system and digital photography. In clinical practice, communication between dentist and patients could be improved by the better visual perception afforded by colorized 3D digital dental models with a better color simulation effect.
Seismic databases of The Caucasus
NASA Astrophysics Data System (ADS)
Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.
2012-12-01
The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring systems for a better understanding of the tectonic processes taking place there. The Seismic Monitoring Center of Georgia (Ilia State University) operates the digital seismic network of the country and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily accessible, and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, drawn from both analog and digital seismic networks. The first analog seismic station in Georgia was installed in 1899 in the city of Tbilisi. The number of analog seismic stations increased over the following decades, and in the 1980s about 100 analog stations were operated across the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 only two analog instruments remained in operation. A new digital seismic network has been developed in Georgia since 2003. The number of digital seismic stations has been increasing, and today more than 25 digital stations operate in the country. The database includes detailed information about all equipment installed at the seismic stations. The database is available online, which provides a convenient interface for seismic data exchange between the neighboring Caucasus countries. It also simplifies both seismic data processing and the transfer of results into the database, and it reduces operator mistakes during routine work. The database was created using the following: PHP, MySQL, JavaScript, Ajax, GMT, Gmap, and Hypoinverse.
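The abstract lists the web stack (PHP, MySQL, JavaScript, Ajax, GMT, Gmap, Hypoinverse) but not the schema. Purely as an illustration, a minimal event catalog table and a region/magnitude query might look like the sketch below, written with Python's built-in sqlite3 for self-containedness rather than the MySQL back end the authors used; the column names and sample rows are invented placeholders.

```python
# Minimal sketch of an earthquake catalog table and a region/magnitude
# query, using sqlite3 for portability; the real system uses MySQL with
# PHP, and the column names and sample rows are invented placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id   INTEGER PRIMARY KEY,
        origin_utc TEXT,   -- ISO 8601 origin time
        latitude   REAL,
        longitude  REAL,
        depth_km   REAL,
        magnitude  REAL,
        network    TEXT    -- 'analog' or 'digital'
    )
""")
sample_rows = [  # illustrative values only, not actual catalog entries
    (1, "1985-06-01T04:12:00", 42.1, 44.8, 12.0, 4.3, "analog"),
    (2, "2010-03-15T18:40:22", 41.7, 43.2, 8.5, 5.1, "digital"),
]
conn.executemany("INSERT INTO events VALUES (?, ?, ?, ?, ?, ?, ?)", sample_rows)

rows = conn.execute("""
    SELECT origin_utc, magnitude, network FROM events
    WHERE latitude  BETWEEN 41.0 AND 44.0
      AND longitude BETWEEN 40.0 AND 47.0
      AND magnitude >= 4.0
    ORDER BY origin_utc
""").fetchall()
print(rows)
```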
BAO Plate Archive Project: Digitization, Electronic Database and Research Programmes
NASA Astrophysics Data System (ADS)
Mickaelian, A. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Farmanyan, S. V.; Gigoyan, K. S.; Gyulzadyan, M. V.; Khachatryan, K. G.; Knyazyan, A. V.; Kostandyan, G. R.; Mikayelyan, G. A.; Nikoghosyan, E. H.; Paronyan, G. M.; Vardanyan, A. V.
2016-06-01
The most important part of the astronomical observational heritage is the astronomical plate archives created on the basis of numerous observations at many observatories. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films obtained at the 2.6m telescope, the 1m and 0.5m Schmidt-type telescopes, and other smaller telescopes during 1947-1991. In 2002-2005, the 1874 plates of the famous Markarian Survey (also called the First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. A large project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage was started in 2015. A Science Program Board has been created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization; the Armenian Virtual Observatory (ArVO) database will be used to accommodate all new data. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects, mainly involving high proper motion stars, variable objects, and Solar System bodies.
BAO Plate Archive digitization, creation of electronic database and its scientific usage
NASA Astrophysics Data System (ADS)
Mickaelian, Areg M.
2015-08-01
Astronomical plate archives created on the basis of numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,500 photographic plates and films obtained at the 2.6m telescope, the 1m and 0.5m Schmidt telescopes, and other smaller ones during 1947-1991. In 2002-2005, the 2000 plates of the famous Markarian Survey (First Byurakan Survey, FBS) were digitized and the Digitized FBS (DFBS, http://www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. In 2015, we started a project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage. A Science Program Board has been created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined usage of these observations together with other world databases. The Executing Team consists of 9 astronomers and 3 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization, as well as the Armenian Virtual Observatory (ArVO) database to accommodate all new data. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects.
Programmed database system at the Chang Gung Craniofacial Center: part II--digitizing photographs.
Chuang, Shiow-Shuh; Hung, Kai-Fong; de Villa, Glenda H; Chen, Philip K T; Lo, Lun-Jou; Chang, Sophia C N; Yu, Chung-Chih; Chen, Yu-Ray
2003-07-01
The archival tools developed for digital images in advertising do not fulfill clinical requirements, and clinical tools are just beginning to be developed. The storage of a large number of conventional photographic slides requires a lot of space and special conditions, and in spite of special precautions, degradation of the slides still occurs; the most common degradation is the appearance of fungus flecks. With recent advances in digital technology, it is now possible to store voluminous numbers of photographs on a computer hard drive and keep them for a long time. A self-programmed interface has been developed to integrate a database and an image browser system that can build and locate needed archive files in a matter of seconds with the click of a button. The system requires only commercially available hardware and software. There are 25,200 patients recorded in the database, involving 24,331 procedures. The image files cover 6,384 patients with 88,366 digital picture files. From 1999 through 2002, NT$400,000 was saved using the new system. Photographs can be managed with the integrated database and browser software for archiving, which allows labeling of individual photographs with demographic information as well as browsing. Digitized images are not only more efficient and economical than conventional slide images, but they also facilitate clinical studies.
Digital geologic map of the Coeur d'Alene 1:100,000 quadrangle, Idaho and Montana
digital compilation by Munts, Steven R.
2000-01-01
Between 1961 and 1969, Alan Griggs and others conducted fieldwork to prepare a geologic map of the Spokane 1:250,000 map (Griggs, 1973). Their field observations were posted on paper copies of 15-minute quadrangle maps. In 1999, the USGS contracted with the Idaho Geological Survey to prepare a digital version of the Coeur d’Alene 1:100,000 quadrangle. To facilitate this work, the USGS obtained the field maps prepared by Griggs and others from the USGS Field Records Library in Denver, Colorado. The Idaho Geological Survey (IGS) digitized these maps and used them in their mapping program. The mapping focused on field checks to resolve problems in poorly known areas and in areas of disagreement between adjoining maps. The IGS is currently in the process of preparing a final digital spatial database for the Coeur d’Alene 1:100,000 quadrangle. However, there was an immediate need for a digital version of the geologic map of the Coeur d’Alene 1:100,000 quadrangle, and the data from the field sheets, along with several other sources, were assembled to produce this interim product. This interim product is the digital geologic map of the Coeur d’Alene 1:100,000 quadrangle, Idaho and Montana. It was compiled from the preliminary digital files prepared by the Idaho Geological Survey, and supplemented by data from Griggs (1973) and from digital databases by Bookstrom and others (1999) and Derkey and others (1996). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000). The digital geologic map graphics (of00-135_map.pdf) that are provided are representations of the digital database. The map area is located in north Idaho. This open-file report describes the geologic map units, the methods used to convert the geologic map data into a digital format, the ArcInfo GIS file structures and relationships, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.
Geologic and structure map of the Choteau 1 degree by 2 degrees Quadrangle, western Montana
Mudge, Melville R.; Earhart, Robert L.; Whipple, James W.; Harrison, Jack E.
1982-01-01
The geologic and structure map of the Choteau 1 x 2 degree quadrangle (Mudge and others, 1982) was originally converted to a digital format by Jeff Silkwood (U.S. Forest Service) and completed by U.S. Geological Survey staff and a contractor at the Spokane Field Office (WA) in 2000 for input into a geographic information system (GIS). The resulting digital geologic map (GIS) database can be queried in many ways to produce a variety of geologic maps. Digital base map data files (topography, roads, towns, rivers and lakes, etc.) are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:250,000 (e.g., 1:100,000 or 1:24,000). The digital geologic map graphics and plot files (chot250k.gra/.hp/.eps and chot-map.pdf) that are provided in the digital package are representations of the digital database. They are not designed to be cartographic products.
Image database for digital hand atlas
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Dey, Partha S.; Gertych, Arkadiusz; Pospiech-Kurkowska, Sywia
2003-05-01
Bone age assessment is a procedure frequently performed in pediatric patients to evaluate growth disorders. A commonly used method is atlas matching, a visual comparison of a hand radiograph with a small reference set from the old Greulich-Pyle atlas. We have developed a new digital hand atlas with a large set of clinically normal hand images from diverse ethnic groups. In this paper, we present the system design and implementation of the digital atlas database that supports computer-aided atlas matching for bone age assessment. The system consists of a hand atlas image database, a computer-aided diagnostic (CAD) software module for image processing and atlas matching, and a Web user interface. Users can use a Web browser to push DICOM images, directly or indirectly from PACS, to the CAD server for bone age assessment. Quantitative features of the examined image, which reflect skeletal maturity, are then extracted and compared with patterns from the atlas image database to assess the bone age. The digital atlas method, built on a large image database and current Internet technology, provides an alternative that can supplement or replace the traditional method for a quantitative, accurate, and cost-effective assessment of bone age.
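The matching step, comparing quantitative features extracted from the examined radiograph against feature patterns stored for atlas images of known bone age, can be sketched as a nearest-neighbour lookup. The feature values and atlas entries below are invented placeholders, not the CAD module's actual features or data.

```python
# Simplified sketch of atlas matching as nearest-neighbour search over
# quantitative features; feature values and atlas entries are placeholders,
# not those of the digital hand atlas CAD module.
import math

# Hypothetical atlas: bone age (years) -> normalized feature vector
ATLAS_PATTERNS = {
    6.0:  [0.31, 0.42, 0.18],
    8.0:  [0.38, 0.47, 0.25],
    10.0: [0.45, 0.53, 0.33],
    12.0: [0.52, 0.60, 0.41],
}

def assess_bone_age(features):
    """Return the atlas bone age whose feature pattern is closest."""
    return min(ATLAS_PATTERNS,
               key=lambda age: math.dist(features, ATLAS_PATTERNS[age]))

print(assess_bone_age([0.44, 0.52, 0.30]))  # -> 10.0 with these placeholder values
```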
NASA Astrophysics Data System (ADS)
Mickaelian, A. M.; Gigoyan, K. S.; Gyulzadyan, M. V.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Samsonyan, A. L.; Mikayelyan, G. A.; Farmanyan, S. V.; Harutyunyan, V. L.
2017-12-01
We present the Byurakan Astrophysical Observatory (BAO) Plate Archive Project, which is aimed at digitization, extraction, and analysis of archival data and at building an electronic database and interactive sky map. The BAO Plate Archive consists of 37,500 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt telescopes, and other smaller ones during 1947-1991. The 2000 plates of the famous Markarian Survey (or the First Byurakan Survey, FBS) were digitized in 2002-2005 and the Digitized FBS (DFBS, www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. Several other smaller digitization projects have been carried out as well, such as part of the Second Byurakan Survey (SBS) plates, photographic chain plates in Coma, where the blazar ON 231 is located, and 2.6m film spectra of FBS Blue Stellar Objects. However, most of the plates and films are not yet digitized. In 2015, we started a project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage. The Armenian Virtual Observatory (ArVO, www.aras.am/Arvo/arvo.htm) database will accommodate all new data. The project runs in collaboration with the Armenian Institute of Informatics and Automation Problems (IIAP) and will continue for 4 years, in 2015-2018. The final result will be an electronic database and an online interactive sky map to be used for further research projects. ArVO will provide all standards and tools for efficient usage of the scientific output and its integration into international databases.
Yokohama, Noriya; Tsuchimoto, Tadashi; Oishi, Masamichi; Itou, Katsuya
2007-01-20
It has been noted that the downtime of medical informatics systems is often long. Many systems encounter downtimes of hours or even days, which can have a critical effect on daily operations. Such systems remain especially weak in the areas of databases and medical imaging data. The schematic design shows the three-layer architecture of the system: application, database, and storage layers. The application layer uses the DICOM protocol (Digital Imaging and Communications in Medicine) and HTTP (Hypertext Transfer Protocol) with AJAX (Asynchronous JavaScript+XML). The database is designed to be decentralized in parallel using cluster technology; consequently, restoration of the database can be done not only with ease but also with improved retrieval speed. In the storage layer, a network RAID (Redundant Array of Independent Disks) system makes it possible to construct exabyte-scale parallel file systems that exploit distributed storage. Development and evaluation of the test bed have been successful for medical information data backup and recovery in a network environment. This paper presents a schematic design of the new medical informatics system oriented toward rapid recovery, together with a dynamic Web application for medical imaging distribution using AJAX.
Nebert, Douglas; Anderson, Dean
1987-01-01
The U. S. Geological Survey (USGS) in cooperation with the U. S. Environmental Protection Agency Office of Pesticide Programs and several State agencies in Oregon has prepared a digital spatial database at 1:500,000 scale to be used as a basis for evaluating the potential for ground-water contamination by pesticides and other agricultural chemicals. Geographic information system (GIS) software was used to assemble, analyze, and manage spatial and tabular environmental data in support of this project. Physical processes were interpreted relative to published spatial data and an integrated database to support the appraisal of regional ground-water contamination was constructed. Ground-water sampling results were reviewed relative to the environmental factors present in several agricultural areas to develop an empirical knowledge base which could be used to assist in the selection of future sampling or study areas.
NASA Astrophysics Data System (ADS)
Sakano, Toshikazu; Furukawa, Isao; Okumura, Akira; Yamaguchi, Takahiro; Fujii, Tetsuro; Ono, Sadayasu; Suzuki, Junji; Matsuya, Shoji; Ishihara, Teruo
2001-08-01
The widespread use of digital technology in the medical field has led to demand for a high-quality, high-speed, and user-friendly digital image presentation system for daily medical conferences. To fulfill this demand, we developed a presentation system for radiological and pathological images. It is composed of a super-high-definition (SHD) imaging system, a radiological image database (R-DB), a pathological image database (P-DB), and the network interconnecting these three. The R-DB consists of a 270GB RAID, a database server workstation, and a film digitizer. The P-DB includes an optical microscope, a four-million-pixel digital camera, a 90GB RAID, and a database server workstation. A 100Mbps Ethernet LAN interconnects all the sub-systems. Web-based system operation software was developed for easy operation. We installed the whole system in NTT East Kanto Hospital to evaluate it in the weekly case conferences. The SHD system could display digital full-color images of 2048 x 2048 pixels on a 28-inch CRT monitor. The doctors evaluated the image quality and size and found them applicable to actual medical diagnosis. They also appreciated the short image-switching time, which contributed to smooth presentation. Thus, we confirmed that the system's characteristics met the requirements.
Grasso, Dennis N.
2003-01-01
Surface effects maps were produced for 72 of 89 underground detonations conducted at the Frenchman Flat, Rainier Mesa and Aqueduct Mesa, Climax Stock, Shoshone Mountain, Buckboard Mesa, and Dome Mountain testing areas of the Nevada Test Site between August 10, 1957 (Saturn detonation, Area 12) and September 18, 1992 (Hunters Trophy detonation, Area 12). The 'Other Areas' Surface Effects Map Database, which was used to construct the maps shown in this report, contains digital reproductions of these original maps. The database is provided in both ArcGIS (v. 8.2) geodatabase format and ArcView (v. 3.2) shapefile format. This database contains sinks, cracks, faults, and other surface effects having a combined (cumulative) length of 136.38 km (84.74 mi). In GIS digital format, the user can view all surface effects maps simultaneously, select and view the surface effects of one or more sites of interest, or view specific surface effects by area or site. Three map layers comprise the database. They are: (1) the surface effects maps layer (oase_n27f), (2) the bar symbols layer (oase_bar_n27f), and (3) the ball symbols layer (oase_ball_n27f). Additionally, an annotation layer, named 'Ball_and_Bar_Labels,' and a polygon features layer, named 'Area12_features_poly_n27f,' are contained in the geodatabase version of the database. The annotation layer automatically labels all 295 ball-and-bar symbols shown on these maps. The polygon features layer displays areas of ground disturbances, such as rock spall and disturbed ground caused by the detonations. Shapefile versions of the polygon features layer in Nevada State Plane and Universal Transverse Mercator projections, named 'area12_features_poly_n27f.shp' and 'area12_features_poly_u83m.shp,' are also provided in the archive.
A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review
Callahan, Ryan; Darzi, Ara; Mayer, Erik
2016-01-01
Background Digital maturity is the extent to which digital technologies are used as enablers to deliver a high-quality health service. Extensive literature exists about how to assess the components of digital maturity, but it has not been used to design a comprehensive framework for evaluation. Consequently, the measurement systems that do exist are limited to evaluating digital programs within one service or care setting, meaning that digital maturity evaluation is not accounting for the needs of patients across their care pathways. Objective The objective of our study was to identify the best methods and metrics for evaluating digital maturity and to create a novel, evidence-based tool for evaluating digital maturity across patient care pathways. Methods We systematically reviewed the literature to find the best methods and metrics for evaluating digital maturity. We searched the PubMed database for all papers relevant to digital maturity evaluation. Papers were selected if they provided insight into how to appraise digital systems within the health service and if they indicated the factors that constitute or facilitate digital maturity. Papers were analyzed to identify methodology for evaluating digital maturity and indicators of digitally mature systems. We then used the resulting information about methodology to design an evaluation framework. Following that, the indicators of digital maturity were extracted and grouped into increasing levels of maturity and operationalized as metrics within the evaluation framework. Results We identified 28 papers as relevant to evaluating digital maturity, from which we derived 5 themes. The first theme concerned general evaluation methodology for constructing the framework (7 papers). The following 4 themes were the increasing levels of digital maturity: resources and ability (6 papers), usage (7 papers), interoperability (3 papers), and impact (5 papers). The framework includes metrics for each of these levels at each stage of the typical patient care pathway. Conclusions The framework uses a patient-centric model that departs from traditional service-specific measurements and allows for novel insights into how digital programs benefit patients across the health system. Trial Registration N/A PMID:27080852
Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng
2018-02-02
In recent years, many studies have focused on the application of advanced technology as a way to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates database and geometry into a digital model that provides visualization across the whole construction lifecycle. This paper integrates BIM and WSN into a unique system that enables a construction site to visually monitor its safety status via a spatial, colored interface and to remove any hazardous gas automatically. Many wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental condition (temperature and humidity) data; in any region where an abnormal status is detected, the BIM model highlights the region, and an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions for further applications are summarized.
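The monitoring logic described above (sensor nodes report gas and environmental readings per zone; abnormal readings trigger a BIM highlight, an alarm, and ventilation) can be sketched as a simple threshold check. The threshold values, zone names, and actuator hooks below are illustrative assumptions, not the authors' implementation.

```python
# Simplified sketch of threshold-based hazard alerting for WSN readings
# grouped by BIM zone; thresholds, zone names, and actuator hooks are
# illustrative assumptions rather than the system described in the paper.

THRESHOLDS = {"co_ppm": 35.0, "temperature_c": 40.0, "humidity_pct": 95.0}

def exceeded_limits(reading: dict) -> list:
    """Return the list of quantities above their threshold."""
    return [key for key, limit in THRESHOLDS.items()
            if reading.get(key, 0.0) > limit]

def respond(zone_id: str, exceeded: list) -> None:
    if exceeded:
        # Placeholders for highlighting the BIM zone and driving actuators.
        print(f"ALERT {zone_id}: {', '.join(exceeded)} above limit")
        print(f"  -> colour zone {zone_id} red in BIM view, start alarm and ventilator")
    else:
        print(f"{zone_id}: normal")

if __name__ == "__main__":
    readings = {
        "tunnel_section_A": {"co_ppm": 12.0, "temperature_c": 28.5, "humidity_pct": 80.0},
        "tunnel_section_B": {"co_ppm": 48.0, "temperature_c": 31.0, "humidity_pct": 86.0},
    }
    for zone, reading in readings.items():
        respond(zone, exceeded_limits(reading))
```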
NOAA Data Rescue of Key Solar Databases and Digitization of Historical Solar Images
NASA Astrophysics Data System (ADS)
Coffey, H. E.
2006-08-01
Over a number of years, the staff at NOAA National Geophysical Data Center (NGDC) has worked to rescue key solar databases by converting them to digital format and making them available via the World Wide Web. NOAA has had several data rescue programs where staff compete for funds to rescue important and critical historical data that are languishing in archives and at risk of being lost due to deteriorating condition, loss of any metadata or descriptive text that describe the databases, lack of interest or funding in maintaining databases, etc. The Solar-Terrestrial Physics Division at NGDC was able to obtain funds to key in some critical historical tabular databases. Recently the NOAA Climate Database Modernization Program (CDMP) funded a project to digitize historical solar images, producing a large online database of historical daily full disk solar images. The images include the wavelengths Calcium K, Hydrogen Alpha, and white light photos, as well as sunspot drawings and the comprehensive drawings of a multitude of solar phenomena on one daily map (Fraunhofer maps and Wendelstein drawings). Included in the digitization are high resolution solar H-alpha images taken at the Boulder Solar Observatory 1967-1984. The scanned daily images document many phases of solar activity, from decadal variation to rotational variation to daily changes. Smaller versions are available online. Larger versions are available by request. See http://www.ngdc.noaa.gov/stp/SOLAR/ftpsolarimages.html. The tabular listings and solar imagery will be discussed.
Geologic Map Database of Texas
Stoeser, Douglas B.; Shock, Nancy; Green, Gregory N.; Dumonceaux, Gayle M.; Heran, William D.
2005-01-01
The purpose of this report is to release a digital geologic map database for the State of Texas. This database was compiled for the U.S. Geological Survey (USGS) Minerals Program, National Surveys and Analysis Project, whose goal is a nationwide assemblage of geologic, geochemical, geophysical, and other data. This release makes the geologic data from the Geologic Map of Texas available in digital format. Original clear film positives provided by the Texas Bureau of Economic Geology were photographically enlarged onto Mylar film. These films were scanned, georeferenced, digitized, and attributed by Geologic Data Systems (GDS), Inc., Denver, Colorado. Project oversight and quality control was the responsibility of the U.S. Geological Survey. ESRI ArcInfo coverages, AMLs, and shapefiles are provided.
Modernization and multiscale databases at the U.S. geological survey
Morrison, J.L.
1992-01-01
The U.S. Geological Survey (USGS) has begun a digital cartographic modernization program. Keys to that program are the creation of a multiscale database, a feature-based file structure that is derived from a spatial data model, and a series of "templates" or rules that specify the relationships between instances of entities in reality and features in the database. The database will initially hold data collected from the USGS standard map products at scales of 1:24,000, 1:100,000, and 1:2,000,000. The spatial data model is called the digital line graph-enhanced model, and the comprehensive rule set consists of collection rules, product generation rules, and conflict resolution rules. This modernization program will affect the USGS mapmaking process because both digital and graphic products will be created from the database. In addition, non-USGS map users will have more flexibility in uses of the databases. These remarks are those of the session discussant made in response to the six papers and the keynote address given in the session. © 1992.
Derkey, Pamela D.; Johnson, Bruce R.; Lackaff, Beatrice B.; Derkey, Robert E.
1998-01-01
The geologic map of the Rosalia 1:100,000-scale quadrangle was compiled in 1990 by S.Z. Waggoner of the Washington state Division of Geology and Earth Resources. This data was entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The intent was to provide a digital geospatial database for a previously published black and white paper geologic map. This database can be queried in many ways to produce a variety of geologic maps. Digital base map data files are not included: they may be obtained from a variety of commercial and government sources. This database is not meant to be used or displayed at any scale larger than 1:100,000 (e.g., 1:62,500 or 1:24,000) as it has been somewhat generalized to fit the 1:100,000 scale map. The map area is located in eastern Washington and extends across the state border into western Idaho. This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. We wish to thank J. Eric Schuster of the Washington Division of Geology and Earth Resources for providing the original stable-base mylar and the funding for it to be scanned. We also thank Dick Blank and Barry Moring of the U.S. Geological Survey for reviewing the manuscript and digital files, respectively.
Archive and Database as Metaphor: Theorizing the Historical Record
ERIC Educational Resources Information Center
Manoff, Marlene
2010-01-01
Digital media increase the visibility and presence of the past while also reshaping our sense of history. We have extraordinary access to digital versions of books, journals, film, television, music, art and popular culture from earlier eras. New theoretical formulations of database and archive provide ways to think creatively about these changes…
Digital geomorphological landslide hazard mapping of the Alpago area, Italy
NASA Astrophysics Data System (ADS)
van Westen, Cees J.; Soeters, Rob; Sijmons, Koert
Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting in the Borsoia catchment, located in the Alpago region, near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.
Windsor, J S; Rodway, G W; Middleton, P M; McCarthy, S
2006-01-01
Objective The emergence of a new generation of “point‐and‐shoot” digital cameras offers doctors a compact, portable and user‐friendly solution to the recording of highly detailed digital photographs and video images. This work highlights the use of such technology, and provides information for those who wish to record, store and display their own medical images. Methods Over a 3‐month period, a digital camera was carried by a doctor in a busy, adult emergency department and used to record a range of clinical images that were subsequently transferred to a computer database. Results In total, 493 digital images were recorded, of which 428 were photographs and 65 were video clips. These were successfully used for teaching purposes, publications and patient records. Conclusions This study highlights the importance of informed consent, the selection of a suitable package of digital technology and the role of basic photographic technique in developing a successful digital database in a busy clinical environment. PMID:17068281
Landscape features, standards, and semantics in U.S. national topographic mapping databases
Varanka, Dalia
2009-01-01
The objective of this paper is to examine the contrast between local, field-surveyed topographical representation and feature representation in digital, centralized databases and to clarify their ontological implications. The semantics of these two approaches are contrasted by examining the categorization of features by subject domains inherent to national topographic mapping. When comparing five USGS topographic mapping domain and feature lists, results indicate that multiple semantic meanings and ontology rules were applied to the initial digital database, but were lost as databases became more centralized at national scales, and common semantics were replaced by technological terms.
Brabb, Earl E.; Roberts, Sebastian; Cotton, William R.; Kropp, Alan L.; Wright, Robert H.; Zinn, Erik N.; Digital database by Roberts, Sebastian; Mills, Suzanne K.; Barnes, Jason B.; Marsolek, Joanna E.
2000-01-01
This publication consists of a digital map database on a geohazards web site, http://kaibab.wr.usgs.gov/geohazweb/intro.htm, this text, and 43 digital map images available for downloading at this site. The report is stored as several digital files, in ARC export (uncompressed) format for the database, and Postscript and PDF formats for the map images. Several of the source data layers for the images have already been released in other publications by the USGS and are available for downloading on the Internet. These source layers are not included in this digital database, but rather a reference is given for the web site where the data can be found in digital format. The exported ARC coverages and grids lie in UTM zone 10 projection. The pamphlet, which only describes the content and character of the digital map database, is included as Postscript, PDF, and ASCII text files and is also available on paper as USGS Open-File Report 00-127. The full versatility of the spatial database is realized by importing the ARC export files into ARC/INFO or an equivalent GIS. Other GIS packages, including MapInfo and ARCVIEW, can also use the ARC export files. The Postscript map image can be used for viewing or plotting in computer systems with sufficient capacity, and the considerably smaller PDF image files can be viewed or plotted in full or in part from Adobe ACROBAT software running on Macintosh, PC, or UNIX platforms.
Increasing the efficiency of digitization workflows for herbarium specimens.
Tulig, Melissa; Tarnowsky, Nicole; Bevans, Michael; Anthony Kirchgessner; Thiers, Barbara M
2012-01-01
The New York Botanical Garden Herbarium has been databasing and imaging its estimated 7.3 million plant specimens for the past 17 years. Due to the size of the collection, we have been selectively digitizing fundable subsets of specimens, making successive passes through the herbarium with each new grant. With this strategy, the average rate for databasing complete records has been 10 specimens per hour. With 1.3 million specimens databased, this effort has taken about 130,000 hours of staff time. At this rate, to complete the herbarium and digitize the remaining 6 million specimens, another 600,000 hours would be needed. Given the current biodiversity and economic crises, there is neither the time nor money to complete the collection at this rate.Through a combination of grants over the last few years, The New York Botanical Garden has been testing new protocols and tactics for increasing the rate of digitization through combinations of data collaboration, field book digitization, partial data entry and imaging, and optical character recognition (OCR) of specimen images. With the launch of the National Science Foundation's new Advancing Digitization of Biological Collections program, we hope to move forward with larger, more efficient digitization projects, capturing data from larger portions of the herbarium at a fraction of the cost and time.
Guidelines for establishing and maintaining construction quality databases : tech brief.
DOT National Transportation Integrated Search
2006-12-01
Construction quality databases contain a variety of construction-related data that characterize the quality of materials and workmanship. The primary purpose of construction quality databases is to help State highway agencies (SHAs) assess the qualit...
Digital hand atlas and computer-aided bone age assessment via the Web
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente
1999-07-01
A frequently used method of bone age assessment is atlas matching, in which a radiological examination of a hand image is compared against a reference set of atlas patterns of normal standards. We are in the process of developing a digital hand atlas with a large standard set of normal hand and wrist images that reflect skeletal maturity, race and sex differences, and current child development. The digital hand atlas will be used for computer-aided bone age assessment via the Web. We have designed and partially implemented a computer-aided diagnostic (CAD) system for Web-based bone age assessment. The system consists of a digital hand atlas, a relational image database, and a Web-based user interface. The digital atlas is based on a large standard set of normal hand and wrist images with extracted bone objects and quantitative features. The image database uses content-based indexing to organize the hand images and their attributes and presents them to users in a structured way. The Web-based user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect skeletal maturity, are extracted and compared with patterns from the atlas database to assess the bone age. The relevant reference images and the final assessment report are then sent back to the user's browser via the Web. The digital atlas will remove the disadvantages of the currently out-of-date one and allow bone age assessment to be computerized and done conveniently via the Web. In this paper, we present the system design, the Web-based client-server model for computer-assisted bone age assessment, and our initial implementation of the digital atlas database.
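A minimal sketch of the atlas-matching idea described above: an examined image's extracted quantitative features are compared against atlas records to return the closest reference age. The feature vectors and ages below are hypothetical placeholders, not values from the actual atlas:

```python
import numpy as np

# Hypothetical atlas records: (reference_age_years, extracted feature vector).
atlas = [
    (6.0,  np.array([0.42, 0.31, 0.18])),
    (8.0,  np.array([0.55, 0.40, 0.25])),
    (10.0, np.array([0.66, 0.52, 0.33])),
]

def assess_bone_age(features: np.ndarray) -> float:
    """Return the atlas age whose feature pattern is closest to the examined image."""
    distances = [(age, np.linalg.norm(features - vec)) for age, vec in atlas]
    return min(distances, key=lambda pair: pair[1])[0]

print(assess_bone_age(np.array([0.60, 0.45, 0.30])))  # -> 8.0 in this toy example
```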
Using Geocoded Databases in Teaching Urban Historical Geography.
ERIC Educational Resources Information Center
Miller, Roger P.
1986-01-01
Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)
Yao, Qingqiang; Wei, Bo; Guo, Yang; Jin, Chengzhe; Du, Xiaotao; Yan, Chao; Yan, Junwei; Hu, Wenhao; Xu, Yan; Zhou, Zhi; Wang, Yijin; Wang, Liming
2015-01-01
The study aims to investigate techniques for the design and construction of polycaprolactone (PCL)-hydroxyapatite (HA) scaffolds based on CT 3D reconstruction data. Femoral and lumbar spinal specimens from eight male New Zealand white rabbits underwent CT and laser scanning, and scaffolds were 3D printed from the resulting data using PCL-HA powder, with eight scaffolds produced per group. The CAD-based 3D-printed porous cylindrical scaffolds comprised 16 pieces × 3 groups: an orthogonal scaffold, a Pozi-hole scaffold, and a triangular-hole scaffold. The gross forms, scaffold fiber diameters, and porosities of the scaffolds were measured, and mechanical testing was performed on eight pieces of each of the three cylindrical scaffold types. The loading force, deformation, maximum affordable pressure, and deformation value were recorded. The pore-connection rate of each scaffold was 100 % within each group, and the gross and microstructural parameters of each scaffold showed no significant difference from the design values (P > 0.05). There was no significant difference in loading force, deformation, or deformation value under the maximum affordable pressure among the three cylindrical scaffolds when the load exceeded 320 N. The combination of CT and CAD reverse-engineering technology can accomplish the design and manufacture of complex bone tissue engineering scaffolds, with the different porous microstructures having no significant impact on the physical properties under large loads.
Training system for digital mammographic diagnoses of breast cancer
NASA Astrophysics Data System (ADS)
Thomaz, R. L.; Nirschl Crozara, M. G.; Patrocinio, A. C.
2013-03-01
As technology evolves, analog mammography systems are being replaced by digital systems. The digital system displays mammographic images on video monitors instead of the screen-film and negatoscope used for analog images. This change in how mammographic images are visualized may require a different approach to training health care professionals in diagnosing breast cancer with digital mammography. This paper therefore presents a computational approach for training health care professionals, providing a smooth transition between analog and digital technology and teaching them to use the advantages of digital image processing tools to diagnose breast cancer. The approach consists of software in which a full mammogram case from a database, containing the digital images of each mammographic view, can be opened, processed, and diagnosed. The software communicates with a gold-standard database of digital mammogram cases. This database contains the digital images in Tagged Image File Format (TIFF) and the respective diagnoses according to BI-RADS™; these files are read by the software and shown to the user as needed. Digital image processing tools are also provided to give better visualization of each individual image. The software was built on a minimalist, user-friendly interface concept intended to ease the transition. It also has an interface for inputting diagnoses from the professional being trained, providing result feedback. The system has been completed but has not yet been applied to professional training.
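A minimal sketch, assuming the Pillow library, of the kind of image-processing step such training software could apply to a TIFF case image before display; the file names are hypothetical:

```python
from PIL import Image, ImageOps

# Hypothetical case file from the gold-standard TIFF database.
view = Image.open("case001_mlo_left.tif").convert("L")   # load as 8-bit grayscale
enhanced = ImageOps.autocontrast(view)                    # simple contrast stretch for display
enhanced.save("case001_mlo_left_enhanced.png")
```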
Creating a standardized watersheds database for the Lower Rio Grande/Río Bravo, Texas
Brown, J.R.; Ulery, Randy L.; Parcher, Jean W.
2000-01-01
This report describes the creation of a large-scale watershed database for the lower Rio Grande/Río Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.
Research on the Digital Communication and Development of Yunnan Bai Embroidery
NASA Astrophysics Data System (ADS)
Xu, Wu; Jin, Chunjie; Su, Ying; Wu, Lei; He, Jin
2017-12-01
Our country now attaches great importance to the protection and development of intangible culture, but the shortcomings of discoloration, breakage, and excessive storage space still exist in the traditional museum approach to preservation. This paper starts from an analysis of these problems, considers why and how virtual reality (VR) technology can better solve them, and analyzes the specific case of Yunnan Bai embroidery in order to realize its full human and economic value. First, 3D MAX is used to design and produce three-dimensional models of the Bai embroideries. Second, the large amount of embroidery model data collected is used to construct the Yunnan Bai embroidery model database. Next, a digital display system of virtual embroidery is created and deployed to PC websites and mobile phone applications to achieve information sharing. Finally, through the use of virtual display technology for three-dimensional embroidery design, embroidered clothing, bedding, and other works with a modern style can be designed, so as to continuously pursue and give full play to the charm and economic value of the embroidery.
Sugimoto, Jonathan D; Labrique, Alain B; Ahmad, Salahuddin; Rashid, Mahbubur; Klemm, Rolf D W; Christian, Parul; West, Keith P
2007-12-01
In the last decade, geographic information systems (GIS) have become accessible to researchers in developing countries, yet guidance remains sparse for developing a GIS. Drawing on experience in developing a GIS for a large community trial in rural Bangladesh, six stages for constructing, maintaining, and using a GIS for health research purposes were outlined. The system contains 0.25 million landmarks, including 150,000 houses, in an area of 435 sq km with over 650,000 people. Assuming access to reasonably accurate paper boundary maps of the intended working area and the absence of pre-existing digital local-area maps, the six stages are: to (a) digitize and update existing paper maps, (b) join the digitized maps into a large-area map, (c) reference this large-area map to a geographic coordinate system, (d) insert location landmarks of interest, (e) maintain the GIS, and (f) link it to other research databases. These basic steps can produce a household-level, updated, scaleable GIS that can both enhance field efficiency and support epidemiologic analyses of demographic patterns, diseases, and health outcomes.
Conversion of environmental data to a digital-spatial database, Puget Sound area, Washington
Uhrich, M.A.; McGrath, T.S.
1997-01-01
Data and maps from the Puget Sound Environmental Atlas, compiled for the U.S. Environmental Protection Agency, the Puget Sound Water Quality Authority, and the U.S. Army Corps of Engineers, have been converted into a digital-spatial database using a geographic information system. Environmental data for the Puget Sound area, collected from sources other than the Puget Sound Environmental Atlas by different Federal, State, and local agencies, also have been converted into this digital-spatial database. Background on the geographic-information-system planning process, the design and implementation of the geographic-information-system database, and the reasons for conversion to this digital-spatial database are included in this report. The Puget Sound Environmental Atlas data layers include information about seabird nesting areas, eelgrass and kelp habitat, marine mammal and fish areas, and shellfish resources and bed certification. Data layers, from sources other than the Puget Sound Environmental Atlas, include the Puget Sound shoreline, the water-body system, shellfish growing areas, recreational shellfish beaches, sewage-treatment outfalls, upland hydrography, watershed and political boundaries, and geographic names. The sources of data, descriptions of the data layers, and the steps and errors of processing associated with conversion to a digital-spatial database used in development of the Puget Sound Geographic Information System also are included in this report. The appendixes contain data dictionaries for each of the resource layers and error values for the conversion of Puget Sound Environmental Atlas data.
Hybrid Modeling Based on Scsg-Br and Orthophoto
NASA Astrophysics Data System (ADS)
Zhou, G.; Huang, Y.; Yue, T.; Li, X.; Huang, W.; He, C.; Wu, Z.
2018-05-01
With the development of the digital city, digital applications are becoming more widespread while urban buildings grow more complex, so establishing an effective data model is the key to representing urban building models accurately. In addition, combining 3D building models with remote sensing data has become a trend in building the digital city, but it involves a large amount of data and results in data redundancy. To overcome the limitations of single constructive solid geometry (CSG) modelling, this paper presents a mixed modelling method based on SCSG-BR for representing urban buildings. On one hand, an improved CSG method, called the "Spatial CSG (SCSG)" representation, is used to represent the exterior shape of urban buildings. On the other hand, the boundary representation (BR) method represents the topological relationships between the geometric elements of an urban building, in which textures are treated as attribute data of the walls and roof. Furthermore, a method combining a file database and a relational database is used to manage the three-dimensional building model data, which reduces the complexity of texture mapping. During data processing, a constrained least-squares algorithm is used to orthogonalize the building polygons and adjust their topology to ensure the accuracy of the modelling data. Finally, the urban building model is matched with the corresponding orthophoto. Data from Denver, Colorado, USA are used to establish a realistic urban building model. The results show that the SCSG-BR method can represent the topological relations of buildings more precisely, that the organization and management of urban building model data reduce data redundancy and improve modelling speed, and that the combination of orthophotos and the urban building model further strengthens applications such as view analysis and spatial query, enhancing the scope of digital city applications.
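A simplified, hypothetical sketch of how the hybrid SCSG-BR idea above could be expressed as a data structure: CSG-style solids carry the exterior shape, while BR-style faces carry topology with texture file references as attributes (the field names and values are illustrative assumptions, not the authors' schema):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CSGSolid:
    primitive: str            # e.g. "box", "prism" describing the exterior shape
    parameters: Dict[str, float]

@dataclass
class BRFace:
    vertex_ids: List[int]     # topological boundary of the face
    role: str                 # "wall" or "roof"
    texture_file: str         # texture kept in the file database, referenced here

@dataclass
class BuildingModel:
    building_id: str
    exterior: List[CSGSolid] = field(default_factory=list)  # SCSG part
    faces: List[BRFace] = field(default_factory=list)        # BR part

model = BuildingModel(
    "denver_0001",
    exterior=[CSGSolid("box", {"width": 20.0, "depth": 12.0, "height": 9.0})],
    faces=[BRFace([0, 1, 2, 3], "wall", "textures/denver_0001_wall_0.jpg")],
)
print(model.building_id, len(model.exterior), len(model.faces))
```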
NASA Technical Reports Server (NTRS)
Benson, Robert F.; Truhlik, Vladimir; Huang, Xueqin; Wang, Yongli; Bilitza, Dieter
2012-01-01
The topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. The resulting ionograms were displayed on 35 mm film for analysis by visual inspection. Each of these satellites, launched between 1962 and 1971, produced data for 10 to 20 years. A number of the original telemetry tapes from this large data set have been converted directly into digital records. Software, known as the Topside Ionogram Scalar With True-Height (TOPIST) algorithm, has been produced and used for the automatic inversion of the ionogram reflection traces on more than 100,000 ISIS-2 digital topside ionograms into topside vertical electron density profiles Ne(h). Here we present some topside ionospheric solar cycle variations deduced from the TOPIST database to illustrate the scientific benefit of improving and expanding the topside ionospheric Ne(h) database. The profile improvements will be based on improvements in the TOPIST software motivated by direct comparisons between TOPIST profiles and profiles produced by manual scaling in the early days of the ISIS program. The database expansion will be based on new software designed to overcome limitations in the original digital topside ionogram database caused by difficulties encountered during the analog-to-digital conversion process in the detection of the ionogram frame sync pulse and/or the frequency markers. This improved and expanded TOPIST topside Ne(h) database will greatly enhance investigations into both short- and long-term ionospheric changes, e.g., the observed topside ionospheric responses to magnetic storms, induced by interplanetary magnetic clouds, and solar cycle variations, respectively.
Enhanced digital mapping project : final report
DOT National Transportation Integrated Search
2004-11-19
The Enhanced Digital Map Project (EDMap) was a three-year effort launched in April 2001 to develop a range of digital map database enhancements that enable or improve the performance of driver assistance systems currently under development or conside...
ERIC Educational Resources Information Center
Raths, David
2008-01-01
With the widespread digitization of art, photography, and music, plus the introduction of streaming video, many colleges and universities are realizing that they must develop or purchase systems to preserve their school's digitized objects; that they must create searchable databases so that researchers can find and share copies of digital files;…
Digital Initiatives and Metadata Use in Thailand
ERIC Educational Resources Information Center
SuKantarat, Wichada
2008-01-01
Purpose: This paper aims to provide information about various digital initiatives in libraries in Thailand and especially use of Dublin Core metadata in cataloguing digitized objects in academic and government digital databases. Design/methodology/approach: The author began researching metadata use in Thailand in 2003 and 2004 while on sabbatical…
Digital map databases in support of avionic display systems
NASA Astrophysics Data System (ADS)
Trenchard, Michael E.; Lohrenz, Maura C.; Rosche, Henry, III; Wischow, Perry B.
1991-08-01
The emergence of computerized mission planning systems (MPS) and airborne digital moving map systems (DMS) has necessitated the development of a global database of raster aeronautical chart data specifically designed for input to these systems. The Naval Oceanographic and Atmospheric Research Laboratory's (NOARL) Map Data Formatting Facility (MDFF) is presently dedicated to supporting these avionic display systems with the development of the Compressed Aeronautical Chart (CAC) database on Compact Disc Read-Only Memory (CD-ROM) optical discs. The MDFF is also developing a series of aircraft-specific Write-Once Read-Many (WORM) optical discs. NOARL has initiated a comprehensive research program aimed at improving the pilots' moving map displays; current research efforts include the development of an alternate image compression technique and the generation of a standard set of color palettes. The CAC database will provide digital aeronautical chart data at six different scales. CAC is derived from the Defense Mapping Agency's (DMA) Equal Arc-second (ARC) Digitized Raster Graphics (ADRG), a series of scanned aeronautical charts. NOARL processes ADRG to tailor the chart image resolution to that of the DMS display while reducing storage requirements through image compression techniques. CAC is being distributed by DMA as a library of CD-ROMs.
NASA Astrophysics Data System (ADS)
Ferrière, Ludovic; Steinwender, Christian
2014-05-01
The Natural History Museum Vienna (NHMV) owns one of the largest building, decorative, and ornamental stones collections in Europe. This important collection dates back to the 19th century and was initiated by curator Felix Karrer after a donation of the "Union-Baugesellschaft" (Karrer, 1892). It contains rock samples used for the construction of most of the famous buildings and monuments in Vienna and in the entire Austria and surrounding countries, as well as from other famous constructions and antique (Egyptian, Greek, Roman, etc.) monuments in the world. Decorative stones that were used for the inside parts of buildings as well as artificial materials, such as stucco, tiles, and building-materials like gravel, are also part of this collection. Unfortunately, most specimens of this collection cannot be displayed at the NHMV (i.e., only 500 specimens are visible in the display Hall I) and are therefore preserved in storage rooms, and not accessible to the public. The main objective of our project of digitalization is to share our rock collection and all treasures it contains with the large majority of interested persons, and especially to provide knowledge on these rocks for people who need this information, such as people who work in cultural, architectural, scientific, and commercial fields. So far 4,500 samples from our collection have been processed with the support of the Open Up! project (Opening up the Natural History Heritage for Europeana). Our database contains all information available on these samples (including e.g., the name of the rock, locality, historic use, heritage utilization, etc.), high-quality digital photographs (with both top and bottom sides of the samples), and scanned labels (both "old" NHMV labels and other (original) labels attached to the samples). We plan to achieve the full digitalization of our unique collection within the next two years and to develop a website to provide access to the content of our database (if adequate funding is obtained), - a unique way to show the public these thousands of objects, and also to facilitate the work of architects and stone/building conservation workers who need information about these (historical) stones. This website would be a bridge between our "museum and scientific expertise" and people from the "culture, renovation, and architecture" fields. We see our database & website as a crossover media between "geology" and the numerous fields in which rocks are used, knowing that we will provide information on where these stones were/are excavated as well as the place where they were/are used. Such type of information is also of interest for historians and archaeologists. References: Karrer F. 1892. Führer durch die Baumaterial-Sammlung des k. k. naturhistorischen Hofmuseums in Wien. Verlag von R. Lechner's k. u. k. Hof- und Univ.-Buchhandlung, Wien. 304p.
Psychometric Properties of a Digital Citizenship Questionnaire
ERIC Educational Resources Information Center
Nordin, Mohamad Sahari; Ahmad, Tunku Badariah Tunku; Zubairi, Ainol Madziah; Ismail, Nik Ahmad Hisham; Rahman, Abdul Hamid Abdul; Trayek, Fuad A. A.; Ibrahim, Mohd Burhan
2016-01-01
The purpose of this study was twofold, i.e. to examine the extent to which students' self-reported use of digital technology constituted meaningful and interpretable dimensions of the digital citizenship construct, and to test the adequacy of the construct in terms of its reliability, convergent validity, discriminant validity, and measurement…
NASA Astrophysics Data System (ADS)
Wang, Bo
2018-04-01
Based on digitized information and networks, the digital campus is an integration of teaching, management, science and research, life services, and technology services, and it is one of the current mainstream forms of campus construction. Against the background of developing a "mobile computing" centred digital environment, this paper explores the design and implementation of a multi-system management technology for multimedia classrooms on the digital campus and demonstrates the technical advantages of such a management system.
Large image microscope array for the compilation of multimodality whole organ image databases.
Namati, Eman; De Ryk, Jessica; Thiesse, Jacqueline; Towfic, Zaid; Hoffman, Eric; Mclennan, Geoffrey
2007-11-01
Three-dimensional, structural and functional digital image databases have many applications in education, research, and clinical medicine. However, to date, apart from cryosectioning, there have been no reliable means to obtain whole-organ, spatially conserving histology. Our aim was to generate a system capable of acquiring high-resolution images, featuring microscopic detail that could still be spatially correlated to the whole organ. Fulfilling these objectives required the construction of a system physically capable of creating very fine whole-organ sections and collecting high-magnification, high-resolution digital images. We therefore designed a large image microscope array (LIMA) to serially section and image entire unembedded organs while maintaining the structural integrity of the tissue. The LIMA consists of several integrated components: a novel large-blade vibrating microtome, a 1.3-megapixel Peltier-cooled charge-coupled device camera, a high-magnification microscope, and a three-axis gantry above the microtome. A custom control program was developed to automate the entire sectioning and automated raster-scan imaging sequence. The system is capable of sectioning unembedded soft tissue down to a thickness of 40 µm at specimen dimensions of 200 x 300 mm to a total depth of 350 mm. The LIMA system has been tested on fixed lung from sheep and mice, resulting in large high-quality image data sets, with minimal distinguishable disturbance in the delicate alveolar structures. Copyright 2007 Wiley-Liss, Inc.
Mavrikakis, I; Mantas, J; Diomidous, M
2007-01-01
This paper is based on research into the possible structure of an information system for occupational health and safety management. We initiated a questionnaire in order to gauge the interest of potential users in occupational health and safety. Depicting this potential interest is vital both for the software analysis cycle and for development according to previous models. Evaluation of the results is intended to lead to pilot applications among different enterprises. Documentation and process improvements, assured quality of services, operational support, and occupational health and safety advice form the basis of these applications. Communication and codified information among interested parties on health issues is the other target of the survey; computer networks can offer such services. The network will consist of nodes responsible for informing executives on occupational health and safety. A web database has been installed for inserting and searching documents. Submitting files to a server and answering questionnaires through the web help the experts perform their activities. Based on the requirements of enterprises, we constructed a web file server to which files are submitted so that users can retrieve the files they need. Access is limited to authorized users, and digital watermarks authenticate and protect the digital objects.
Open Clients for Distributed Databases
NASA Astrophysics Data System (ADS)
Chayes, D. N.; Arko, R. A.
2001-12-01
We are actively developing a collection of open source example clients that demonstrate the use of our "back end" data management infrastructure. The data management system is reported elsewhere at this meeting (Arko and Chayes: A Scaleable Database Infrastructure). In addition to their primary goal of being examples for others to build upon, some of these clients may have limited utility in themselves. More information about the clients and the data infrastructure is available online at http://data.ldeo.columbia.edu. The examples to be demonstrated include several web-based clients, among them those developed for the Community Review System of the Digital Library for Earth System Education, a real-time watch stander's log book, an offline interface for using log book entries, and a simple client for searching multibeam metadata; these and others are Internet-enabled, generally web-based front ends that support searches against one or more relational databases using industry-standard SQL queries. In addition to the web-based clients, simple SQL searches from within Excel and similar applications will be demonstrated. By defining, documenting, and publishing a clear interface to the fully searchable databases, it becomes relatively easy to construct client interfaces that are optimized for specific applications, in comparison to building a monolithic data and user interface system.
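A minimal stand-in for one such client: the production front ends issue industry-standard SQL against the back-end relational databases, so a script-level equivalent is just a parameterized query. The table and column names below are hypothetical, and a local SQLite file substitutes for the actual servers:

```python
import sqlite3

# Hypothetical local copy of a multibeam metadata table.
conn = sqlite3.connect("multibeam_metadata.db")
cursor = conn.execute(
    "SELECT survey_id, ship, start_date FROM multibeam_surveys "
    "WHERE min_longitude >= ? AND max_longitude <= ?",
    (-75.0, -60.0),
)
for survey_id, ship, start_date in cursor:
    print(survey_id, ship, start_date)
conn.close()
```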
Ginseng Genome Database: an open-access platform for genomics of Panax ginseng.
Jayakodi, Murukarthick; Choi, Beom-Soon; Lee, Sang-Choon; Kim, Nam-Hoon; Park, Jee Young; Jang, Woojong; Lakshmanan, Meiyappan; Mohan, Shobhana V G; Lee, Dong-Yup; Yang, Tae-Jin
2018-04-12
Ginseng (Panax ginseng C.A. Meyer) is a perennial herbaceous plant that has been used in traditional oriental medicine for thousands of years. Ginsenosides, which have significant pharmacological effects on human health, are the foremost bioactive constituents of this plant. Given the importance of this plant to humans, an integrated omics resource is indispensable to facilitate genomic research, molecular breeding and pharmacological study of this herb. The first draft genome sequences of P. ginseng cultivar "Chunpoong" were reported recently. Here, using the draft genome, transcriptome, and functional annotation datasets of P. ginseng, we have constructed the Ginseng Genome Database, http://ginsengdb.snu.ac.kr/, the first open-access platform to provide comprehensive genomic resources of P. ginseng. The current version of this database provides the most up-to-date draft genome sequence (approximately 3000 Mbp of scaffold sequences) along with the structural and functional annotations for 59,352 genes and digital expression of genes based on transcriptome data from different tissues, growth stages and treatments. In addition, tools for visualization and the genomic data from various analyses are provided. All data in the database were manually curated and integrated within a user-friendly query page. This database provides valuable resources for a range of research fields related to P. ginseng and other species belonging to the Apiales order as well as for plant research communities in general. The Ginseng Genome Database can be accessed at http://ginsengdb.snu.ac.kr/.
Benford's Law for Quality Assurance of Manner of Death Counts in Small and Large Databases.
Daniels, Jeremy; Caetano, Samantha-Jo; Huyer, Dirk; Stephen, Andrew; Fernandes, John; Lytwyn, Alice; Hoppe, Fred M
2017-09-01
The aim was to assess whether Benford's law, a mathematical law used for quality assurance in accounting, can be applied as a quality assurance measure for manner of death determination. We examined a regional forensic pathology service's monthly manner of death counts (N = 2352) from 2011 to 2013, and provincial monthly and weekly death counts from 2009 to 2013 (N = 81,831). We tested whether each dataset's leading digit followed Benford's law via the chi-square test. For each database, we assessed whether the digit 1 was the most common leading digit. The first digit of the manner of death counts followed Benford's law in all three datasets. Two of the three datasets had 1 as the most frequent leading digit. The manner of death data in this study showed qualities consistent with Benford's law. The law has potential as a quality assurance metric in manner of death determination for both small and large databases. © 2017 American Academy of Forensic Sciences.
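A minimal sketch of the test described above: Benford's law predicts that a count has leading digit d with probability log10(1 + 1/d), and a chi-square goodness-of-fit test compares observed leading-digit tallies against that expectation. The tallies below are hypothetical, not the study's data, and SciPy is assumed available:

```python
import math
from scipy.stats import chisquare

# Hypothetical leading-digit tallies for digits 1..9.
observed = [311, 176, 124, 97, 79, 67, 58, 51, 46]
total = sum(observed)

# Benford's law: P(leading digit = d) = log10(1 + 1/d); the probabilities sum to 1.
expected = [total * math.log10(1 + 1 / d) for d in range(1, 10)]

stat, p_value = chisquare(observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
```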
Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng
2018-01-01
In recent years, many studies have focused on applying advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides visualization throughout construction lifecycle management. This paper integrates BIM and WSN into a single system that enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Many wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model highlights the region, and an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications. PMID:29393887
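A minimal sketch of the threshold-checking logic such a monitoring loop implies: a sensor reading is compared against limits, and any exceedance is flagged so the visual interface can highlight the region and trigger ventilation. The parameter names and limit values are assumptions for illustration only:

```python
# Hypothetical alert thresholds for one class of sensor node.
THRESHOLDS = {"co_ppm": 35.0, "temperature_c": 40.0, "humidity_pct": 90.0}

def check_reading(node_id: str, reading: dict) -> list:
    """Return the list of exceeded parameters so the region can be highlighted and ventilated."""
    alerts = [name for name, limit in THRESHOLDS.items() if reading.get(name, 0.0) > limit]
    if alerts:
        print(f"node {node_id}: abnormal status, start alarm/ventilator for {alerts}")
    return alerts

check_reading("B2-017", {"co_ppm": 48.2, "temperature_c": 31.5, "humidity_pct": 72.0})
```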
NASA Astrophysics Data System (ADS)
Pennington, Catherine; Dashwood, Claire; Freeborough, Katy
2014-05-01
The National Landslide Database has been developed by the British Geological Survey (BGS) and is the focus for national geohazard research for landslides in Great Britain. The history and structure of the geospatial database and associated Geographical Information System (GIS) are explained, along with the future developments of the database and its applications. The database is the most extensive source of information on landslides in Great Britain with over 16,500 records of landslide events, each documented as fully as possible. Data are gathered through a range of procedures, including: incorporation of other databases; automated trawling of current and historical scientific literature and media reports; new field- and desk-based mapping technologies with digital data capture, and crowd-sourcing information through social media and other online resources. This information is invaluable for the investigation, prevention and mitigation of areas of unstable ground in accordance with Government planning policy guidelines. The national landslide susceptibility map (GeoSure) and a national landslide domain map currently under development rely heavily on the information contained within the landslide database. Assessing susceptibility to landsliding requires knowledge of the distribution of failures and an understanding of causative factors and their spatial distribution, whilst understanding the frequency and types of landsliding present is integral to modelling how rainfall will influence the stability of a region. Communication of landslide data through the Natural Hazard Partnership (NHP) contributes to national hazard mitigation and disaster risk reduction with respect to weather and climate. Daily reports of landslide potential are published by BGS through the NHP and data collected for the National Landslide Database is used widely for the creation of these assessments. The National Landslide Database is freely available via an online GIS and is used by a variety of stakeholders for research purposes.
Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David
2010-05-01
Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.
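A minimal sketch of the kind of file-naming algorithm and metadata assignment a DAM workflow could use; the naming pattern, field names, and values below are hypothetical illustrations, not the authors' scheme:

```python
import json
from datetime import date
from pathlib import Path

def asset_name(patient_id: str, session: date, view: str, seq: int) -> str:
    """Hypothetical naming algorithm: patient, session date, view, and sequence number."""
    return f"{patient_id}_{session:%Y%m%d}_{view}_{seq:02d}.jpg"

name = asset_name("PT0421", date(2010, 3, 9), "oblique", 1)

# Sidecar metadata file so the image remains searchable after export from the camera.
Path(name).with_suffix(".json").write_text(json.dumps({
    "patient_id": "PT0421",
    "view": "oblique",
    "stage": "preoperative",        # assumed metadata field
    "keywords": ["rhinoplasty"],    # assumed metadata field
}, indent=2))
print(name)
```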
Internet-based information system of digital geological data providing
NASA Astrophysics Data System (ADS)
Yuon, Egor; Soukhanov, Mikhail; Markov, Kirill
2015-04-01
One of the tasks of the Russian Federal Agency for Mineral Resources is to provide the geological information obtained during federally funded field operations, and this information should be presented in an up-to-date, properly conditioned form. Previously, the leading ways of presenting geological information were paper geological maps, sections, borehole diagrams, reports, etc. Technologies for database construction, including distributed databases, distributed information-analytical systems and Internet technologies are now developing intensively, yet most geological organizations create their own information systems without any possibility of integration into other systems of the same orientation. In 2012, specialists of VNIIgeosystem together with specialists of VSEGEI started a large project: creating a system for providing digital geological materials using modern and prospective Internet technologies. The system is based on a web server and a set of special programs that allow users to efficiently obtain rasterized and vectorized geological materials. These materials include geological maps at scale 1:1M, geological maps at scales 1:200 000 and 1:2 500 000, fragments of seamless 1:1M geological maps, structural zoning maps inside the seamless fragments, the legends for State geological maps at 1:200 000 and 1:1 000 000, the full author's set of maps, and current materials for the international projects «Atlas of geological maps for Circumpolar Arctic scale 1:5 000 000» and «Atlas of Geologic maps of central Asia and adjacent areas scale 1:2 500 000». The most interesting and functional block of the system is the block providing structured and well-formalized geological vector materials, based on the Gosgeolkart database (NGKIS), managed by Oracle; Internet access is supported by the NGKIS web subsystem, currently based on the MGS-Framework platform developed by VNIIgeosystem. One of the leading elements is the web service, which handles the interaction of all parts of the system and controls the entire path of a request from the user to the database and back, adapted to the GeoSciML and EarthResourceML views. The experience of creating this Internet-based information system for providing digital geological data, together with previous work including development of the NGKIS web service, shows that a technological implementation presenting Russian geological-cartographical data using international standards is possible. Some difficulties may arise during implementation, associated with the depth of the geological material: the Russian geological information model is deeper and broader than foreign ones. This is the main problem of using international standards and formats, since Russian geological data can only be presented with reduced detail. The problem becomes less important if the service also publishes Russian vocabularies not tied to international vocabularies; in this case the international format can serve as an interchange format between Russian users. Integration into international projects requires developing correlation schemes between Russian and foreign classifiers and vocabularies.
ERIC Educational Resources Information Center
Williamson, Ben
2016-01-01
Educational institutions and governing practices are increasingly augmented with digital database technologies that function as new kinds of policy instruments. This article surveys and maps the landscape of digital policy instrumentation in education and provides two detailed case studies of new digital data systems. The Learning Curve is a…
BIM and IoT: A Synopsis from GIS Perspective
NASA Astrophysics Data System (ADS)
Isikdag, U.
2015-10-01
The Internet of Things (IoT) focuses on enabling communication between all devices and things, whether they exist in real life or are virtual. Building Information Models (BIMs) and Building Information Modelling have been the buzzwords of the construction industry for the last 15 years. BIMs emerged from a push by software companies to tackle the problems of inefficient information exchange between different software packages and to enable true interoperability. In the BIM approach, the most up-to-date and accurate models of a building are stored in shared central databases during the design and construction of a project and at post-construction stages. GIS-based city monitoring and city management applications require the fusion of information acquired from multiple resources: BIMs, city models, and sensors. This paper focuses on providing a method for facilitating the GIS-based fusion of information residing in digital building "Models" with information acquired from city objects, i.e. "Things". Once this information fusion is accomplished, many fields, ranging from emergency response, urban surveillance, and urban monitoring to smart buildings, stand to benefit.
[A computer-aided image diagnosis and study system].
Li, Zhangyong; Xie, Zhengxiang
2004-08-01
The revolution in information processing, particularly the digitization of medicine, has changed medical study, work, and management. This paper reports a method for designing a system for computer-aided image diagnosis and study. Combining ideas from graph-text systems and picture archiving and communication systems (PACS), the system was implemented and used for "prescription through computer", "managing images", and "reading images under computer and assisting diagnosis". Typical examples were also compiled in a database and used to teach beginners. The system was developed with visual development tools based on object-oriented programming (OOP) and put into operation on the Windows 9X platform. The system has a friendly man-machine interface.
3D Textured Modelling of both Exterior and Interior of Korean Styled Architectures
NASA Astrophysics Data System (ADS)
Lee, J.-D.; Bhang, K.-J.; Schuhr, W.
2017-08-01
This paper describes the 3D modelling procedure for two Korean styled architectures, performed through a series of processing steps on data acquired with a terrestrial laser scanner. These two case projects illustrate the use of the terrestrial laser scanner as a digital documentation tool for the management, conservation and restoration of cultural assets. We show an approach to automating the reconstruction of both the outside and inside models of a building from laser scanning data. Laser scanning technology is much more efficient than existing photogrammetry in measuring shape and constructing a spatial database for the preservation and restoration of cultural assets, as well as for deformation monitoring and safety diagnosis of structures.
Geologic map of the eastern part of the Challis National Forest and vicinity, Idaho
Wilson, A.B.; Skipp, B.A.
1994-01-01
The paper version of the Geologic Map of the eastern part of the Challis National Forest and vicinity, Idaho was compiled by Anna Wilson and Betty Skipp in 1994. The geology was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.
The use of karst geomorphology for planning, hazard avoidance and development in Great Britain
NASA Astrophysics Data System (ADS)
Cooper, Anthony H.; Farrant, Andrew R.; Price, Simon J.
2011-11-01
Within Great Britain five main types of karstic rocks - dolomite, limestone, chalk, gypsum and salt - are present. Each presents a different type and severity of karstic geohazard which are related to the rock solubility and geological setting. Typical karstic features associated with these rocks have been databased by the British Geological Survey (BGS) with records of sinkholes, cave entrances, stream sinks, resurgences and building damage; data for more than half of the country has been gathered. BGS has manipulated digital map data, for bedrock and superficial deposits, with digital elevation slope models, superficial deposit thickness models, the karst data and expertly interpreted areas, to generate a derived dataset assessing the likelihood of subsidence due to karst collapse. This dataset is informed and verified by the karst database and marketed as part of the BGS GeoSure suite. It is currently used by environmental regulators, the insurance and construction industries, and the BGS semi-automated enquiry system. The database and derived datasets can be further combined and manipulated using GIS to provide other datasets that deal with specific problems. Sustainable drainage systems, some of which use soak-aways into the ground, are being encouraged in Great Britain, but in karst areas they can cause ground stability problems. Similarly, open loop ground source heat or cooling pump systems may induce subsidence if installed in certain types of karstic environments such as in chalk with overlying sand deposits. Groundwater abstraction also has the potential to trigger subsidence in karst areas. GIS manipulation of the karst information is allowing Great Britain to be zoned into areas suitable, or unsuitable, for such uses; it has the potential to become part of a suite of planning management tools for local and National Government to assess the long term sustainable use of the ground.
A Medical Image Backup Architecture Based on a NoSQL Database and Cloud Computing Services.
Santos Simões de Almeida, Luan Henrique; Costa Oliveira, Marcelo
2015-01-01
The use of digital systems for storing medical images generates a huge volume of data. Digital images are commonly stored and managed on a Picture Archiving and Communication System (PACS), under the DICOM standard. However, PACS is limited because it is strongly dependent on the server's physical storage space. Alternatively, cloud computing arises as an extensive, low-cost, and reconfigurable resource. However, medical images contain patient information that cannot be made available in a public cloud, so a mechanism to anonymize these images is needed. This poster presents a solution for this issue by taking digital images from PACS, converting the information contained in each image file into a NoSQL database, and using cloud computing to store the digital images.
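A minimal sketch of the anonymize-then-index idea described above, assuming the pydicom and pymongo libraries and a locally running MongoDB instance; the file paths, database names, and which tags are stripped are illustrative assumptions, not the authors' pipeline:

```python
import pydicom
from pymongo import MongoClient

# Hypothetical DICOM file exported from PACS.
ds = pydicom.dcmread("pacs_export/img0001.dcm")

# Non-identifying attributes copied into the NoSQL store for indexing.
record = {
    "modality": str(ds.get("Modality", "")),
    "study_date": str(ds.get("StudyDate", "")),
    "rows": int(ds.get("Rows", 0)),
    "columns": int(ds.get("Columns", 0)),
}

# Strip patient-identifying tags before the image leaves the local network.
ds.PatientName = "ANONYMIZED"
ds.PatientID = "00000000"
ds.save_as("img0001_anon.dcm")

MongoClient("mongodb://localhost:27017")["image_backup"]["studies"].insert_one(record)
```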
Aitken, Douglas S.
1997-01-01
This Open-File report is a digital topographic map database. It contains a digital version of the 1970 U.S. Geological Survey topographic map of the San Francisco Bay Region (3 sheets), at a scale of 1:125,000. These ARC/INFO coverages are in vector format. The vectorization process has distorted characters representing letters and numbers, as well as some road and other symbols, making them difficult to read in some instances. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The content and character of the database and methods of obtaining it are described herein.
Nicholson, Suzanne W.; Stoeser, Douglas B.; Wilson, Frederic H.; Dicken, Connie L.; Ludington, Steve
2007-01-01
The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and rock type information. Such spatial data can be conveniently used to generate derivative maps for purposes that include mineral-resource assessment, metallogenic studies, tectonic studies, human health and environmental research. In 1997, the United States Geological Survey's Mineral Resources Program initiated an effort to develop national digital databases for use in mineral resource and environmental assessments. One primary activity of this effort was to compile a national digital geologic map database, utilizing state geologic maps, to support mineral resource studies in the range of 1:250,000- to 1:1,000,000-scale. Over the course of the past decade, state databases were prepared using a common standard for the database structure, fields, attributes, and data dictionaries. As of late 2006, standardized geological map databases for all conterminous (CONUS) states have been available on-line as USGS Open-File Reports. For Alaska and Hawaii, new state maps are being prepared, and the preliminary work for Alaska is being released as a series of 1:500,000-scale regional compilations. See below for a list of all published databases.
Ito, Kei
2010-01-01
A digital brain atlas is a kind of image database that specifically provides information about neurons and glial cells in the brain. It has various advantages that are unmatched by conventional paper-based atlases. Such advantages, however, may become disadvantages if appropriate care is not taken. Because digital atlases can provide an unlimited amount of data, they should be designed to minimize redundancy and keep the records, which may be added incrementally by different staff members, consistent. The fact that digital atlases can easily be revised necessitates a system to assure that users can access previous versions that might have been cited in papers at a particular period. To pass our knowledge on to our descendants, such databases should be maintained for a very long period, well over 100 years, like printed books and papers. Technical and organizational measures to enable long-term archiving should be considered seriously. Compared with the initial development of the database, subsequent efforts to increase the quality and quantity of its contents are not regarded highly, because such tasks do not materialize in the form of publications. This fact strongly discourages continuous expansion of, and external contributions to, digital atlases after their initial launch. To solve these problems, the role of biocurators is vital. Appreciation of the scientific achievements of people who do not write papers, and establishment of a secure academic career path for them, are indispensable for recruiting talent for this very important job. PMID:20661458
Storage and distribution of pathology digital images using integrated web-based viewing systems.
Marchevsky, Alberto M; Dulbandzhyan, Ronda; Seely, Kevin; Carey, Steve; Duncan, Raymond G
2002-05-01
Health care providers have expressed increasing interest in incorporating digital images of gross pathology specimens and photomicrographs in routine pathology reports. To describe the multiple technical and logistical challenges involved in the integration of the various components needed for the development of a system for integrated Web-based viewing, storage, and distribution of digital images in a large health system. An Oracle version 8.1.6 database was developed to store, index, and deploy pathology digital photographs via our Intranet. The database allows for retrieval of images by patient demographics or by SNOMED code information. The Intranet of a large health system accessible from multiple computers located within the medical center and at distant private physician offices. The images can be viewed using any of the workstations of the health system that have authorized access to our Intranet, using a standard browser or a browser configured with an external viewer or inexpensive plug-in software, such as Prizm 2.0. The images can be printed on paper or transferred to film using a digital film recorder. Digital images can also be displayed at pathology conferences by using wireless local area network (LAN) and secure remote technologies. The standardization of technologies and the adoption of a Web interface for all our computer systems allows us to distribute digital images from a pathology database to a potentially large group of users distributed in multiple locations throughout a large medical center.
Publications - DDS 3 | Alaska Division of Geological & Geophysical Surveys
Alaska Division of Geological & Geophysical Surveys Digital Data Series 3, http://doi.org/10.14509/qff. Combellick, R.A., 2012, Quaternary faults and folds in Alaska: A digital database, 31 p., 1 sheet, 1 map of Alaska (Plafker and others, 1994), 1 p. Digital geospatial data (QFF).
A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol
ERIC Educational Resources Information Center
Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.
2006-01-01
Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…
Digitized Database of Old Seismograms Recorded in Romania
NASA Astrophysics Data System (ADS)
Paulescu, Daniel; Rogozea, Maria; Popa, Mihaela; Radulian, Mircea
2016-08-01
The aim of this paper is to describe a managing system for a unique Romanian database of historical seismograms and complementary documentation (metadata) and its dissemination and analysis procedure. For this study, 5188 historical seismograms recorded between 1903 and 1957 by the Romanian seismological observatories (Bucharest-Filaret, Focşani, Bacău, Vrincioaia, Câmpulung-Muscel, Iaşi) were used. In order to reconsider the historical instrumental data, the analog seismograms are converted to digital images and digital waveforms (digitization/vectorisation). First, we applied a careful scanning procedure to the seismograms and related material (seismic bulletins, station books, etc.). In the next step, the high-resolution scanned seismograms will be processed to obtain the digital/numeric waveforms. We used a Colortrac Smartlf Cx40 scanner which provides images in TIFF or JPG format. For digitization, the algorithm Teseo2, developed by the National Institute of Geophysics and Volcanology in Rome (Italy) within the framework of the SISMOS Project, will be used.
Geometric processing of digital images of the planets
NASA Technical Reports Server (NTRS)
Edwards, Kathleen
1987-01-01
New procedures and software have been developed for geometric transformation of images to support digital cartography of the planets. The procedures involve the correction of spacecraft camera orientation of each image with the use of ground control and the transformation of each image to a Sinusoidal Equal-Area map projection with an algorithm which allows the number of transformation calculations to vary as the distortion varies within the image. When the distortion is low in an area of an image, few transformation computations are required, and most pixels can be interpolated. When distortion is extreme, the location of each pixel is computed. Mosaics are made of these images and stored as digital databases. Completed Sinusoidal databases may be used for digital analysis and registration with other spatial data. They may also be reproduced as published image maps by digitally transforming them to appropriate map projections.
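The Sinusoidal Equal-Area (Sanson-Flamsteed) projection used for these mosaics has a simple closed form. The sketch below is illustrative only, not the NTRS software itself; the unit-sphere radius and example coordinates are assumptions. It shows the forward mapping that would be applied to a pixel's latitude/longitude when resampling into the projected database.

```python
import numpy as np

def sinusoidal_forward(lat_deg, lon_deg, lon0_deg=0.0, radius=1.0):
    """Forward sinusoidal (Sanson-Flamsteed) equal-area projection.

    Returns map coordinates (x, y) for geographic coordinates given in
    degrees: x = R*(lon - lon0)*cos(lat), y = R*lat, with angles in radians.
    """
    lat = np.radians(lat_deg)
    lon = np.radians(lon_deg)
    lon0 = np.radians(lon0_deg)
    x = radius * (lon - lon0) * np.cos(lat)
    y = radius * lat
    return x, y

# Example: project a few points east of an assumed central meridian.
lats = np.array([0.0, 30.0, 60.0])
lons = np.array([10.0, 10.0, 10.0])
print(np.column_stack(sinusoidal_forward(lats, lons)))
```

Because the x-scale shrinks with cos(lat), the distortion (and hence the density of transformation calculations needed) varies across the image, which is the behaviour the adaptive algorithm described above exploits.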
NASA Astrophysics Data System (ADS)
Angel, V.; Garvey, A.; Sydor, M.
2017-08-01
In the face of changing economies and patterns of development, the definition of heritage is diversifying, and the role of inventories in local heritage planning is coming to the fore. The Durand neighbourhood is a layered and complex area located in inner-city Hamilton, Ontario, Canada, and the second subject area in a set of pilot inventory studies to develop a new city-wide inventory strategy for the City of Hamilton. This paper presents an innovative digital workflow developed to undertake the Durand Built Heritage Inventory project. An online database was developed to be at the centre of all processes, including digital documentation, record management, analysis and variable outputs. Digital tools were employed for survey work in the field and analytical work in the office, resulting in a GIS-based dataset that can be integrated into Hamilton's larger municipal planning system. Together with digital mapping and digitized historical resources, the Durand database has been leveraged to produce both digital and static outputs to shape recommendations for the protection of Hamilton's heritage resources.
The thinking of Cloud computing in the digital construction of the oil companies
NASA Astrophysics Data System (ADS)
CaoLei, Qizhilin; Dengsheng, Lei
To speed up the digital construction of oil companies and enhance productivity and decision-support capabilities, while avoiding the waste and duplicated development and investment of earlier digitalization efforts, this paper presents a cloud-based model for the digital construction of oil companies. In this model, national oil companies join their cloud data and service-center equipment into a single cloud system over a private network; each department can then provision its own virtual service center according to its needs, providing strong services and computing power for the oil companies.
[Scientific significance and prospective application of digitized virtual human].
Zhong, Shi-zhen
2003-03-01
As a cutting-edge research project, digitization of human anatomical information combines conventional medicine with information technology, computer technology, and virtual reality technology. Recent years have seen the establishment of, or ongoing efforts to establish, various virtual human models in many countries, on the basis of continuous sections of the human body that are digitized by means of computational medicine incorporating information technology to quantitatively simulate human physiological and pathological conditions, and to provide wide prospective applications in the fields of medicine and other disciplines. This article addresses four issues concerning progress in virtual human model research: (1) Worldwide survey of sectioning and modeling of visible humans. The American Visible Human database, completed in 1994, contains both a male and a female dataset and has found wide application internationally. South Korea also finished the data collection for a male Visible Korean Human dataset in 2000. (2) Application of the dataset of the Visible Human Project (VHP). This dataset has yielded plentiful fruits in medical education and clinical research, and further plans are proposed and practiced to construct a Physical Human and a Physiological Human. (3) Scientific significance and prospects of virtual human studies. Digitized human datasets may eventually contribute to the development of many new high-tech industries. (4) Progress of the Virtual Chinese Human project. The 174th session of the Xiangshang Science Conferences, held in 2001, marked the initiation of the digitized virtual human project in China, and some key techniques have been explored. By now, the data-collection process for 4 Chinese virtual human datasets has been successfully completed.
NASA Technical Reports Server (NTRS)
Djorgovski, S. George
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful, and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of a package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications, and has produced real, published results.
Trustworthy History and Provenance for Files and Databases
ERIC Educational Resources Information Center
Hasan, Ragib
2009-01-01
In today's world, information is increasingly created, processed, transmitted, and stored digitally. While the digital nature of information has brought enormous benefits, it has also created new vulnerabilities and attacks against data. Unlike physical documents, digitally stored information can be rapidly copied, erased, or modified. The…
Digitized Archival Primary Sources in STEM: A Selected Webliography
ERIC Educational Resources Information Center
Jankowski, Amy
2017-01-01
Accessibility and findability of digitized archival resources can be a challenge, particularly for students or researchers not familiar with archival formats and digital interfaces, which adhere to different descriptive standards than more widely familiar library resources. Numerous aggregate archival collection databases exist, which provide a…
A Digital Library in the Mid-Nineties, Ahead or On Schedule?
ERIC Educational Resources Information Center
Dijkstra, Joost
1994-01-01
Discussion of the future possibilities of digital library systems highlights digital projects developed at Tilburg University (Netherlands). Topics addressed include online access to databases; electronic document delivery; agreements between libraries and Elsevier Science publishers to provide journal articles; full text document delivery; and…
Online Digital Special Collections: Welcome to the Online Digital Special Collections of interest to the Department of Transportation (DOT).
Bedford, David R.; Ludington, Steve; Nutt, Constance M.; Stone, Paul A.; Miller, David M.; Miller, Robert J.; Wagner, David L.; Saucedo, George J.
2003-01-01
The USGS is creating an integrated national database for digital state geologic maps that includes stratigraphic, age, and lithologic information. The majority of the conterminous 48 states have digital geologic base maps available, often at scales of 1:500,000. This product is a prototype, and is intended to demonstrate the types of derivative maps that will be possible with the national integrated database. This database permits the creation of a number of types of maps via simple or sophisticated queries, maps that may be useful in a number of areas, including mineral-resource assessment, environmental assessment, and regional tectonic evolution. This database is distributed in three main parts: a Microsoft Access 2000 database containing geologic map attribute data, an Arc/Info (Environmental Systems Research Institute, Redlands, California) Export format file containing points representing designation of stratigraphic regions for the Geologic Map of Utah, and an ArcView 3.2 (Environmental Systems Research Institute, Redlands, California) project containing scripts and dialogs for performing a series of generalization and mineral resource queries. IMPORTANT NOTE: Spatial data for the respective state geologic maps are not distributed with this report. The digital state geologic maps for the states involved in this report are separate products, and two of them are produced by individual state agencies, which may be legally and/or financially responsible for this data. However, the spatial datasets for maps discussed in this report are available to the public. Questions regarding the distribution, sale, and use of individual state geologic maps should be sent to the respective state agency. We do provide suggestions for obtaining and formatting the spatial data to make it compatible with data in this report. See section ‘Obtaining and Formatting Spatial Data’ in the PDF version of the report.
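As a rough illustration of the kind of simple attribute query such a database supports, the following Python/SQLite sketch uses a hypothetical, simplified schema (the table name, column names, and example rows are assumptions, not the actual Access 2000 schema) to select map units by age range and lithology, the sort of operation behind a derivative map.

```python
import sqlite3

# Hypothetical, simplified stand-in for the geologic-map attribute database.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE map_units (
    unit_id TEXT, state TEXT, age_min_ma REAL, age_max_ma REAL,
    lithology TEXT)""")
conn.executemany(
    "INSERT INTO map_units VALUES (?, ?, ?, ?, ?)",
    [("Ku", "UT", 66.0, 100.5, "sandstone"),
     ("Ti", "UT", 23.0, 66.0, "granite"),
     ("Qa", "UT", 0.0, 2.6, "alluvium")])

# Derivative-map style query: units at least partly Mesozoic (66-252 Ma)
# and of selected clastic sedimentary lithologies.
rows = conn.execute("""
    SELECT unit_id, lithology FROM map_units
    WHERE age_max_ma > 66.0 AND age_min_ma < 252.0
      AND lithology IN ('sandstone', 'shale', 'conglomerate')""").fetchall()
print(rows)  # -> [('Ku', 'sandstone')]
```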
ERIC Educational Resources Information Center
Tzoc, Elias; Ubbes, Valerie A.
2017-01-01
In 2013, the Center for Digital Scholarship at Miami University was established and coincided with the redesign of the Children's Picture Book Database, which had a successful web presence for nearly 20 years. We developed the Digital Literacy Partnership (DLP) website project in order to upgrade the project to Omeka as a new digital management…
Changing State Digital Libraries
ERIC Educational Resources Information Center
Pappas, Marjorie L.
2006-01-01
Research has shown that state virtual or digital libraries are evolving into websites that are loaded with free resources, subscription databases, and instructional tools. In this article, the author explores these evolving libraries based on the following questions: (1) How user-friendly are the state digital libraries?; (2) How do state digital…
Geologic Communications | Alaska Division of Geological & Geophysical
Improves a database for the Division's digital and map-based geological, geophysical, and geochemical data interfaces. DGGS metadata and digital data distribution: geospatial datasets published by DGGS are designed to be compatible with a broad variety of digital mapping software and to present DGGS's geospatial data.
Dibblee, T. W.; Digital database compiled by Graham, S. E.; Mahony, T.M.; Blissenbach, J.L.; Mariant, J.J.; Wentworth, C.M.
1999-01-01
This Open-File Report is a digital geologic map database. The report serves to introduce and describe the digital data. There is no paper map included in the Open-File Report. The report includes PostScript and PDF plot files that can be used to plot images of the geologic map sheet and explanation sheet. This digital map database is prepared from a previously published map by Dibblee (1973). The geologic map database delineates map units that are identified by general age, lithology, and clast size following the stratigraphic nomenclature of the U.S. Geological Survey. For descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult the explanation sheet (of99-14_4b.ps or of99-14_4d.pdf), or the original published paper map (Dibblee, 1973). The scale of the source map limits the spatial resolution (scale) of the database to 1:125,000 or smaller. For those interested in the geology of Carrizo Plain and vicinity who do not use an ARC/INFO compatible Geographic Information System (GIS), but would like to obtain a paper map and explanation, PDF and PostScript plot files containing map images of the data in the digital database, as well as PostScript and PDF plot files of the explanation sheet and explanatory text, have been included in the database package (please see the section 'Digital Plot Files', page 5). The PostScript plot files require a gzip utility to access them. For those without computer capability, we can provide users with the PostScript or PDF files on tape that can be taken to a vendor for plotting. Paper plots can also be ordered directly from the USGS (please see the section 'Obtaining Plots from USGS Open-File Services', page 5). The content and character of the database, methods of obtaining it, and processes of extracting the map database from the tar (tape archive) file are described herein. The map database itself, consisting of six ARC/INFO coverages, can be obtained over the Internet or by magnetic tape copy as described below. The database was compiled using ARC/INFO, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991, Fitzgibbon, 1991, Wentworth and Fitzgibbon, 1991). The ARC/INFO coverages are stored in uncompressed ARC export format (ARC/INFO version 7.x). All data files have been compressed, and may be uncompressed with gzip, which is available free of charge over the Internet via links from the USGS Public Domain Software page (http://edcwww.cr.usgs.gov/doc/edchome/ndcdb/public.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView.
Tsao, Liuxing; Ma, Liang
2016-11-01
Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
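The forward-kinematics step described above can be illustrated with a minimal planar sketch. This is not the authors' hand model; the single-digit chain, segment lengths, and joint angles below are assumed example values chosen only to show how segment poses follow from cumulative joint rotations.

```python
import numpy as np

def finger_forward_kinematics(segment_lengths, joint_angles_deg):
    """Planar forward kinematics for a single digit.

    Each segment rotates about the distal end of the previous one; angles
    are cumulative flexion angles in degrees.  Returns the joint positions,
    starting at the origin (the estimated centre of rotation of the most
    proximal joint).
    """
    positions = [np.zeros(2)]
    angle = 0.0
    for length, theta in zip(segment_lengths, joint_angles_deg):
        angle += np.radians(theta)
        step = length * np.array([np.cos(angle), np.sin(angle)])
        positions.append(positions[-1] + step)
    return np.array(positions)

# Example posture: proximal, middle and distal phalanges (assumed lengths in
# mm), flexed 20, 30 and 15 degrees respectively.
print(finger_forward_kinematics([45.0, 25.0, 18.0], [20.0, 30.0, 15.0]))
```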
NASA Astrophysics Data System (ADS)
Streiter, R.; Wanielik, G.
2013-07-01
The construction of highways and federal roadways is subject to many restrictions and design rules. The focus is on safety, comfort and smooth driving. Unfortunately, the planning information for roadways and their real constitution, course, number of lanes and lane widths is often uncertain or not available. Because digital map databases of roads have attracted much interest during the last years and have become one major cornerstone of innovative Advanced Driving Assistance Systems (ADASs), the demand for accurate and detailed road information has increased considerably. Within this project a measurement system for collecting highly accurate road data was developed. This paper gives an overview of the sensor configuration within the measurement vehicle, introduces the implemented algorithms and shows some applications implemented in the post-processing platform. The aim is to recover the original parametric description of the roadway, and the performance of the measurement system is evaluated against original road construction information.
CHIP Demonstrator: Semantics-Driven Recommendations and Museum Tour Generation
NASA Astrophysics Data System (ADS)
Aroyo, Lora; Stash, Natalia; Wang, Yiwen; Gorgels, Peter; Rutledge, Lloyd
The main objective of the CHIP project is to demonstrate how Semantic Web technologies can be deployed to provide personalized access to digital museum collections. We illustrate our approach with the digital database ARIA of the Rijksmuseum Amsterdam. For the semantic enrichment of the Rijksmuseum ARIA database we collaborated with the CATCH STITCH project to produce mappings to Iconclass, and with the MultimediaN E-culture project to produce the RDF/OWL of the ARIA and Adlib databases. The main focus of CHIP is on exploring the potential of applying adaptation techniques to provide personalized experience for the museum visitors both on the Web site and in the museum.
Oh, Chang-Seok; Kim, Kyong-Jee; Chung, Eilho; Choi, Hong-Joo
2015-04-01
Digital report (DR), a new method for students' dissection report, has been introduced to replace the traditional method in the anatomy laboratory. Laboratory tasks were assigned to groups of five students, and each group was asked to make a DR of their dissection tasks and upload it on the website for the anatomy course developed by the authors' institution. For creating the DR, students were instructed to take photographs of their findings with digital cameras, to mark the orientation and label the structures on the photograph. Students were assessed as a group by evaluating the DR. All the photographs of the DR were saved to construct a database that can be used by the students who will take the anatomy course in the following years. A questionnaire consisting of 14 questions was administered at the end of the anatomy course to evaluate the effectiveness of the DR. The results of the student survey showed that the DR was useful for making the students participate more actively in the teamwork for dissection, and for making dissection reports by referring to the DR made by the students from previous years. The DR was also more helpful for the anatomy teacher to assess student learning in the anatomy laboratory than conventional practical examinations and paper-based dissection reports. DR, a paperless report of team-based dissection, is concurrent with the 'digital' age and is in line with the need for a more systematic and objective evaluation of students' dissection.
LDEF meteoroid and debris database
NASA Technical Reports Server (NTRS)
Dardano, C. B.; See, Thomas H.; Zolensky, Michael E.
1994-01-01
The Long Duration Exposure Facility (LDEF) Meteoroid and Debris Special Investigation Group (M&D SIG) database is maintained at the Johnson Space Center (JSC), Houston, Texas, and consists of five data tables containing information about individual features, digitized images of selected features, and LDEF hardware (i.e., approximately 950 samples) archived at JSC. About 4000 penetrations (greater than 300 microns in diameter) and craters (greater than 500 microns in diameter) were identified and photodocumented during the disassembly of LDEF at the Kennedy Space Center (KSC), while an additional 4500 or so have subsequently been characterized at JSC. The database also contains some data that have been submitted by various PIs, but the amount of such data is extremely limited, and investigators are encouraged to submit any and all M&D-type data to JSC for inclusion within the M&D database. Digitized stereo-image pairs are available for approximately 4500 features through the database.
eCTG: an automatic procedure to extract digital cardiotocographic signals from digital images.
Sbrollini, Agnese; Agostinelli, Angela; Marcantoni, Ilaria; Morettini, Micaela; Burattini, Luca; Di Nardo, Francesco; Fioretti, Sandro; Burattini, Laura
2018-03-01
Cardiotocography (CTG), consisting of the simultaneous recording of fetal heart rate (FHR) and maternal uterine contractions (UC), is a popular clinical test to assess fetal health status. Typically, CTG machines provide paper reports that are visually interpreted by clinicians. Consequently, visual CTG interpretation depends on the clinician's experience and has poor reproducibility. The lack of databases containing digital CTG signals has limited the number and importance of retrospective studies aimed at setting up procedures for automatic CTG analysis that could counter the subjectivity of visual CTG interpretation. To help overcome this problem, this study proposes an electronic procedure, termed eCTG, to extract digital CTG signals from digital CTG images, possibly obtainable by scanning paper CTG reports. eCTG was specifically designed to extract digital CTG signals from digital CTG images. It includes four main steps: pre-processing, Otsu's global thresholding, signal extraction and signal calibration. Its validation was performed by means of the "CTU-UHB Intrapartum Cardiotocography Database" by Physionet, which contains digital signals of 552 CTG recordings. Using MATLAB, each signal was plotted and saved as a digital image that was then submitted to eCTG. Digital CTG signals extracted by eCTG were eventually compared to corresponding signals directly available in the database. Comparison occurred in terms of signal similarity (evaluated by the correlation coefficient ρ and the mean signal error MSE) and clinical features (including FHR baseline and variability; number, amplitude and duration of tachycardia, bradycardia, acceleration and deceleration episodes; number of early, variable, late and prolonged decelerations; and UC number, amplitude, duration and period). The value of ρ between eCTG and reference signals was 0.85 (P < 10^-560) for FHR and 0.97 (P < 10^-560) for UC. On average, the MSE value was 0.00 for both FHR and UC. No CTG feature was found significantly different when measured in eCTG vs. reference signals. The eCTG procedure is a promising tool to accurately extract digital FHR and UC signals from digital CTG images. Copyright © 2018 Elsevier B.V. All rights reserved.
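A minimal sketch of two of the named steps, Otsu's global thresholding and column-wise signal extraction, is given below. It is not the eCTG implementation; the synthetic test image and the mean-row extraction rule are assumptions made only to illustrate the general idea of turning a scanned chart trace back into a signal.

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's global threshold on an 8-bit grayscale image (2-D uint8 array)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

def extract_trace(gray):
    """Column-wise extraction of a dark trace from a scanned chart image."""
    t = otsu_threshold(gray)
    mask = gray < t                      # ink pixels are darker than paper
    rows = np.arange(gray.shape[0])[:, None]
    counts = mask.sum(axis=0)
    # Mean row index of ink pixels per column; NaN where no ink was found.
    return np.where(counts > 0,
                    (rows * mask).sum(axis=0) / np.maximum(counts, 1),
                    np.nan)

# Synthetic example: a sinusoidal dark trace on a light background.
img = np.full((200, 300), 230, dtype=np.uint8)
cols = np.arange(300)
img[(100 + 60 * np.sin(cols / 25.0)).astype(int), cols] = 20
print(extract_trace(img)[:5])
```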
The use of a personal digital assistant for wireless entry of data into a database via the Internet.
Fowler, D L; Hogle, N J; Martini, F; Roh, M S
2002-01-01
Researchers typically record data on a worksheet and at some later time enter it into the database. Wireless data entry and retrieval using a personal digital assistant (PDA) at the site of patient contact can simplify this process and improve efficiency. A surgeon and a nurse coordinator provided the content for the database. The computer programmer created the database, placed the pages of the database on the PDA screen, and researched and installed security measures. Designing the database took 6 months. Meeting Health Insurance Portability and Accountability Act of 1996 (HIPAA) requirements for patient confidentiality, satisfying institutional Information Services requirements, and ensuring connectivity required an additional 8 months before the functional system was complete. It is now possible to achieve wireless entry and retrieval of data using a PDA. Potential advantages include collection and entry of data at the same time, easy entry of data from multiple sites, and retrieval of data at the patient's bedside.
Hydrologic data for an investigation of the Smith River Watershed through water year 2010
Nilges, Hannah L.; Caldwell, Rodney R.
2012-01-01
Hydrologic data collected through water year 2010 and compiled as part of a U.S. Geological Survey study of the water resources of the Smith River watershed in west-central Montana are presented in this report. Tabulated data presented in this report were collected at 173 wells and 65 surface-water sites. Figures include location maps of data-collection sites and hydrographs of streamflow. Digital data files used to construct the figures, hydrographs, and data tables are included in the report. Data collected by the USGS are also stored in the USGS National Water Information System database and are available through the USGS National Water Information System Water Data for Montana Web page at http://waterdata.usgs.gov/mt/nwis/.
Ahmadi, Farshid Farnood; Ebadi, Hamid
2009-01-01
3D spatial data acquired from aerial and remote sensing images by photogrammetric techniques are one of the most accurate and economical data sources for GIS, map production, and spatial data updating. However, there are still many problems concerning storage, structuring and appropriate management of spatial data obtained using these techniques. Given the capabilities of spatial database management systems (SDBMSs), direct integration of photogrammetric systems and spatial database management systems can save the time and cost of producing and updating digital maps. This integration is accomplished by replacing digital maps with a single spatial database. Applying spatial databases overcomes the problem of managing spatial and attribute data in a coupled approach; this management approach is one of the main problems in GISs when using map products of photogrammetric workstations. These integrated systems also make it possible to provide structured spatial data, based on OGC (Open GIS Consortium) standards and topological relations between different feature classes, at the time of the feature digitizing process. In this paper, the integration of photogrammetric systems and SDBMSs is evaluated. Then, different levels of integration are described. Finally, the design, implementation and testing of a software package called Integrated Photogrammetric and Oracle Spatial Systems (IPOSS) is presented.
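The idea of writing digitized features directly into a single spatial database, rather than into separate map files, can be sketched as follows. This is not the IPOSS package; the table layout and the use of well-known text (WKT) strings in SQLite are assumptions standing in for a true SDBMS such as Oracle Spatial, which would use a native geometry type.

```python
import sqlite3

# Minimal sketch: digitized features go straight into one spatial-attribute
# table instead of separate map files.  Geometry is stored as WKT text for
# illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE features (
    feature_id INTEGER PRIMARY KEY,
    feature_class TEXT,          -- e.g. 'building', 'road'
    geometry_wkt TEXT,           -- geometry captured at digitizing time
    source_image TEXT)""")

def insert_digitized_feature(feature_class, coords, source_image):
    """Store a polygon feature measured on a photogrammetric workstation."""
    ring = ", ".join(f"{x} {y}" for x, y in coords + [coords[0]])
    conn.execute(
        "INSERT INTO features (feature_class, geometry_wkt, source_image) "
        "VALUES (?, ?, ?)",
        (feature_class, f"POLYGON(({ring}))", source_image))

insert_digitized_feature(
    "building",
    [(100.0, 200.0), (120.0, 200.0), (120.0, 215.0), (100.0, 215.0)],
    "strip3_image42.tif")  # hypothetical source image name
print(conn.execute("SELECT feature_class, geometry_wkt FROM features").fetchone())
```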
Remote vibration monitoring system using wireless internet data transfer
NASA Astrophysics Data System (ADS)
Lemke, John
2000-06-01
Vibrations from construction activities can affect infrastructure projects in several ways. Within the general vicinity of a construction site, vibrations can result in damage to existing structures, disturbance to people, damage to sensitive machinery, and degraded performance of precision instrumentation or motion sensitive equipment. Current practice for monitoring vibrations in the vicinity of construction sites commonly consists of measuring free field or structural motions using velocity transducers connected to a portable data acquisition unit via cables. This paper describes an innovative way to collect, process, transmit, and analyze vibration measurements obtained at construction sites. The system described measures vibration at the sensor location, performs necessary signal conditioning and digitization, and sends data to a Web server using wireless data transmission and Internet protocols. A Servlet program running on the Web server accepts the transmitted data and incorporates it into a project database. Two-way interaction between the Web-client and the Web server is accomplished through the use of a Servlet program and a Java Applet running inside a browser located on the Web client's computer. Advantages of this system over conventional vibration data logging systems include continuous unattended monitoring, reduced costs associated with field data collection, instant access to data files and graphs by project team members, and the ability to remotely modify data sampling schemes.
NASA Astrophysics Data System (ADS)
Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.
2015-12-01
The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and demonstrates the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g. permeability and porosity) at each node of any arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.
Construction of digital core by adaptive porosity method
NASA Astrophysics Data System (ADS)
Xia, Huifen; Liu, Ting; Zhao, Ling; Sun, Yanyu; Pan, Junliang
2017-05-01
The construction of digital cores has unique advantages in the study of water flooding or polymer flooding oil displacement efficiency. The frequency distribution of pore size was measured by a mercury injection experiment, the coordination number by a CT scanning method, and the wettability data by imbibition displacement. On the basis of the pore-throat ratio and wettability, the principle of adaptive porosity is used to construct the digital core. The results show that the water flooding recovery, the polymer flooding recovery degree, and the results of the physical simulation experiment are in good agreement.
NASA Astrophysics Data System (ADS)
Howe, Michael
2014-05-01
Much of the digital geological information on the composition, properties and dynamics of the subsurface is based ultimately on physical samples, many of which are archived to provide a basis for the information. Online metadata catalogues of these collections have now been available for many years. Many of these are institutional and tightly focussed, with UK examples including the British Geological Survey's (BGS) palaeontological samples database, PalaeoSaurus (http://www.bgs.ac.uk/palaeosaurus/), and mineralogical and petrological sample database, Britrocks (http://www.bgs.ac.uk/data/britrocks.html). There are now a growing number of international sample metadata databases, including The Palaeobiology Database (http://paleobiodb.org/) and SESAR, the IGSN (International Geo Sample Number) database (http://www.geosamples.org/catalogsearch/). More recently the emphasis has moved beyond metadata (locality, identification, age, citations, etc.) to digital imagery, with the intention of providing the user with at least enough information to determine whether viewing the sample would be worthwhile. Recent BGS examples include high resolution (e.g. 7216 x 5412 pixel) hydrocarbon well core images (http://www.bgs.ac.uk/data/offshoreWells/wells.cfc?method=searchWells), high resolution rock thin section images (e.g. http://www.largeimages.bgs.ac.uk/iip/britrocks.html?id=290000/291739) and building stone images (http://geoscenic.bgs.ac.uk/asset-bank/action/browseItems?categoryId=1547&categoryTypeId=1). This has been developed further with high resolution stereo images. The Jisc funded GB3D type fossils online project delivers these as red-cyan anaglyphs (http://www.3d-fossils.ac.uk/). More innovatively, the GB3D type fossils project has laser scanned several thousand type fossils and the resulting 3d-digital models are now being delivered through the online portal. Importantly, this project also represents collaboration between the BGS, Oxford and Cambridge Universities, the National Museums of Wales, and numerous other national, local and regional museums. The lack of currently accepted international standards and infrastructures for the delivery of high resolution images and 3d-digital models has required the BGS to develop or select its own. Most high resolution images have been delivered using the JPEG 2000 format because of its quality and speed. Digital models have been made available in both .PLY and .OBJ format because of their respectively efficient file size and flexibility. Consideration must now be given to European and international standards and infrastructures for the delivery of high resolution images and 3d-digital models.
Shaping the Curriculum: The Power of a Library's Digital Resources
ERIC Educational Resources Information Center
Kirkwood, Patricia
2011-01-01
Researchers were the first adopters of digital resources available through the library. Online journals and databases make finding research articles much easier than when this author started as a librarian more than 20 years ago. Speedier interlibrary loan due to digital delivery means research materials are never far away. Making it easier for…
A Digital Library for Education: The PEN-DOR Project.
ERIC Educational Resources Information Center
Fullerton, Karen; Greenberg, Jane; McClure, Maureen; Rasmussen, Edie; Stewart, Darin
1999-01-01
Describes Pen-DOR (Pennsylvania Education Network Digital Object Repository), a digital library designed to provide K-12 educators with access to multimedia resources and tools to create new lesson plans and modify existing ones via the World Wide Web. Discusses design problems of a distributed, object-oriented database architecture and describes…
::: American Indians of the Pacific Northwest Collection :::
University of Washington Libraries Digital Collections: learn from the images and writings of the time... This site provides an extensive digital collection; its digital databases include over 2,300 original photographs as well as over 1,500 pages from the Annual
Digital Construction and Characterization of Noise-like Spread Spectrum Signals
2016-11-01
Donald C. Buzanowski II, Frederick J. Block, Thomas C. Royster, MIT Lincoln Laboratory, Lexington, MA 02420. Abstract: A new method for generating digital noise-like spread spectrum signals is proposed. A standard binary... employing signals that are noise-like (e.g., [1]). Direct sequence spread spectrum (DSSS) signals provide benefits such as protection against jamming, low...
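As background for the DSSS signals mentioned above, a toy spreading/despreading sketch is shown below. The random ±1 chip code, code length, and BPSK mapping are assumptions for illustration only and do not reproduce the paper's noise-like construction method.

```python
import numpy as np

def dsss_spread(data_bits, chips_per_bit=31, seed=0):
    """Toy direct-sequence spreading: each data bit is multiplied by a
    pseudorandom +/-1 chip sequence, giving a noise-like baseband signal."""
    rng = np.random.default_rng(seed)
    chips = rng.choice([-1, 1], size=chips_per_bit)      # spreading code
    symbols = 2 * np.asarray(data_bits) - 1              # {0,1} -> {-1,+1}
    return np.repeat(symbols, chips_per_bit) * np.tile(chips, len(symbols))

def dsss_despread(signal, chips_per_bit=31, seed=0):
    """Correlate against the same code to recover the data bits."""
    rng = np.random.default_rng(seed)
    chips = rng.choice([-1, 1], size=chips_per_bit)
    blocks = signal.reshape(-1, chips_per_bit)
    return (blocks @ chips > 0).astype(int)

bits = [1, 0, 1, 1, 0]
tx = dsss_spread(bits)
print(dsss_despread(tx))        # -> [1 0 1 1 0]
```

The correlation gain in the despreading step is what provides the jamming protection and low detectability referred to in the abstract.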
Data Services Upgrade: Perfecting the ISIS-I Topside Digital Ionogram Database
NASA Technical Reports Server (NTRS)
Wang, Yongli; Benson, Robert F.; Bilitza, Dieter; Fung, Shing. F.; Chu, Philip; Huang, Xueqin; Truhlik, Vladimir
2015-01-01
The ionospheric topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. More than 16,000 of the original telemetry tapes from three satellites were used to produce topside digital ionograms, via an analog-to-digital (A/D) conversion process, suitable for modern analysis techniques. Unfortunately, many of the resulting digital topside ionogram files could not be auto-processed to produce topside Ne(h) profiles because of problems encountered during the A/D process. Software has been written to resolve these problems and here we report on (1) the first application of this software to a significant portion of the ISIS-1 digital topside-ionogram database, (2) software improvements motivated by this activity, (3) Ne(h) profiles automatically produced from these corrected ISIS-1 digital ionogram files, and (4) the availability via the Virtual Wave Observatory (VWO) of the corrected ISIS-1 digital topside ionogram files for research. We will also demonstrate the use of these Ne(h) profiles for making refinements in the International Reference Ionosphere (IRI) and in the determination of transition heights from oxygen ions to hydrogen ions.
Content Independence in Multimedia Databases.
ERIC Educational Resources Information Center
de Vries, Arjen P.
2001-01-01
Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…
Tufts Health Sciences Database: Lessons, Issues, and Opportunities.
ERIC Educational Resources Information Center
Lee, Mary Y.; Albright, Susan A.; Alkasab, Tarik; Damassa, David A.; Wang, Paul J.; Eaton, Elizabeth K.
2003-01-01
Describes a seven-year experience with developing the Tufts Health Sciences Database, a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. Identifies major effects on teaching and learning. Also addresses issues of faculty development, copyright and…
Computer Storage and Retrieval of Position-Dependent Data.
1982-06-01
This thesis covers the design of a new digital database system to replace the merged (observation and geographic location) record, one file per cruise... ("The Digital Data Library System: Library Storage and Retrieval of Digital Geophysical Data" by Robert C. Groan) provided a relatively simple... position-dependent, 'geophysical' data. The system is operational on a Digital Equipment Corporation VAX-11/780 computer. Values of measured and computed...
Digital methods for the history of psychology: Introduction and resources.
Fox Lee, Shayna
2016-02-01
At the York University Digital History of Psychology Laboratory, we have been working on projects that explore what digital methodologies have to offer historical research in our field. This piece provides perspective on the history and theory of digital history, as well as introductory resources for those who are curious about incorporating these methods into their own work. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
12 digit Hydrologic Units (HUCs) for EPA Region 2 and surrounding states (Northeastern states, parts of the Great Lakes, Puerto Rico and the USVI) downloaded from the Natural Resources Conservation Service (NRCS) Geospatial Gateway and imported into the EPA Region 2 Oracle/SDE database. This layer reflects 2009 updates to the national Watershed Boundary Database (WBD) that included new boundary data for New York and New Jersey.
Wilshire, Howard G.; Bedford, David R.; Coleman, Teresa
2002-01-01
3. Plottable map representations of the database at 1:24,000 scale in PostScript and Adobe PDF formats. The plottable files consist of a color geologic map derived from the spatial database, composited with a topographic base map in the form of the USGS Digital Raster Graphic for the map area. Color symbology from each of these datasets is maintained, which can cause plot file sizes to be large.
Optimal Path Planning Program for Autonomous Speed Sprayer in Orchard Using Order-Picking Algorithm
NASA Astrophysics Data System (ADS)
Park, T. S.; Park, S. J.; Hwang, K. Y.; Cho, S. I.
This study was conducted to develop a software program which computes an optimal path for autonomous navigation in an orchard, especially for a speed sprayer. The possibility of autonomous navigation in orchards has been shown by other research that minimized the distance error between the planned path and the performed path. However, research on planning an optimal path for a speed sprayer in an orchard is hard to find. In this study, a digital map and a database for the orchard were designed, containing GPS coordinate information (coordinates of trees and the boundary of the orchard) and entity information (heights and widths of trees, radius of the main stem of trees, disease of trees). An order-picking algorithm, which has been used for warehouse management, was used to calculate the optimum path based on the digital map. The database for the digital map was created using Microsoft Access, and the graphic interface for the database was made using Microsoft Visual C++ 6.0. It was possible to search and display information about the boundary of an orchard, the locations of trees, and the daily plan for scattering chemicals, and to plan an optimal path for different orchards based on the digital map under each circumstance (starting the speed sprayer in a different location, scattering chemicals for only selected trees).
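The abstract does not detail the order-picking algorithm itself, so the sketch below substitutes a simple greedy nearest-neighbour ordering over assumed tree coordinates to illustrate how a visiting order for selected trees might be computed from the digital map; it is not the authors' implementation.

```python
import math

def nearest_neighbour_route(start, targets):
    """Greedy visiting order over selected tree coordinates.

    A simple stand-in for the warehouse order-picking algorithm referenced
    above: from the sprayer's starting point, always move to the closest
    unvisited tree.
    """
    route, remaining, current = [], list(targets), start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

# Trees selected for chemical scattering (assumed local x, y coords, metres).
trees = [(2.0, 8.0), (10.0, 3.0), (4.0, 1.0), (9.0, 9.0)]
print(nearest_neighbour_route(start=(0.0, 0.0), targets=trees))
```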
(abstract) Synthesis of Speaker Facial Movements to Match Selected Speech Sequences
NASA Technical Reports Server (NTRS)
Scott, Kenneth C.
1994-01-01
We are developing a system for synthesizing image sequences that simulate the facial motion of a speaker. To perform this synthesis, we are pursuing two major areas of effort. We are developing the necessary computer graphics technology to synthesize a realistic image sequence of a person speaking selected speech sequences. Next, we are developing a model that expresses the relation between spoken phonemes and face/mouth shape. A subject is videotaped speaking an arbitrary text that contains expression of the full list of desired database phonemes. The subject is videotaped from the front speaking normally, recording both audio and video detail simultaneously. Using the audio track, we identify the specific video frames on the tape relating to each spoken phoneme. From this range we digitize the video frame which represents the extreme of mouth motion/shape. Thus, we construct a database of images of face/mouth shape related to spoken phonemes. A selected audio speech sequence is recorded which is the basis for synthesizing a matching video sequence; the speaker need not be the same as used for constructing the database. The audio sequence is analyzed to determine the spoken phoneme sequence and the relative timing of the enunciation of those phonemes. Synthesizing an image sequence corresponding to the spoken phoneme sequence is accomplished using a graphics technique known as morphing. Image sequence keyframes necessary for this processing are based on the spoken phoneme sequence and timing. We have been successful in synthesizing the facial motion of a native English speaker for a small set of arbitrary speech segments. Our future work will focus on advancement of the face shape/phoneme model and independent control of facial features.
47 CFR 15.713 - TV bands database.
Code of Federal Regulations, 2011 CFR
2011-10-01
... authorized services operating in the TV bands. In addition, a TV bands database must also verify that the FCC identifier (FCC ID) of a device seeking access to its services is valid; under this requirement the TV bands... information will come from the official Commission database. These services include: (i) Digital television...
ERIC Educational Resources Information Center
Young, Terrence E., Jr.
2004-01-01
Today's elementary school students have been exposed to computers since birth, so it is not surprising that they are so proficient at using them. As a result, they are ready to search databases that include topics and information appropriate for their age level. Subscription databases are digital copies of magazines, newspapers, journals,…
Construction of Database for Pulsating Variable Stars
NASA Astrophysics Data System (ADS)
Chen, B. Q.; Yang, M.; Jiang, B. W.
2011-07-01
A database for pulsating variable stars is constructed for Chinese astronomers to study variable stars conveniently. At present, the database includes about 230,000 variable stars in the Galactic bulge, LMC and SMC observed by the MACHO (MAssive Compact Halo Objects) and OGLE (Optical Gravitational Lensing Experiment) projects. The software used for the construction is LAMP, i.e., Linux+Apache+MySQL+PHP. A web page is provided to search the photometric data and the light curve in the database through the right ascension and declination of the object. More data will be incorporated into the database.
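A position search of the kind exposed by the web page can be sketched as a simple box query. The schema, example rows, and use of in-memory SQLite (instead of the MySQL/PHP stack actually used) are assumptions for illustration only.

```python
import sqlite3

# Stand-in table for the variable-star catalogue; column names are assumed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stars (star_id TEXT, ra_deg REAL, dec_deg REAL, survey TEXT)")
conn.executemany("INSERT INTO stars VALUES (?, ?, ?, ?)",
                 [("OGLE-LMC-CEP-0001", 75.12, -69.34, "OGLE"),
                  ("MACHO-80.6468.77", 76.01, -69.50, "MACHO"),
                  ("OGLE-SMC-RRLYR-0042", 13.20, -72.90, "OGLE")])

def box_search(ra, dec, radius_deg=0.5):
    """Return stars within a simple RA/Dec box around the query position."""
    return conn.execute(
        """SELECT star_id, ra_deg, dec_deg FROM stars
           WHERE ra_deg BETWEEN ? AND ? AND dec_deg BETWEEN ? AND ?""",
        (ra - radius_deg, ra + radius_deg,
         dec - radius_deg, dec + radius_deg)).fetchall()

print(box_search(ra=75.3, dec=-69.4, radius_deg=1.0))
```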
Database of well and areal data, South San Francisco Bay and Peninsula area, California
Leighton, D.A.; Fio, J.L.; Metzger, L.F.
1995-01-01
A database was developed to organize and manage data compiled for a regional assessment of geohydrologic and water-quality conditions in the south San Francisco Bay and Peninsula area in California. Available data provided by local, State, and Federal agencies and private consultants were utilized in the assessment. The database consists of geographic information system data layers and related tables and American Standard Code for Information Interchange files. Documentation of the database is necessary to avoid misinterpretation of the data and to make users aware of potential errors and limitations. Most of the data compiled were collected from wells and boreholes (collectively referred to as wells in this report). These point-specific data, including construction, water-level, water-quality, pumping test, and lithologic data, are contained in tables and files that are related to a geographic information system data layer that contains the locations of the wells. There are 1,014 wells in the data layer and the related tables contain 35,845 water-level measurements (from 293 of the wells) and 9,292 water-quality samples (from 394 of the wells). Calculation of hydraulic heads and gradients from the water levels can be affected adversely by errors in the determination of the altitude of land surface at the well. Cation and anion balance computations performed on 396 of the water-quality samples indicate high cation and anion balance errors for 51 (13 percent) of the samples. Well drillers' reports were interpreted for 762 of the wells, and digital representations of the lithology of the formations are contained in files following the American Standard Code for Information Interchange. The usefulness of drillers' descriptions of the formation lithology is affected by the detail and thoroughness of the drillers' descriptions, as well as the knowledge, experience, and vocabulary of the individual who described the drill cuttings. Additional data layers were created that contain political, geohydrologic, and other geographic data. These layers contain features represented by areas and lines rather than discrete points. The layers consist of data representing the thickness of alluvium, surficial geology, physiographic subareas, watershed boundaries, land use, water-supply districts, wastewater treatment districts, and recharge basins. The layers were created by manually digitizing paper maps, by acquiring data already in digital form, or by deriving new layers from available layers. The scale of the source data affects the accurate representation of real-world features with the data layer, and, therefore, the scale of the source data must be considered when the data are analyzed and plotted.
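The cation-anion balance check mentioned above is conventionally expressed as a percent charge-balance error over summed milliequivalent concentrations. The sketch below shows that standard formula; the report's exact acceptance threshold is not stated here, and the example concentrations are assumed.

```python
def charge_balance_error(cations_meq, anions_meq):
    """Percent charge-balance error from summed cation and anion
    concentrations in milliequivalents per litre:

        CBE = 100 * (sum(cations) - sum(anions)) / (sum(cations) + sum(anions))

    Values of a few percent are commonly treated as acceptable; larger
    values flag an unreliable analysis.
    """
    c, a = sum(cations_meq), sum(anions_meq)
    return 100.0 * (c - a) / (c + a)

# Example sample: Ca, Mg, Na, K vs. HCO3, SO4, Cl (assumed values, meq/L).
print(round(charge_balance_error([3.2, 1.1, 2.0, 0.1], [4.0, 1.5, 1.2]), 2))
```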
Academic Knowledge Construction and Multimodal Curriculum Development
ERIC Educational Resources Information Center
Loveless, Douglas J., Ed.; Griffith, Bryant, Ed.; Bérci, Margaret E., Ed.; Ortlieb, Evan, Ed.; Sullivan, Pamela, Ed.
2014-01-01
While incorporating digital technologies into the classroom has offered new ways of teaching and learning into educational processes, it is essential to take a look at how the digital shift impacts teachers, school administration, and curriculum development. "Academic Knowledge Construction and Multimodal Curriculum Development" presents…
7 CFR 1740.9 - Grant application.
Code of Federal Regulations, 2011 CFR
2011-01-01
... authorized the initiation of digital broadcasting at the project sites. In the event that an FCC construction... advance of funds for that site conditional upon the submission of a construction permit. (j) Compliance... Digital Transition,” and must include the Environmental Questionnaire/Certification, available from RUS...
Wells, Erica L; Kofler, Michael J; Soto, Elia F; Schaefer, Hillary S; Sarver, Dustin E
2018-01-01
Pediatric ADHD is associated with impairments in working memory, but these deficits often go undetected when using clinic-based tests such as digit span backward. The current study pilot-tested minor administration/scoring modifications to improve digit span backward's construct and predictive validities in a well-characterized sample of children with ADHD. WISC-IV digit span was modified to administer all trials (i.e., ignoring the discontinue rule) and count digits rather than trials correct. Traditional and modified scores were compared to a battery of criterion working memory (construct validity) and academic achievement tests (predictive validity) for 34 children with ADHD ages 8-13 (M=10.41; 11 girls). Traditional digit span backward scores failed to predict working memory or KTEA-2 achievement (all ns). Alternate administration/scoring of digit span backward significantly improved its associations with working memory reordering (r=.58), working memory dual-processing (r=.53), working memory updating (r=.28), and KTEA-2 achievement (r=.49). Consistent with prior work, these findings urge caution when interpreting digit span performance. Minor test modifications may address test validity concerns, and should be considered in future test revisions. Digit span backward becomes a valid measure of working memory at exactly the point that testing is traditionally discontinued. Copyright © 2017 Elsevier Ltd. All rights reserved.
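A minimal sketch of the modified scoring idea (administer every trial, credit digits rather than whole trials) is given below. The per-digit rule used here, crediting a digit recalled in the correct serial position of the reversed sequence, is an assumption for illustration, not the authors' exact scoring procedure.

```python
def digits_correct(stimulus, response):
    """Number of digits recalled in the correct serial position (assumed
    per-digit credit rule) for a single backward-span trial."""
    target = stimulus[::-1]                 # backward span: reverse the stimulus
    return sum(1 for t, r in zip(target, response) if t == r)

def modified_backward_score(trials):
    """Sum digit credit over ALL administered trials (no discontinue rule)."""
    return sum(digits_correct(stim, resp) for stim, resp in trials)

trials = [("52", "25"),        # fully correct: 2 digits
          ("714", "417"),      # fully correct: 3 digits
          ("8529", "9205")]    # partially correct: 2 digits in position
print(modified_backward_score(trials))     # -> 7
```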
Seasonal land-cover regions of the United States
Loveland, Thomas R.; Merchant, James W.; Brown, Jesslyn F.; Ohlen, Donald O.; Reed, Bradley C.; Olson, Paul; Hutchinson, John
1995-01-01
Global-change investigations have been hindered by deficiencies in the availability and quality of land-cover data. The U.S. Geological Survey and the University of Nebraska-Lincoln have collaborated on the development of a new approach to land-cover characterization that attempts to address requirements of the global-change research community and others interested in regional patterns of land cover. An experimental 1-kilometer-resolution database of land-cover characteristics for the coterminous U.S. has been prepared to test and evaluate the approach. Using multidate Advanced Very High Resolution Radiometer (AVHRR) satellite data complemented by elevation, climate, ecoregions, and other digital spatial datasets, the authors define 152 seasonal land-cover regions. The regionalization is based on a taxonomy of areas with respect to data on land cover, seasonality or phenology, and relative levels of primary production. The resulting database consists of descriptions of the vegetation, land cover, and seasonal, spectral, and site characteristics for each region. These data are used in the construction of an illustrative 1:7,500,000-scale map of the seasonal land-cover regions as well as of smaller-scale maps portraying general land cover and seasonality. The seasonal land-cover characteristics database can also be tailored to provide a broad range of other landscape parameters useful in national and global-scale environmental modeling and assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bocharova, N.Yu.; Scotese, C.R.; Pristavakina, E.I.
A digital geographic database for the former USSR was compiled using published geologic and geodynamic maps and the unpublished suture map of Lev Zonenshain (1991). The database includes more than 900 tectonic features: strike-slip faults, sutures, thrusts, fossil and active rifts, fossil and active subduction zones, boundaries of the major and minor Precambrian blocks, ophiolites, and various volcanic complexes. The attributes of each structural unit include type of structure, name, age, tectonic setting and geographical coordinates. Paleozoic and Early Mesozoic reconstructions of the former USSR and adjacent regions were constructed using this tectonic database together with paleomagnetic data and the motions of the continents over fixed hot spots. Global apparent polar wander paths in European and Siberian coordinates were calculated back to Cambrian time, using the paleomagnetic pole summaries of Van der Voo (1992) and Khramov (1992) and the global plate tectonic model of the Paleomap Project (Scotese and Becker, 1992). Trajectories of intraplate volcanics in South Siberia, Mongolia, Scandinavia and data on the White Mountain plutons and Karoo flood basalts were also taken into account. Using new data, the authors recalculated the stage and finite poles for the rotation of Siberia and Europe with respect to the hot spot reference frame for the time interval 160 to 450 Ma.
Tomato Expression Database (TED): a suite of data presentation and analysis tools
Fei, Zhangjun; Tang, Xuemei; Alba, Rob; Giovannoni, James
2006-01-01
The Tomato Expression Database (TED) includes three integrated components. The Tomato Microarray Data Warehouse serves as a central repository for raw gene expression data derived from the public tomato cDNA microarray. In addition to expression data, TED stores experimental design and array information in compliance with the MIAME guidelines and provides web interfaces for researchers to retrieve data for their own analysis and use. The Tomato Microarray Expression Database contains normalized and processed microarray data for ten time points with nine pair-wise comparisons during fruit development and ripening in a normal tomato variety and nearly isogenic single gene mutants impacting fruit development and ripening. Finally, the Tomato Digital Expression Database contains raw and normalized digital expression (EST abundance) data derived from analysis of the complete public tomato EST collection containing >150,000 ESTs derived from 27 different non-normalized EST libraries. This last component also includes tools for the comparison of tomato and Arabidopsis digital expression data. A set of query interfaces and analysis and visualization tools have been developed and incorporated into TED, which aid users in identifying and deciphering biologically important information from our datasets. TED can be accessed at http://ted.bti.cornell.edu. PMID:16381976
ERIC Educational Resources Information Center
Nygren, Thomas; Vikström, Lotta
2013-01-01
This article presents problems and possibilities associated with incorporating into history teaching a digital demographic database made for professional historians. We detect and discuss the outcome of how students in Swedish upper secondary schools respond to a teaching approach involving digitized registers comprising 19th century individuals…
ERIC Educational Resources Information Center
Chen, Ching-chih
1996-01-01
Summarizes how the Library of Congress' digital library collections can be accessed globally via the Internet and World Wide Web. Outlines the resources found in each of the various access points: gopher, online catalog, library and legislative Web sites, legal and copyright databases, and FTP (file transfer protocol) sites. (LAM)
ERIC Educational Resources Information Center
Nicholson, Scott
2005-01-01
Archaeologists have used material artifacts found in a physical space to gain an understanding about the people who occupied that space. Likewise, as users wander through a digital library, they leave behind data-based artifacts of their activity in the virtual space. Digital library archaeologists can gather these artifacts and employ inductive…
ERIC Educational Resources Information Center
Kumaran, Maha; Geary, Joe
2011-01-01
Technology has transformed libraries. There are digital libraries, electronic collections, online databases and catalogs, ebooks, downloadable books, and much more. With free technology such as social websites, newspaper collections, downloadable online calendars, clocks and sticky notes, online scheduling, online document sharing, and online…
Digital geologic map and GIS database of Venezuela
Garrity, Christopher P.; Hackley, Paul C.; Urbani, Franco
2006-01-01
The digital geologic map and GIS database of Venezuela captures GIS compatible geologic and hydrologic data from the 'Geologic Shaded Relief Map of Venezuela,' which was released online as U.S. Geological Survey Open-File Report 2005-1038. Digital datasets and corresponding metadata files are stored in ESRI geodatabase format; accessible via ArcGIS 9.X. Feature classes in the geodatabase include geologic unit polygons, open water polygons, coincident geologic unit linework (contacts, faults, etc.) and non-coincident geologic unit linework (folds, drainage networks, etc.). Geologic unit polygon data were attributed for age, name, and lithologic type following the Lexico Estratigrafico de Venezuela. All digital datasets were captured from source data at 1:750,000. Although users may view and analyze data at varying scales, the authors make no guarantee as to the accuracy of the data at scales larger than 1:750,000.
The role of digital cartographic data in the geosciences
Guptill, S.C.
1983-01-01
The increasing demand of the Nation's natural resource developers for the manipulation, analysis, and display of large quantities of earth-science data has necessitated the use of computers and the building of geoscience information systems. These systems require, in digital form, the spatial data on map products. The basic cartographic data shown on quadrangle maps provide a foundation for the addition of geological and geophysical data. If geoscience information systems are to realize their full potential, large amounts of digital cartographic base data must be available. A major goal of the U.S. Geological Survey is to create, maintain, manage, and distribute a national cartographic and geographic digital database. This unified database will contain numerous categories (hydrography, hypsography, land use, etc.) that, through the use of standardized data-element definitions and formats, can be used easily and flexibly to prepare cartographic products and perform geoscience analysis. © 1983.
Construction of the Database for Pulsating Variable Stars
NASA Astrophysics Data System (ADS)
Chen, Bing-Qiu; Yang, Ming; Jiang, Bi-Wei
2012-01-01
A database for pulsating variable stars has been constructed to facilitate the study of variable stars in China. The database includes about 230,000 variable stars in the Galactic bulge, LMC, and SMC, observed over a period of about 10 years by the MACHO (MAssive Compact Halo Objects) and OGLE (Optical Gravitational Lensing Experiment) projects. The software used for the construction is LAMP, i.e., Linux + Apache + MySQL + PHP. A web page is provided for searching the photometric data and light curves in the database by the right ascension and declination of an object. Because of the flexibility of this database, more up-to-date data on variable stars can be incorporated conveniently.
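A positional query of the kind the web page above performs is, at its core, a range search on right ascension and declination. The following is a minimal sketch of such a box search, assuming a hypothetical table stars(id, ra_deg, dec_deg, survey); the actual system is built on MySQL and PHP, but sqlite3 is used here only to keep the example self-contained. A true cone search would also widen the RA window by 1/cos(dec) away from the equator.

```python
import sqlite3

# Minimal sketch of a box search around a target position (assumed schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stars (id INTEGER, ra_deg REAL, dec_deg REAL, survey TEXT)")
conn.executemany("INSERT INTO stars VALUES (?, ?, ?, ?)",
                 [(1, 270.05, -29.98, "OGLE"), (2, 81.20, -69.75, "MACHO")])

def box_search(ra, dec, radius_deg=0.05):
    """Return stars within a small box centered on (ra, dec), in degrees."""
    return conn.execute(
        "SELECT id, ra_deg, dec_deg, survey FROM stars "
        "WHERE ra_deg BETWEEN ? AND ? AND dec_deg BETWEEN ? AND ?",
        (ra - radius_deg, ra + radius_deg, dec - radius_deg, dec + radius_deg),
    ).fetchall()

print(box_search(270.05, -29.98))  # matches the OGLE bulge star inserted above
```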
BAO plate archive digitization
NASA Astrophysics Data System (ADS)
Mickaelian, A. M.; Nikoghosyan, E. H.; Gigoyan, K. S.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Khachatryan, K. G.; Vardanyan, A. V.; Gyulzadyan, M. V.; Mikayelyan, G. A.; Farmanyan, S. V.; Knyazyan, A. V.
Astronomical plate archives created on the basis of numerous observations at many observatories are an important part of the astronomical heritage. The Byurakan Astrophysical Observatory (BAO) plate archive consists of 37,000 photographic plates and films obtained with the 2.6 m telescope, the 1 m and 0.5 m Schmidt telescopes, and other smaller ones during 1947-1991. In 2015, we started a project on digitization of the whole BAO Plate Archive, creation of an electronic database, and its scientific usage. A Science Program Board has been created to evaluate the observing material, to investigate new possibilities, and to propose new projects based on the combined use of these observations together with other world databases. The Executing Team consists of 11 astronomers and 2 computer scientists and will use 2 EPSON Perfection V750 Pro scanners for the digitization. The project will run for 3 years, in 2015-2017, and the final result will be an electronic database and an online interactive sky map to be used for further research projects.
ERIC Educational Resources Information Center
Chuang, Tsung-Yen; Huang, Yun-Hsuan
2015-01-01
Mobile technology has rapidly made digital games a popular entertainment to this digital generation, and thus digital game design received considerable attention in both the game industry and design education. Digital game design involves diverse dimensions in which digital game story design (DGSD) particularly attracts our interest, as the…
NASA Technical Reports Server (NTRS)
Rule, T. D.; Kim, J.; Kalkur, T. S.
1998-01-01
Boiling heat transfer is an efficient means of heat transfer because a large amount of heat can be removed from a surface using a relatively small temperature difference between the surface and the bulk liquid. However, the mechanisms that govern boiling heat transfer are not well understood. Measurements of wall temperature and heat flux near the wall would add to the database of knowledge which is necessary to understand the mechanisms of nucleate boiling. A heater array has been developed which contains 96 heater elements within a 2.5 mm square area. The temperature of each heater element is held constant by an electronic control system similar to a hot-wire anemometer. The voltage that is being applied to each heater element can be measured and digitized using a high-speed A/D converter, and this digital information can be compiled into a series of heat-flux maps. Information for up to 10,000 heat flux maps can be obtained each second. The heater control system, the A/D system and the heater array construction are described in detail. Results are presented which show that this is an effective method of measuring the local heat flux during nucleate and transition boiling. Heat flux maps are obtained for pool boiling in FC-72 on a horizontal surface. Local heat flux variations are shown to be three to six times larger than variations in the spatially averaged heat flux.
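For the constant-temperature heater technique described above, the local heat flux follows directly from the digitized voltage: at steady control, the electrical power V^2/R dissipated in an element balances the heat removed from its surface of area A. The sketch below illustrates that conversion; the resistance, element area, and 10x10 layout are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch: convert digitized heater voltages to a local heat-flux map.
# R and A below are illustrative values, not the paper's actual hardware specs.
import numpy as np

R = 1000.0             # ohms per heater element (assumed)
A = (2.5e-3 / 10)**2   # m^2, rough element area for a 2.5 mm square array (assumed)

def heat_flux_map(voltages):
    """voltages: 2D array of element voltages (V). Returns heat flux in W/m^2."""
    v = np.asarray(voltages, dtype=float)
    power = v**2 / R     # electrical power dissipated per element (W)
    return power / A     # local heat flux per element (W/m^2)

frame = np.random.uniform(2.0, 8.0, size=(10, 10))  # one digitized frame (assumed layout)
print(heat_flux_map(frame).mean())                   # spatially averaged heat flux
```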
Integrated technologies for solid waste bin monitoring system.
Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda
2011-06-01
Communication technologies such as radio frequency identification (RFID), global positioning system (GPS), general packet radio system (GPRS), and geographic information system (GIS) are integrated with a camera to construct a solid waste monitoring system. The aim is to improve the way of responding to customers' inquiries and emergency cases and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, GPRS/GSM as the web server, and GIS as the map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. The users are able to view the current location of each truck in the collection stage via a web-based application and thereby manage the fleet. The truck positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, the solid waste of the bin and the truck are monitored using the developed system.
Kosseim, Patricia; Pullman, Daryl; Perrot-Daley, Astrid; Hodgkinson, Kathy; Street, Catherine; Rahman, Proton
2013-01-01
To provide a legal and ethical analysis of some of the implementation challenges faced by the Population Therapeutics Research Group (PTRG) at Memorial University (Canada), in using genealogical information offered by individuals for its genetics research database. This paper describes the unique historical and genetic characteristics of the Newfoundland and Labrador founder population, which gave rise to the opportunity for PTRG to build the Newfoundland Genealogy Database containing digitized records of all pre-confederation (1949) census records of the Newfoundland founder population. In addition to building the database, PTRG has developed the Heritability Analytics Infrastructure, a data management structure that stores genotype, phenotype, and pedigree information in a single database, and custom linkage software (KINNECT) to perform pedigree linkages on the genealogy database. A newly adopted legal regimen in Newfoundland and Labrador is discussed. It incorporates health privacy legislation with a unique research ethics statute governing the composition and activities of research ethics boards and, for the first time in Canada, elevating the status of national research ethics guidelines into law. The discussion looks at this integration of legal and ethical principles which provides a flexible and seamless framework for balancing the privacy rights and welfare interests of individuals, families, and larger societies in the creation and use of research data infrastructures as public goods. The complementary legal and ethical frameworks that now coexist in Newfoundland and Labrador provide the legislative authority, ethical legitimacy, and practical flexibility needed to find a workable balance between privacy interests and public goods. Such an approach may also be instructive for other jurisdictions as they seek to construct and use biobanks and related research platforms for genetic research.
Evaluation of personal digital assistant drug information databases for the managed care pharmacist.
Lowry, Colleen M; Kostka-Rokosz, Maria D; McCloskey, William W
2003-01-01
Personal digital assistants (PDAs) are becoming a necessity for practicing pharmacists. They offer a time-saving and convenient way to obtain current drug information. Several software companies now offer general drug information databases for use on handheld computers. PDAs priced less than 200 US dollars often have limited memory capacity; therefore, the user must choose from a growing list of general drug information database options in order to maximize utility without exceeding memory capacity. This paper reviews the attributes of available general drug information software databases for the PDA. It provides information on the content, advantages, limitations, pricing, memory requirements, and accessibility of drug information software databases. Ten drug information databases were subjectively analyzed and evaluated based on information from the product's Web site, vendor Web sites, and from our experience. Some of these databases have attractive auxiliary features such as kinetics calculators, disease references, drug-drug and drug-herb interaction tools, and clinical guidelines, which may make them more useful to the PDA user. Not all drug information databases are equal with regard to content, author credentials, frequency of updates, and memory requirements. The user must therefore evaluate databases for completeness, currency, and cost effectiveness before purchase. In addition, consideration should be given to the ease of use and flexibility of individual programs.
Worl, R.G.; Johnson, K.M.
1995-01-01
The paper version of the Map Showing Geologic Terranes of the Hailey 1° x 2° Quadrangle and the western part of the Idaho Falls 1° x 2° Quadrangle, south-central Idaho, was compiled by Ron Worl and Kate Johnson in 1995. The plate was compiled on a 1:250,000-scale topographic base map. TechniGraphic System, Inc., of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a geographic information system database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.
OSTMED.DR®, an Osteopathic Medicine Digital Library.
Fitterling, Lori; Powers, Elaine; Vardell, Emily
2018-01-01
The OSTMED.DR® database provides access to both citation and full-text osteopathic literature, including the Journal of the American Osteopathic Association. Currently, it is a free database searchable using basic and advanced search features.
Building the Digital Library Infrastructure: A Primer.
ERIC Educational Resources Information Center
Tebbetts, Diane R.
1999-01-01
Provides a framework for examining the complex infrastructure needed to successfully implement a digital library. Highlights include database development, online public-access catalogs, interactive technical services, full-text documents, hardware and wiring, licensing, access, and security issues. (Author/LRW)
Barron, Andrew D.; Ramsey, David W.; Smith, James G.
2014-01-01
This digital database contains information used to produce the geologic map published as Sheet 1 in U.S. Geological Survey Miscellaneous Investigations Series Map I-2005. (Sheet 2 of Map I-2005 shows sources of geologic data used in the compilation and is available separately). Sheet 1 of Map I-2005 shows the distribution and relations of volcanic and related rock units in the Cascade Range of Washington at a scale of 1:500,000. This digital release is produced from stable materials originally compiled at 1:250,000 scale that were used to publish Sheet 1. The database therefore contains more detailed geologic information than is portrayed on Sheet 1. This is most noticeable in the database as expanded polygons of surficial units and the presence of additional strands of concealed faults. No stable compilation materials exist for Sheet 1 at 1:500,000 scale. The main component of this digital release is a spatial database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map sheet, main report text, and accompanying mapping reference sheet from Map I-2005. For more information on volcanoes in the Cascade Range in Washington, Oregon, or California, please refer to the U.S. Geological Survey Volcano Hazards Program website.
PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Lin, Lianshan
2013-01-01
To support the ASME Boiler and Pressure Vessel Codes and Standards (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Codes and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyber infrastructure that supports ASME Codes and Standards development and maintenance.
The Construction of Infrastructure for Library's Digital Document Telecommunications.
ERIC Educational Resources Information Center
Changxing, Ying; Zuzao, Lin
This paper discusses the construction of the infrastructure for libraries' digital document telecommunications. The first section describes the topologies of the library LAN (Local Area Network) cabling system, including the main characteristics of the LAN and three classical topologies typically used with LANs, i.e., the bus, star, and ring…
GrainGenes: Changing Times, Changing Databases, Digital Evolution.
USDA-ARS?s Scientific Manuscript database
The GrainGenes database is one of few agricultural databases that had an early start on the Internet and that has changed with the times. Initial goals were to collect a wide range of data relating to the developing maps and attributes of small grains crops, and to make them easily accessible. The ...
The MAP program: building the digital terrain model.
R.H. Twito; R.W. Mifflin; R.J. McGaughey
1987-01-01
PLANS, a software package for integrated timber-harvest planning, uses digital terrain models to provide the topographic data needed to fit harvest and transportation designs to specific terrain. MAP, an integral program in the PLANS package, is used to construct the digital terrain models required by PLANS. MAP establishes digital terrain models using digitizer-traced...
Preliminary Geologic Map of the Topanga 7.5' Quadrangle, Southern California: A Digital Database
Yerkes, R.F.; Campbell, R.H.
1995-01-01
INTRODUCTION: This Open-File report is a digital geologic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. This digital map database is compiled from previously published sources combined with some new mapping and modifications in nomenclature. The geologic map database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. For detailed descriptions of the units, their stratigraphic relations, and sources of geologic mapping, consult Yerkes and Campbell (1994). More specific information about the units may be available in the original sources.

The content and character of the database and methods of obtaining it are described herein. The geologic map database itself, consisting of three ARC coverages and one base layer, can be obtained over the Internet or by magnetic tape copy as described below. The processes of extracting the geologic map database from the tar file and importing the ARC export coverages (procedure described herein) will result in the creation of an ARC workspace (directory) called 'topnga.' The database was compiled using ARC/INFO version 7.0.3, a commercial Geographic Information System (Environmental Systems Research Institute, Redlands, California), with version 3.0 of the menu interface ALACARTE (Fitzgibbon and Wentworth, 1991; Fitzgibbon, 1991; Wentworth and Fitzgibbon, 1991). It is stored in uncompressed ARC export format (ARC/INFO version 7.x) in a compressed UNIX tar (tape archive) file. The tar file was compressed with gzip and may be uncompressed with gzip, which is available free of charge via the Internet from the gzip Home Page (http://w3.teaser.fr/~jlgailly/gzip). A tar utility is required to extract the database from the tar file. This utility is included in most UNIX systems and can be obtained free of charge via the Internet from Internet Literacy's Common Internet File Formats Webpage (http://www.matisse.net/files/formats.html). ARC/INFO export files (files with the .e00 extension) can be converted into ARC/INFO coverages in ARC/INFO (see below) and can be read by some other Geographic Information Systems, such as MapInfo via ArcLink and ESRI's ArcView (version 1.0 for Windows 3.1 to 3.11 is available for free from ESRI's web site: http://www.esri.com).

This release differs from the original digital database in three respects: 1. Different base layer - The original digital database included separates clipped out of the Los Angeles 1:100,000 sheet. This release includes a vectorized scan of a scale-stable negative of the Topanga 7.5 minute quadrangle. 2. Map projection - The files in the original release were in polyconic projection. The projection used in this release is state plane, which allows for the tiling of adjacent quadrangles. 3. File compression - The files in the original release were compressed with UNIX compression. The files in this release are compressed with gzip.
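The extraction steps described above (uncompress the gzip-compressed tar file, then untar it to create the 'topnga' workspace) can be reproduced with standard tools. The short sketch below shows one way to do it in Python; the local archive filename is an assumption for illustration, so substitute the file actually downloaded from the server.

```python
# Minimal sketch of unpacking the compressed tar archive described above.
# "topnga.tar.gz" is an assumed filename; use the archive you actually downloaded.
import tarfile

with tarfile.open("topnga.tar.gz", "r:gz") as tar:   # gzip-compressed tar archive
    tar.extractall(path=".")                          # creates the 'topnga' workspace

# The extracted .e00 ARC/INFO export files can then be imported with ARC/INFO
# or read by other GIS packages that support the E00 interchange format.
```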
ERIC Educational Resources Information Center
Brightenburg, Cindy
2016-01-01
The use of digital books is diverse, ranging from casual reading to in-depth primary source research. Digitization of early English printed books in particular, has provided greater access to a previously limited resource for academic faculty and researchers. Internet Archive, a free, internet website and Early English Books Online, a subscription…
Air Weather Service Master Station Catalog: USAFETAC Climatic Database Users Handbook No. 6
1993-03-01
FIELD NO. 01, STN NUM - DESCRIPTION OF FIELD AND COMMENTS: a 6-digit number with the first 5 digits assigned to a particular weather-reporting location IAW WMO rules, plus a sixth digit as follows: 0 = the first five digits are the actual block/station number (WMO number) assigned to...it is considered inactive for that hour. A digit (1-9) tells how many months it has been since a report was received from the station for that hour.
Quality control management and communication between radiologists and technologists.
Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M
2008-06-01
The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.
The future of medical diagnostics: large digitized databases.
Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron
2012-09-01
The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
NASA Astrophysics Data System (ADS)
Ummin, Okumura; Tian, Han; Zhu, Haiyu; Liu, Fuqiang
2018-03-01
Construction safety has always been the first priority in the construction process. A common safety problem is instability of the template support. In order to solve this problem, a digital image measurement technique has been developed to support a real-time monitoring system that is triggered if the deformation value exceeds the specified range. Thus, economic losses can be reduced to the lowest level.
Study of the precision guided communication of digital television
NASA Astrophysics Data System (ADS)
Liu, Lun
2012-04-01
With the progress and development of digital technology, new transmission media have emerged, such as the Internet, mobile phones, and digital television, and among them digital TV holds advantages over the other media. The appearance and development of digital TV will induce a profound change in the broadcasting and television industry chain. This paper begins by discussing the transformation of digital television in its profit model, mode of operation, and mode of transmission in order to construct a theory of precision-guided communication; it then analyzes the properties and marketing nature of precision-guided communication to construct a precision-guided communication marketing model; and it puts forward concrete strategies and steps for implementing precision-guided communication marketing. The author closes with four conclusions.
Global GIS database; digital atlas of South Pacific
Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.
2001-01-01
This CD-ROM contains a digital atlas of the countries of the South Pacific. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included.
Global GIS database; digital atlas of Africa
Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.
2001-01-01
This CD-ROM contains a digital atlas of the countries of Africa. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make this atlas easier to use, are also included.
Global GIS database; digital atlas of South Asia
Hearn, P.P.; Hare, T.M.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.
2001-01-01
This CD-ROM contains a digital atlas of the countries of South Asia. This atlas is part of a global database compiled from USGS and other data sources at a nominal scale 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included.
Mars global digital dune database and initial science results
Hayward, R.K.; Mullins, K.F.; Fenton, L.K.; Hare, T.M.; Titus, T.N.; Bourke, M.C.; Colaprete, A.; Christensen, P.R.
2007-01-01
A new Mars Global Digital Dune Database (MGD3) constructed using Thermal Emission Imaging System (THEMIS) infrared (IR) images provides a comprehensive and quantitative view of the geographic distribution of moderate- to large-size dune fields (area >1 km2) that will help researchers to understand global climatic and sedimentary processes that have shaped the surface of Mars. MGD3 extends from 65°N to 65°S latitude and includes ~550 dune fields, covering ~70,000 km2, with an estimated total volume of ~3,600 km3. This area, when combined with polar dune estimates, suggests moderate- to large-size dune field coverage on Mars may total ~800,000 km2, ~6 times less than the total areal estimate of ~5,000,000 km2 for terrestrial dunes. Where availability and quality of THEMIS visible (VIS) or Mars Orbiter Camera narrow-angle (MOC NA) images allow, we classify dunes and include dune slipface measurements, which are derived from gross dune morphology and represent the prevailing wind direction at the last time of significant dune modification. For dunes located within craters, the azimuth from crater centroid to dune field centroid (referred to as dune centroid azimuth) is calculated and can provide an accurate method for tracking dune migration within smooth-floored craters. These indicators of wind direction are compared to output from a general circulation model (GCM). Dune centroid azimuth values generally correlate to regional wind patterns. Slipface orientations are less well correlated, suggesting that local topographic effects may play a larger role in dune orientation than regional winds. Copyright 2007 by the American Geophysical Union.
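The dune centroid azimuth described above is the bearing, measured clockwise from north, from a crater centroid to the corresponding dune-field centroid. A minimal sketch of that calculation follows, using the standard initial-bearing formula on a sphere; the coordinates are invented for illustration, and the paper's exact conventions (for example, planetocentric versus planetographic latitude) are not reproduced.

```python
# Minimal sketch of a "dune centroid azimuth": bearing from crater centroid to
# dune-field centroid, clockwise from north. Input coordinates are in degrees.
import math

def centroid_azimuth(crater_lat, crater_lon, dune_lat, dune_lon):
    """Return the initial bearing in degrees (0-360, clockwise from north)."""
    lat1, lat2 = map(math.radians, (crater_lat, dune_lat))
    dlon = math.radians(dune_lon - crater_lon)
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Example: a dune-field centroid lying to the northeast of the crater centroid.
print(round(centroid_azimuth(-45.0, 30.0, -44.8, 30.5), 1))
```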
Application research for 4D technology in flood forecasting and evaluation
NASA Astrophysics Data System (ADS)
Li, Ziwei; Liu, Yutong; Cao, Hongjie
1998-08-01
In order to monitor regions of China where disastrous floods occur frequently, to satisfy the strong need of provincial governments for high-accuracy disaster monitoring and evaluation data, and to improve the efficiency of disaster relief, a method for flood forecasting and evaluation using satellite and aerial remotely sensed images and ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing, and GIS technology are used in this system; disastrous floods can be forecast and losses evaluated on the basis of a '4D' (DEM -- Digital Elevation Model, DOQ -- Digital Orthophoto Quads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information) disaster background database. The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on '4D' background data, and experimental results for the DongTing Lake test site are introduced in detail in this paper.
Digital Mapping Techniques '07 - Workshop Proceedings
Soller, David R.
2008-01-01
The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.
Constructing Historical Profiles with Digital Natives
ERIC Educational Resources Information Center
Waring, Scott M.; Bentley, Courtney C.
2012-01-01
The purpose of this study was to examine a group of fifth graders' experiences, beliefs, and opinions during the construction of digital historical agent profiles. This research study examined a project in which students were engaged in the learning of historical content and were asked to convey information about the life of someone from the past…
Technology Used for Realization of the Reform in Informal Areas.
NASA Astrophysics Data System (ADS)
Qirko, K.
2008-12-01
ORGANIZATION OF STRUCTURE AND ADMINISTRATION OF ALUIZNI. Law no. 9482, date 03.03.2006, "On legalization, urban planning and integration of unauthorized buildings", entered into force on May 15, 2006. The Council of Ministers, with its decision no. 289, date 17.05.2006, established the Agency for the Legalization, Urbanization, and Integration of the Informal Zones/Buildings (ALUIZNI), with its twelve local bodies. ALUIZNI began its activity, in reliance on Law no. 9482, date 03.03.2006, in July 2006. The administration of this agency was completed during this period; it is composed of a General Directorate and twelve regional directorates. As of today, this institution has 300 employees. The administrative structure of ALUIZNI is organized to achieve the objectives of the reform and to solve the problems arising during its completion. The following sectors have been established to achieve the objectives: the sector of compensation of owners; the sector of cartography; the sector of geographic information system (GIS) data elaboration and information technology; the sector of urban planning; the sector of registration of legalized properties; and the human resources sector. Following this vision, digital aerial photography of the Republic of Albania is in the process of realization, from which we will receive, for the first time, an orthophoto and a digital map covering the entire territory of our country. This cartographic product will serve all government institutions and private ones. All other systems, such as the system of territory management, the system of property registration, the system of population registration, the system of addresses, urban planning studies and systems, and the definition of boundaries of administrative and touristic zones, will be established based on this cartographic system. The cartographic product will have the parameters mentioned below, divided into lots (2.3 MEuro): 1. Lot I includes the urban zone, 1,200 km2; it will have a resolution of 8 cm per pixel and will be produced as an orthophoto and a digital vectorized map. 2. Lot II includes the rural zone, 12,000 km2; an orthophoto with a resolution of 8 cm per pixel will be produced. 3. Lot III includes the mountainous zone, 15,000 km2; an orthophoto with a resolution of 30 cm per pixel will be produced. All the technical documentation of the process will be produced digitally, based on the digital map, and it will constitute the main database. We have established the sector of GIS data elaboration and information technology with the purpose of assuring transparency and correctness of the process, and of providing permanently useful information for various purposes (1.1 MEuro). GIS is a modern technology which elaborates and makes connections among different kinds of information. The main objective of this sector is the establishment of a self-declaration database, with 30 characteristics for each declaration, and a database for the process, with 40 characteristics for each property, which includes cartographic, geographic, and construction data.
Technology and the Modern Library.
ERIC Educational Resources Information Center
Boss, Richard W.
1984-01-01
Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…
A framework for analysis of large database of old art paintings
NASA Astrophysics Data System (ADS)
Da Rugna, Jérome; Chareyron, Gaël; Pillay, Ruven; Joly, Morwena
2011-03-01
For many years, many museums and countries have organized high-definition digitization of their own collections. In consequence, they generate massive data for each object. In this paper, we focus only on art painting collections. Nevertheless, we faced a very large database with heterogeneous data. Indeed, the image collection includes very old and recent scans of negative photos, digital photos, multi- and hyperspectral acquisitions, X-ray acquisitions, and also front, back, and lateral photos. Moreover, we have noted that art paintings suffer from much degradation: cracks, softening, artifacts, human damage, and corruption over time. Considering that, it appears necessary to develop specific approaches and methods dedicated to digital art painting analysis. Consequently, this paper presents a complete framework to evaluate, compare, and benchmark image processing algorithms devoted to this task.
Aricak, Burak
2015-07-01
Forest roads are essential for transport in managed forests, yet road construction causes environmental disturbance, both in the surface area the road covers and in erosion and downslope deposition of road fill material. The factors affecting the deposition distance of eroded road fill are the slope gradient and the density of plant cover. Thus, it is important to take these factors into consideration during road planning to minimize their disturbance. The aim of this study was to use remote sensing and field surveying to predict the locations that would be affected by downslope deposition of eroding road fill and to compile the data into a geographic information system (GIS) database. The construction of 99,500 m of forest roads is proposed for the Kastamonu Regional Forest Directorate in Turkey. Using GeoEye satellite images and a digital elevation model (DEM) for the region, the location and extent of downslope deposition of road fill were determined for the roads as planned. It was found that if the proposed roads were constructed by excavators, the fill material would cover 910,621 m2 and the affected surface area would be 1,302,740 m2. Application of the method used here can minimize the adverse effects of forest roads.
Digital Mapping Techniques '11–12 workshop proceedings
Soller, David R.
2014-01-01
At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
The wavelet/scalar quantization compression standard for digital fingerprint images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, J.N.; Brislawn, C.M.
1994-04-01
A new digital image compression standard has been adopted by the US Federal Bureau of Investigation for use on digitized gray-scale fingerprint images. The algorithm is based on adaptive uniform scalar quantization of a discrete wavelet transform image decomposition and is referred to as the wavelet/scalar quantization standard. The standard produces archival quality images at compression ratios of around 20:1 and will allow the FBI to replace their current database of paper fingerprint cards with digital imagery.
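The two core steps named in the standard's title, a discrete wavelet transform followed by uniform scalar quantization, can be illustrated compactly. The sketch below is a generic illustration of that technique using the PyWavelets library, not the FBI WSQ codec itself: WSQ prescribes specific filters, a particular subband decomposition, bit allocation, and Huffman coding that are omitted here, and the wavelet choice, quantization step, and test image are assumptions.

```python
# Minimal sketch: wavelet decomposition, uniform scalar quantization of each
# subband, then dequantization and reconstruction to inspect the error.
import numpy as np
import pywt  # PyWavelets

def quantize_dwt(image, wavelet="bior4.4", level=3, step=8.0):
    """Decompose, uniformly quantize each subband, then reconstruct."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    quantized = [np.round(coeffs[0] / step)]                 # approximation subband
    for (ch, cv, cd) in coeffs[1:]:                          # detail subbands
        quantized.append(tuple(np.round(c / step) for c in (ch, cv, cd)))
    dequant = [quantized[0] * step] + [tuple(c * step for c in t) for t in quantized[1:]]
    return pywt.waverec2(dequant, wavelet)

img = np.random.randint(0, 256, (64, 64))                    # stand-in for a fingerprint tile
rec = quantize_dwt(img)
print(float(np.abs(rec[:64, :64] - img).mean()))              # mean reconstruction error
```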
ERIC Educational Resources Information Center
Shabajee, Paul; Bollen, Johan; Luce, Rick; Weig, Eric
2002-01-01
Includes four articles that discuss multimedia educational database systems and the use of metadata, including repurposing; the evaluation of digital library use that analyzes the retrieval habits of users; the Kentucky Virtual Library (KYVL) and digital collection project; and the collection of the Division of Parasitic Diseases, Centers for…
Uses of Digital Tools and Literacies in the English Language Arts Classroom
ERIC Educational Resources Information Center
Beach, Richard
2012-01-01
This article reviews research on English language arts teachers' use of digital tools in the classroom to remediate print literacies. Specifically, this review focuses on the affordances of digital tools to foster uses of digital literacies of informational/accessibility, collaboration knowledge construction, multimodal communication, gaming…
Art, Storytelling, and the Digital Economy
ERIC Educational Resources Information Center
Ohier, Jason
2007-01-01
A digital story can be anything that uses digital technology to construct narrative. It comes in many forms, including short movies and documentaries, using still images, voice-over narration, and music. It can be academic, abstract, or highly personal. Digital storytelling provides a powerful media literacy opportunity, as students are required…
Fei, Lin; Zhao, Jing; Leng, Jiahao; Zhang, Shujian
2017-10-12
The ALIPORC full-text database is a specialized full-text database of acupuncture literature from the Republic of China era. Started in 2015, the database is steadily being completed, focusing on acupuncture-related books, articles, and advertising documents written or published during the Republic of China period. The construction of this database aims to achieve shared access to Republic-of-China-era acupuncture medical literature through diverse retrieval approaches and accurate content presentation; it contributes to scholarly exchange, reduces the paper damage caused by page turning, and simplifies retrieval of rare literature. The authors explain the database in terms of its sources, characteristics, and current state of construction, and discuss improving the efficiency and integrity of the database and deepening the study of Republic-of-China-era acupuncture literature.
Sharing and interoperation of Digital Dongying geospatial data
NASA Astrophysics Data System (ADS)
Zhao, Jun; Liu, Gaohuan; Han, Lit-tao; Zhang, Rui-ju; Wang, Zhi-an
2006-10-01
The Digital Dongying project was put forward by Dongying city, Shandong province, and authenticated by the Ministry of Information Industry, the Ministry of Science and Technology, and the Ministry of Construction of P.R. China in 2002. After five years of construction, the informationization level of Dongying has reached an advanced degree. In order to advance the building of Digital Dongying and to realize geospatial data sharing, geographic information sharing standards have been drawn up and put into practice. Secondly, the Digital Dongying Geographic Information Sharing Platform has been constructed and developed; it is a highly integrated WebGIS platform combining 3S (GIS, GPS, RS), object-oriented RDBMS, Internet, DCOM, and other technologies. It provides an indispensable platform for the sharing and interoperation of Digital Dongying geospatial data. According to the standards, and based on the platform, sharing and interoperation of Digital Dongying geospatial data have come into practice and good results have been obtained. However, a strong leadership group is necessary for data sharing and interoperation.
An automated system for terrain database construction
NASA Technical Reports Server (NTRS)
Johnson, L. F.; Fretz, R. K.; Logan, T. L.; Bryant, N. A.
1987-01-01
An automated Terrain Database Preparation System (TDPS) for the construction and editing of terrain databases used in computerized wargaming simulation exercises has been developed. The TDPS system operates under the TAE executive, and it integrates VICAR/IBIS image processing and Geographic Information System software with CAD/CAM data capture and editing capabilities. The terrain database includes such features as roads, rivers, vegetation, and terrain roughness.
Modis, SeaWIFS, and Pathfinder funded activities
NASA Technical Reports Server (NTRS)
Evans, Robert H.
1995-01-01
MODIS (Moderate Resolution Imaging Spectrometer), SeaWIFS (Sea-viewing Wide Field Sensor), Pathfinder, and DSP (Digital Signal Processor) objectives are summarized. An overview of current progress is given for the automatic processing database, client/server status, matchup database, and DSP support.
NASA Technical Reports Server (NTRS)
Djorgovski, S. G.
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. The package was tested extensively on a number of real scientific applications and has produced real, published results.
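The object-classification step described above, inducing a decision tree from measured image features and applying it to new detections, can be sketched with any modern decision-tree learner. The example below uses scikit-learn in place of the GID3* induction software; the feature set and training values are invented for illustration and do not come from the survey.

```python
# Minimal sketch of decision-tree classification of catalog objects.
# Features and labels are illustrative stand-ins for measured image attributes.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [magnitude, ellipticity, FWHM/PSF ratio] for one detected object.
X_train = np.array([
    [18.2, 0.05, 1.0],   # point-like, bright   -> star
    [21.7, 0.40, 2.3],   # extended, elongated  -> galaxy
    [19.1, 0.08, 1.1],   # point-like           -> star
    [22.3, 0.55, 3.0],   # extended             -> galaxy
])
y_train = np.array(["star", "galaxy", "star", "galaxy"])

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(clf.predict([[20.5, 0.45, 2.1]]))   # classify a new detection
```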
A design for the geoinformatics system
NASA Astrophysics Data System (ADS)
Allison, M. L.
2002-12-01
Informatics integrates and applies information technologies with scientific and technical disciplines. A geoinformatics system targets the spatially based sciences. The system is not a master database, but will collect pertinent information from disparate databases distributed around the world. Seamless interoperability of databases promises quantum leaps in productivity not only for scientific researchers but also for many areas of society including business and government. The system will incorporate: acquisition of analog and digital legacy data; efficient information and data retrieval mechanisms (via data mining and web services); accessibility to and application of visualization, analysis, and modeling capabilities; online workspace, software, and tutorials; GIS; integration with online scientific journal aggregates and digital libraries; access to real time data collection and dissemination; user-defined automatic notification and quality control filtering for selection of new resources; and application to field techniques such as mapping. In practical terms, such a system will provide the ability to gather data over the Web from a variety of distributed sources, regardless of computer operating systems, database formats, and servers. Search engines will gather data about any geographic location, above, on, or below ground, covering any geologic time, and at any scale or detail. A distributed network of digital geolibraries can archive permanent copies of databases at risk of being discontinued and those that continue to be maintained by the data authors. The geoinformatics system will generate results from widely distributed sources to function as a dynamic data network. Instead of posting a variety of pre-made tables, charts, or maps based on static databases, the interactive dynamic system creates these products on the fly, each time an inquiry is made, using the latest information in the appropriate databases. Thus, in the dynamic system, a map generated today may differ from one created yesterday and one to be created tomorrow, because the databases used to make it are constantly (and sometimes automatically) being updated.
3D Digital Legos for Teaching Security Protocols
ERIC Educational Resources Information Center
Yu, Li; Harrison, L.; Lu, Aidong; Li, Zhiwei; Wang, Weichao
2011-01-01
We have designed and developed a 3D digital Lego system as an education tool for teaching security protocols effectively in Information Assurance courses (Lego is a trademark of the LEGO Group. Here, we use it only to represent the pieces of a construction set.). Our approach applies the pedagogical methods learned from toy construction sets by…
A georeferenced Landsat digital database for forest insect-damage assessment
NASA Technical Reports Server (NTRS)
Williams, D. L.; Nelson, R. F.; Dottavio, C. L.
1985-01-01
In 1869, the gypsy moth caterpillar was introduced in the U.S. in connection with the experiments of a French scientist. Throughout the insect's period of establishment, gypsy moth populations have periodically increased to epidemic proportions. For programs concerned with preventing the insect's spread, it would be highly desirable to be able to employ a survey technique which could provide timely, accurate, and standardized assessments at a reasonable cost. A project was, therefore, initiated with the aim to demonstrate the usefulness of satellite remotely sensed data for monitoring the insect defoliation of hardwood forests in Pennsylvania. A major effort within this project involved the development of a map-registered Landsat digital database. A complete description of the database developed is provided along with information regarding the employed data management system.
Geophysical Log Database for the Mississippi Embayment Regional Aquifer Study (MERAS)
Hart, Rheannon M.; Clark, Brian R.
2008-01-01
The Mississippi Embayment Regional Aquifer Study (MERAS) is an investigation of ground-water availability and sustainability within the Mississippi embayment as part of the U.S. Geological Survey Ground-Water Resources Program. The MERAS area consists of approximately 70,000 square miles and encompasses parts of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. More than 2,600 geophysical logs of test holes and wells within the MERAS area were compiled into a database and were used to develop a digital hydrogeologic framework from land surface to the top of the Midway Group of upper Paleocene age. The purpose of this report is to document, present, and summarize the geophysical log database, as well as to preserve the geophysical logs in a digital image format for online access.
Automatic Mexican sign language and digits recognition using normalized central moments
NASA Astrophysics Data System (ADS)
Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina
2016-09-01
This work presents a framework for automatic Mexican sign language and digit recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera with four LED reflectors and a green background in order to reduce computational cost and avoid the use of special gloves. 42 normalized central moments are computed per frame and used in a Multi-Layer Perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican sign language and digits, respectively.
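To make the feature stage concrete, the following Python sketch computes normalized central moments of a binarized frame; the random image and the moment orders used are illustrative assumptions, not the authors' exact 42-moment configuration.

import numpy as np

def normalized_central_moments(img, max_order=3):
    # Return the scale-invariant moments eta_pq for all orders 2 <= p+q <= max_order
    # of a 2-D grayscale (or binary) array.
    img = img.astype(float)
    m00 = img.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    xbar = (xs * img).sum() / m00
    ybar = (ys * img).sum() / m00
    feats = []
    for p in range(max_order + 1):
        for q in range(max_order + 1):
            if 2 <= p + q <= max_order:
                mu_pq = (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()
                feats.append(mu_pq / m00 ** (1 + (p + q) / 2.0))
    return np.array(feats)

# A random binary "frame" stands in for a segmented hand image.
frame = (np.random.rand(64, 64) > 0.8).astype(np.uint8)
print(normalized_central_moments(frame))

The resulting feature vector would then be passed to a multi-layer perceptron trained separately for the sign and digit databases, as the abstract describes.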
Geologic Map of the Wenatchee 1:100,000 Quadrangle, Central Washington: A Digital Database
Tabor, R.W.; Waitt, R.B.; Frizzell, V.A.; Swanson, D.A.; Byerly, G.R.; Bentley, R.D.
2005-01-01
This digital map database has been prepared by R.W. Tabor from the published Geologic map of the Wenatchee 1:100,000 Quadrangle, Central Washington. Together with the accompanying text files as PDF, it provides information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The authors mapped most of the bedrock geology at 1:100,000 scale, but compiled Quaternary units at 1:24,000 scale. The Quaternary contacts and structural data have been much simplified for the 1:100,000-scale map and database. The spatial resolution (scale) of the database is 1:100,000 or smaller. This database depicts the distribution of geologic materials and structures at a regional (1:100,000) scale. The report is intended to provide geologic information for the regional study of materials properties, earthquake shaking, landslide potential, mineral hazards, seismic velocity, and earthquake faults. In addition, the report contains information and interpretations about the regional geologic history and framework. However, the regional scale of this report does not provide sufficient detail for site development purposes.
Pre-Service Teachers Designing and Constructing "Good Digital Games"
ERIC Educational Resources Information Center
Artym, Corbett; Carbonaro, Mike; Boechler, Patricia
2016-01-01
There is a growing interest in the application of digital games to enhance learning across many educational levels. This paper investigates pre-service teachers' ability to operationalize the learning principles that are considered part of a good digital game (Gee, 2007) by designing digital games in Scratch. Forty pre-service teachers, enrolled…
A Concept Analysis of Digital Citizenship for Democratic Citizenship Education in the Internet Age
ERIC Educational Resources Information Center
Choi, Moonsun
2016-01-01
Despite the importance of promoting socially responsible citizenship in the Internet age, there is a paucity of research on how digital citizenship or digital citizens might be defined and/or investigated. This study found 4 major categories that construct digital citizenship: "Ethics," "Media and Information Literacy,"…
A Digital View of History: Drawing and Discussing Models of Historical Concepts
ERIC Educational Resources Information Center
Manfra, Meghan McGlinn; Coven, Robert M.
2011-01-01
Digital history refers to "the study of the past using a variety of electronically reproduced primary source texts, images, and artifacts as well as the constructed narratives, accounts, or presentations that result from digital historical inquiry." Access to digitized primary sources can promote active instruction in historical thinking. A…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beverly Seyler; John Grube
2004-12-10
Oil and gas have been commercially produced in Illinois for over 100 years. Existing commercial production is from more than fifty-two named pay horizons in Paleozoic rocks ranging in age from Middle Ordovician to Pennsylvanian. Over 3.2 billion barrels of oil have been produced. Recent calculations indicate that remaining mobile resources in the Illinois Basin may be on the order of several billion barrels. Thus, large quantities of oil, potentially recoverable using current technology, remain in Illinois oil fields despite a century of development. Many opportunities for increased production may have been missed due to complex development histories, multiple stacked pays, and commingled production, which make thorough exploitation of pays and the application of secondary or improved/enhanced recovery strategies difficult. Access to data, and the techniques required to evaluate and manage large amounts of diverse data, are major barriers to increased production of critical reserves in the Illinois Basin. These constraints are being alleviated by the development of a database access system using a Geographic Information System (GIS) approach for evaluation and identification of underdeveloped pays. The Illinois State Geological Survey has developed a methodology that is being used by industry to identify underdeveloped areas (UDAs) in and around petroleum reservoirs in Illinois using a GIS approach. This project utilizes a statewide oil and gas Oracle® database to develop a series of Oil and Gas Base Maps with well location symbols that are color-coded by producing horizon. Producing horizons are displayed as layers and can be selected as separate or combined layers that can be turned on and off. Map views can be customized to serve individual needs, and page-size maps can be printed. A core analysis database with over 168,000 entries has been compiled and assimilated into the ISGS Enterprise Oracle database. Maps of wells with core data have been generated. Data from over 1,700 Illinois waterflood units and waterflood areas have been entered into an Access® database. The waterflood area data have also been assimilated into the ISGS Oracle database for mapping and dissemination on the ArcIMS website. Formation depths for the Beech Creek Limestone, Ste. Genevieve Limestone, and New Albany Shale in all of the oil-producing region of Illinois have been calculated and entered into a digital database. Digital contoured structure maps have been constructed, edited, and added to the ILoil website as map layers. This technology/methodology addresses the long-standing constraints related to information access and data management in Illinois by significantly simplifying the laborious process that industry presently must use to identify underdeveloped pay zones in Illinois.
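The layer-per-horizon behavior described above comes down to queries of roughly the following shape; the table and column names are illustrative assumptions (not the ISGS schema), and sqlite3 stands in here for the Oracle® database.

import sqlite3

# In-memory stand-in for the statewide wells table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wells (api_number TEXT, latitude REAL, "
             "longitude REAL, producing_horizon TEXT)")
conn.executemany(
    "INSERT INTO wells VALUES (?, ?, ?, ?)",
    [("120012345", 38.52, -88.91, "Ste. Genevieve"),
     ("120012346", 38.53, -88.90, "Beech Creek"),
     ("120012347", 38.54, -88.89, "Ste. Genevieve")])

# One map 'layer' per producing horizon: fetch only that horizon's wells so
# they can be drawn with a horizon-specific symbol and toggled on or off.
horizon = "Ste. Genevieve"
for api, lat, lon in conn.execute(
        "SELECT api_number, latitude, longitude FROM wells "
        "WHERE producing_horizon = ?", (horizon,)):
    print(api, lat, lon)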
Construction of In-house Databases in a Corporation
NASA Astrophysics Data System (ADS)
Senoo, Tetsuo
As computer technology, communication technology, and related fields have progressed, many corporations have placed the construction and use of their own databases at the center of their information activities and aim to develop those activities in new directions. This paper considers how information management in a corporation is affected by changing management and technology environments, and clarifies and generalizes what in-house databases should be constructed and utilized from the viewpoints of the requirements to be met, the types and forms of information to be handled, indexing, use type and frequency, evaluation method, and so on. The author outlines Matsushita's information system, MATIS (Matsushita Technical Information System), as a working example, and describes its present status and some points to keep in mind in constructing and utilizing the REP, BOOK, and SYMP databases.
MyLibrary: A Web Personalized Digital Library.
ERIC Educational Resources Information Center
Rocha, Catarina; Xexeo, Geraldo; da Rocha, Ana Regina C.
With the increasing availability of information on Internet information providers, like search engines, digital libraries and online databases, it becomes more important to have personalized systems that help users to find relevant information. One type of personalization that is growing in use is recommender systems. This paper presents…
Preserving the 'Athens of Indiana' through Digitization.
ERIC Educational Resources Information Center
Helling, Bill
2003-01-01
Describes a digitization project at the public library in Crawfordsville, Indiana that was designed to preserve their local history collection. Highlights include damage to the collection from fire, termites, use, and age; selecting a scanner and software; creating databases; and making information accessible on the Web. (LRW)
Generation of large scale urban environments to support advanced sensor and seeker simulation
NASA Astrophysics Data System (ADS)
Giuliani, Joseph; Hershey, Daniel; McKeown, David, Jr.; Willis, Carla; Van, Tan
2009-05-01
One of the key aspects for the design of a next-generation weapon system is the need to operate in cluttered and complex urban environments. Simulation systems rely on accurate representation of these environments and require automated software tools to construct the underlying 3D geometry and associated spectral and material properties that are then formatted for various objective seeker simulation systems. Under an Air Force Small Business Innovative Research (SBIR) contract, we have developed an automated process to generate 3D urban environments with user-defined properties. These environments can be composed from a wide variety of source materials, including vector source data, pre-existing 3D models, and digital elevation models, and rapidly organized into a geo-specific visual simulation database. This intermediate representation can be easily inspected in the visible spectrum for content and organization and interactively queried for accuracy. Once the database contains the required contents, it can then be exported into specific synthetic scene generation runtime formats, preserving the relationship between geometry and material properties. To date, an exporter for the Irma simulation system, developed and maintained by AFRL/Eglin, has been created, and a second exporter to the Real Time Composite Hardbody and Missile Plume (CHAMP) simulation system, for real-time use, is currently being developed. This process supports significantly more complex target environments than previous approaches to database generation. In this paper we describe the content-creation capabilities for advanced seeker-algorithm simulation and sensor stimulation, including the overall database compilation process and sample databases produced and exported for the Irma runtime system. We also discuss the addition of object dynamics and viewer dynamics within the visual simulation into the Irma runtime environment.
Toward unification of taxonomy databases in a distributed computer environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kitakami, Hajime; Tateno, Yoshio; Gojobori, Takashi
1994-12-31
All the taxonomy databases constructed with the DNA databases of the international DNA data banks are powerful electronic dictionaries which aid in biological research by computer. The taxonomy databases are, however, not consistently unified in a relational format. If we can achieve consistent unification of the taxonomy databases, it will be useful in comparing many research results, and investigating future research directions from existent research results. In particular, it will be useful in comparing relationships between phylogenetic trees inferred from molecular data and those constructed from morphological data. The goal of the present study is to unify the existent taxonomy databases and eliminate inconsistencies (errors) that are present in them. Inconsistencies occur particularly in the restructuring of the existent taxonomy databases, since classification rules for constructing the taxonomy have rapidly changed with biological advancements. A repair system is needed to remove inconsistencies in each data bank and mismatches among data banks. This paper describes a new methodology for removing both inconsistencies and mismatches from the databases in a distributed computer environment. The methodology is implemented in a relational database management system, SYBASE.
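A minimal sketch of the kind of mismatch check the paper motivates, assuming each data bank's taxonomy has been reduced to child-to-parent name pairs; the example data and layout are invented for illustration.

# Each dictionary maps a taxon name to its parent taxon, standing in for
# child/parent rows pulled from two data banks' relational taxonomy tables.
bank_a = {"Homo sapiens": "Homo", "Homo": "Hominidae", "Hominidae": "Primates"}
bank_b = {"Homo sapiens": "Homo", "Homo": "Hominidae", "Pan": "Hominidae"}

missing_in_b = sorted(set(bank_a) - set(bank_b))      # taxa only bank A knows
missing_in_a = sorted(set(bank_b) - set(bank_a))      # taxa only bank B knows
parent_conflicts = sorted(t for t in set(bank_a) & set(bank_b)
                          if bank_a[t] != bank_b[t])  # same taxon, different parent

print("missing in B:", missing_in_b)
print("missing in A:", missing_in_a)
print("parent conflicts:", parent_conflicts)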
Manchester visual query language
NASA Astrophysics Data System (ADS)
Oakley, John P.; Davis, Darryl N.; Shann, Richard T.
1993-04-01
We report a database language for visual retrieval which allows queries on image feature information which has been computed and stored along with images. The language is novel in that it provides facilities for dealing with feature data which has actually been obtained from image analysis. Each line in the Manchester Visual Query Language (MVQL) takes a set of objects as input and produces another, usually smaller, set as output. The MVQL constructs are mainly based on proven operators from the field of digital image analysis. An example is the Hough-group operator which takes as input a specification for the objects to be grouped, a specification for the relevant Hough space, and a definition of the voting rule. The output is a ranked list of high-scoring bins. The query could be directed towards one particular image or an entire image database; in the latter case, the bins in the output list would in general be associated with different images. We have implemented MVQL in two layers. The command interpreter is a Lisp program which maps each MVQL line to a sequence of commands which are used to control a specialized database engine. The latter is a hybrid graph/relational system which provides low-level support for inheritance and schema evolution. In the paper we outline the language and provide examples of useful queries. We also describe our solution to the engineering problems associated with the implementation of MVQL.
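The set-in/set-out style of an MVQL query can be sketched as follows in Python; this does not reproduce MVQL syntax, and the object attributes used ('length', 'score') are illustrative assumptions.

# Toy "objects" extracted from image analysis; each query stage consumes a
# set of objects and returns another, usually smaller, set.
edges = [{"id": 0, "length": 12, "score": 0.9},
         {"id": 1, "length": 40, "score": 0.7},
         {"id": 2, "length": 35, "score": 0.8},
         {"id": 3, "length": 5,  "score": 0.3}]

def select(objs, predicate):
    # One query line: keep only the objects satisfying the predicate.
    return [o for o in objs if predicate(o)]

def rank(objs, key, top=2):
    # Another query line: order by a score and keep the best few,
    # analogous to returning a ranked list of high-scoring bins.
    return sorted(objs, key=key, reverse=True)[:top]

long_edges = select(edges, lambda o: o["length"] > 10)
best = rank(long_edges, key=lambda o: o["score"])
print([o["id"] for o in best])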
Facilitating Collaboration, Knowledge Construction and Communication with Web-Enabled Databases.
ERIC Educational Resources Information Center
McNeil, Sara G.; Robin, Bernard R.
This paper presents an overview of World Wide Web-enabled databases that dynamically generate Web materials and focuses on the use of this technology to support collaboration, knowledge construction, and communication. Database applications have been used in classrooms to support learning activities for over a decade, but, although business and…
Body Awareness: Construct and Self-Report Measures
Mehling, Wolf E.; Gopisetty, Viranjini; Daubenmier, Jennifer; Price, Cynthia J.; Hecht, Frederick M.; Stewart, Anita
2009-01-01
Objectives Heightened body awareness can be adaptive and maladaptive. Improving body awareness has been suggested as an approach for treating patients with conditions such as chronic pain, obesity and post-traumatic stress disorder. We assessed the psychometric quality of selected self-report measures and examined their items for underlying definitions of the construct. Data sources PubMed, PsychINFO, HaPI, Embase, Digital Dissertations Database. Review methods Abstracts were screened; potentially relevant instruments were obtained and systematically reviewed. Instruments were excluded if they exclusively measured anxiety, covered emotions without related physical sensations, used observer ratings only, or were unobtainable. We restricted our study to the proprioceptive and interoceptive channels of body awareness. The psychometric properties of each scale were rated using a structured evaluation according to the method of McDowell. Following a working definition of the multi-dimensional construct, an inter-disciplinary team systematically examined the items of existing body awareness instruments, identified the dimensions queried and used an iterative qualitative process to refine the dimensions of the construct. Results From 1,825 abstracts, 39 instruments were screened. 12 were included for psychometric evaluation. Only two were rated as high standard for reliability, four for validity. Four domains of body awareness with 11 sub-domains emerged. Neither a single nor a compilation of several instruments covered all dimensions. Key domains that might potentially differentiate adaptive and maladaptive aspects of body awareness were missing in the reviewed instruments. Conclusion Existing self-report instruments do not address important domains of the construct of body awareness, are unable to discern between adaptive and maladaptive aspects of body awareness, or exhibit other psychometric limitations. Restricting the construct to its proprio- and interoceptive channels, we explore the current understanding of the multi-dimensional construct and suggest next steps for further research. PMID:19440300
Huang, Ji-yan; Zhao, Hou-ming; Zhou, Hai-wen
2014-04-01
To construct a database and a tissue bank of oral mucosa precancerous lesions and to evaluate their application value. Patients in the Yangtze delta suffering from oral mucosa precancerous lesions were enrolled in this study. The patients' clinical data and samples of oral precancerous mucosa, saliva, and blood were collected to create a tissue bank, based on which a database was constructed using Microsoft Access software, a Browser/Server structure, and the ASP language. The tissue bank and database of oral mucosa precancerous lesions were successfully built. The procedure to harvest, store, and transport the samples was standardized. The database has a good interactive interface and is convenient for data collection, query, and sharing over the Internet. We constructed this tissue bank and database of oral mucosa precancerous lesions for the first time; they not only help preserve the biological resource of oral mucosa precancerous lesions but also provide great convenience for clinical work, research, and teaching. Supported by the Research Fund of the Science and Technology Committee of Shanghai Municipality (08ZR1416700).
Link, P.K.; Mahoney, J.B.; Bruner, D.J.; Batatian, L.D.; Wilson, Eric; Williams, F.J.C.
1995-01-01
The paper version of the Geologic map of outcrop areas of sedimentary units in the eastern part of the Hailey 1x2 Quadrangle and part of the southern part of the Challis 1x2 Quadrangle, south-central Idaho, was compiled by Paul Link and others in 1995. The plate was compiled on a 1:100,000-scale topographic base map. TechniGraphic System, Inc. of Fort Collins, Colorado, digitized this map under contract for N. Shock. G. Green edited and prepared the digital version for publication as a GIS database. The digital geologic map database can be queried in many ways to produce a variety of geologic maps.
Annual Variability of the Acoustic Propagation in the Mediterranean Sea Identified from a Synoptic Monthly Gridded Database as Compared with GDEM
2016-12-01
The annual variability of acoustic propagation in the Mediterranean Sea is identified from profiles obtained from the synoptic monthly gridded World Ocean Database (SMD-WOD) and Generalized Digital Environmental Model (GDEM) temperature…
Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008
Soller, David R.
2009-01-01
The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
Users as essential contributors to spatial cyberinfrastructures
Poore, Barbara S.
2011-01-01
Current accounts of spatial cyberinfrastructure development tend to overemphasize technologies to the neglect of critical social and cultural issues on which adoption depends. Spatial cyberinfrastructures will have a higher chance of success if users of many types, including nonprofessionals, are made central to the development process. Recent studies in the history of infrastructures reveal key turning points and issues that should be considered in the development of spatial cyberinfrastructure projects. These studies highlight the importance of adopting qualitative research methods to learn how users work with data and digital tools, and how user communities form. The author's empirical research on data sharing networks in the Pacific Northwest salmon crisis at the turn of the 21st century demonstrates that ordinary citizens can contribute critical local knowledge to global databases and should be considered in the design and construction of spatial cyberinfrastructures. PMID:21444825
Users as essential contributors to spatial cyberinfrastructures.
Poore, Barbara S
2011-04-05
Current accounts of spatial cyberinfrastructure development tend to overemphasize technologies to the neglect of critical social and cultural issues on which adoption depends. Spatial cyberinfrastructures will have a higher chance of success if users of many types, including nonprofessionals, are made central to the development process. Recent studies in the history of infrastructures reveal key turning points and issues that should be considered in the development of spatial cyberinfrastructure projects. These studies highlight the importance of adopting qualitative research methods to learn how users work with data and digital tools, and how user communities form. The author's empirical research on data sharing networks in the Pacific Northwest salmon crisis at the turn of the 21st century demonstrates that ordinary citizens can contribute critical local knowledge to global databases and should be considered in the design and construction of spatial cyberinfrastructures.
2010-01-01
The 1/8° climatological monthly mean temperature and salinity fields are from the Generalized Digital Environmental Model (GDEM) climatology (NAVOCEANO, 2003, Database description for the Generalized Digital Environmental Model (GDEM-V), Version 3.0, OAML-DBD-72).
Do "Digital Certificates" Hold the Key to Colleges' On-Line Activities?
ERIC Educational Resources Information Center
Olsen, Florence
1999-01-01
Examines the increasing use of "digital certificates" to validate computer user identity in various applications on college and university campuses, including letting students register for courses, monitoring access to Internet2, and monitoring access to databases and electronic journals. The methodology has been developed by the…
Saving the Information Commons.
ERIC Educational Resources Information Center
Bollier, David
2003-01-01
Discusses the control of digital content and the stakes for libraries and our democratic culture. Highlights include copyright term extension, the Digital Millennium Copyright Act, use of contract law to limit the public domain, database legislation, trademarks versus the public domain, the void in our cultural vocabulary, and the concept of the…
Cracking the Egg: The South Carolina Digital Library's New Perspective
ERIC Educational Resources Information Center
Vinson, Christopher G.; Boyd, Kate Foster
2008-01-01
This article explores the historical foundations of the South Carolina Digital Library, a collaborative statewide program that ties together academic special collections and archives, public libraries, state government archives, and other cultural resource institutions in an effort to provide the state with a comprehensive database of online…
ERIC Educational Resources Information Center
Sargent, John
The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…
Aboriginal Knowledge Traditions in Digital Environments
ERIC Educational Resources Information Center
Christie, Michael
2005-01-01
According to Manovich (2001), the database and the narrative are natural enemies, each competing for the same territory of human culture. Aboriginal knowledge traditions depend upon narrative through storytelling and other shared performances. The database objectifies and commodifies distillations of such performances and absorbs them into data…
Cenozoic Antarctic DiatomWare/BugCam: An aid for research and teaching
Wise, S.W.; Olney, M.; Covington, J.M.; Egerton, V.M.; Jiang, S.; Ramdeen, D.K.; ,; Schrader, H.; Sims, P.A.; Wood, A.S.; Davis, A.; Davenport, D.R.; Doepler, N.; Falcon, W.; Lopez, C.; Pressley, T.; Swedberg, O.L.; Harwood, D.M.
2007-01-01
Cenozoic Antarctic DiatomWare/BugCam© is an interactive, icon-driven digital-image database/software package that displays over 500 illustrated Cenozoic Antarctic diatom taxa along with original descriptions (including over 100 generic and 20 family-group descriptions). This digital catalog is designed primarily for use by micropaleontologists working in the field (at sea or on the Antarctic continent) where hard-copy literature resources are limited. This new package will also be useful for classroom/lab teaching as well as for any paleontologists making or refining taxonomic identifications at the microscope. The database (Cenozoic Antarctic DiatomWare) is displayed via a custom software program (BugCam) written in Visual Basic for use on PCs running Windows 95 or later operating systems. BugCam is a flexible image display program that utilizes an intuitive thumbnail “tree” structure for navigation through the database. The data are stored on Microsoft Excel spreadsheets; hence, no separate relational database program is necessary to run the package.
Image query and indexing for digital x rays
NASA Astrophysics Data System (ADS)
Long, L. Rodney; Thoma, George R.
1998-12-01
The web-based medical information retrieval system (WebMIRS) allows Internet access to databases containing 17,000 digitized x-ray spine images and associated text data from the National Health and Nutrition Examination Surveys (NHANES). WebMIRS allows SQL query of the text, and viewing of the returned text records and images using a standard browser. We are now working (1) to determine the utility of data directly derived from the images in our databases, and (2) to investigate the feasibility of computer-assisted or automated indexing of the images to support retrieval of images of interest to biomedical researchers in the field of osteoarthritis. To build an initial database based on image data, we are manually segmenting a subset of the vertebrae, using techniques from vertebral morphometry. From this, we will derive vertebral features and add them to the database. This image-derived data will enhance the user's data access capability by enabling the creation of combined SQL/image-content queries.
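A combined SQL/image-content query of the kind envisioned might look roughly as follows; the table layout and the 'anterior_height' measurement are assumptions for illustration, not the NHANES or WebMIRS schema, and sqlite3 stands in for the actual database.

import sqlite3

# Toy stand-in: text attributes (age) live alongside an image-derived
# measurement (anterior vertebral height) in one table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spine (subject_id INTEGER, age INTEGER, "
             "vertebra TEXT, anterior_height REAL)")
conn.executemany("INSERT INTO spine VALUES (?, ?, ?, ?)",
                 [(1, 67, "L3", 24.1), (2, 72, "L3", 18.9), (3, 45, "L3", 26.0)])

# One combined query: a demographic condition plus a condition on the
# feature derived by vertebral morphometry.
rows = conn.execute(
    "SELECT subject_id, age, anterior_height FROM spine "
    "WHERE age >= 60 AND vertebra = 'L3' AND anterior_height < 20.0").fetchall()
print(rows)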
Apollo Lunar Sample Photograph Digitization Project Update
NASA Technical Reports Server (NTRS)
Todd, N. S.; Lofgren, G. E.
2012-01-01
This is an update of the progress of a 4-year data restoration project effort funded by the LASER program to digitize photographs of the Apollo lunar rock samples and create high resolution digital images and undertaken by the Astromaterials Acquisition and Curation Office at JSC [1]. The project is currently in its last year of funding. We also provide an update on the derived products that make use of the digitized photos including the Lunar Sample Catalog and Photo Database[2], Apollo Sample data files for GoogleMoon[3].
Physiographic rim of the Grand Canyon, Arizona: a digital database
Billingsley, George H.; Hampton, Haydee M.
1999-01-01
This Open-File report is a digital physiographic map database. This pamphlet serves to introduce and describe the digital data. There is no paper map included in the Open-File report. The report does include, however, PostScript and PDF format plot files, each containing an image of the map. For those interested in a paper plot of information contained in the database or in obtaining the PostScript plot files, please see the section entitled "For Those Who Don't Use Digital Geologic Map Databases" below. This physiographic map of the Grand Canyon is modified from previous versions by Billingsley and Hendricks (1989), and Billingsley and others (1997). The boundary is drawn approximately along the topographic rim of the Grand Canyon and its tributary canyons between Lees Ferry and Lake Mead (shown in red). Several isolated small mesas, buttes, and plateaus are within this area, which overall encompasses about 2,600 square miles. The Grand Canyon lies within the southwestern part of the Colorado Plateaus of northern Arizona between Lees Ferry, Colorado River Mile 0, and Lake Mead, Colorado River Mile 277. The Colorado River is the corridor for raft trips through the Grand Canyon. Limestone rocks of the Kaibab Formation form most of the north and south rims of the Grand Canyon, and a few volcanic rocks form the north rim of parts of the Uinkaret and Shivwits Plateaus. Limestones of the Redwall Limestone and lower Supai Group form the rim of the Hualapai Plateau area, and limestones of Devonian and Cambrian age form the boundary rim near the mouth of the Grand Canyon at Lake Mead. The natural physiographic boundary of the Grand Canyon is roughly the area from which a visitor would first view any part of the Grand Canyon and its tributaries.
A novel method for efficient archiving and retrieval of biomedical images using MPEG-7
NASA Astrophysics Data System (ADS)
Meyer, Joerg; Pahwa, Ash
2004-10-01
Digital archiving and efficient retrieval of radiological scans have become critical steps in contemporary medical diagnostics. Since more and more images and image sequences (single scans or video) from various modalities (CT/MRI/PET/digital X-ray) are now available in digital formats (e.g., DICOM-3), hospitals and radiology clinics need to implement efficient protocols capable of managing the enormous amounts of data generated daily in a typical clinical routine. We present a method that appears to be a viable way to eliminate the tedious step of manually annotating image and video material for database indexing. MPEG-7 is a new framework that standardizes the way images are characterized in terms of color, shape, and other abstract, content-related criteria. A set of standardized descriptors that are automatically generated from an image is used to compare an image to other images in a database, and to compute the distance between two images for a given application domain. Text-based database queries can be replaced with image-based queries using MPEG-7. Consequently, image queries can be conducted without any prior knowledge of the keys that were used as indices in the database. Since the decoding and matching steps are not part of the MPEG-7 standard, this method also enables searches that were not planned by the time the keys were generated.
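The matching step can be sketched as follows; a plain gray-level histogram stands in here for a real MPEG-7 descriptor, so the code illustrates descriptor-distance ranking rather than the standard's actual descriptor definitions.

import numpy as np

def descriptor(img, bins=8):
    # Automatically computed stand-in descriptor: a normalized gray-level histogram.
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def distance(d1, d2):
    # L1 distance between two descriptors.
    return float(np.abs(d1 - d2).sum())

rng = np.random.default_rng(0)
database = {name: rng.integers(0, 256, (64, 64), dtype=np.uint8)
            for name in ("scan_a", "scan_b", "scan_c")}
query_image = rng.integers(0, 256, (64, 64), dtype=np.uint8)

# Image-based query: rank the archive by descriptor distance to the query,
# with no manually assigned keywords involved.
q = descriptor(query_image)
ranked = sorted(database, key=lambda n: distance(q, descriptor(database[n])))
print("closest matches:", ranked)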
NASA Astrophysics Data System (ADS)
Song, Xiaoning; Feng, Zhen-Hua; Hu, Guosheng; Yang, Xibei; Yang, Jingyu; Qi, Yunsong
2015-09-01
This paper proposes a progressive sparse representation-based classification algorithm using local discrete cosine transform (DCT) evaluation to perform face recognition. Specifically, the sum of the contributions of all training samples of each subject is first taken as the contribution of this subject, then the redundant subject with the smallest contribution to the test sample is iteratively eliminated. Second, the progressive method aims at representing the test sample as a linear combination of all the remaining training samples, by which the representation capability of each training sample is exploited to determine the optimal "nearest neighbors" for the test sample. Third, the transformed DCT evaluation is constructed to measure the similarity between the test sample and each local training sample using cosine distance metrics in the DCT domain. The final goal of the proposed method is to determine an optimal weighted sum of nearest neighbors that are obtained under the local correlative degree evaluation, which is approximately equal to the test sample, and we can use this weighted linear combination to perform robust classification. Experimental results conducted on the ORL database of faces (created by the Olivetti Research Laboratory in Cambridge), the FERET face database (managed by the Defense Advanced Research Projects Agency and the National Institute of Standards and Technology), AR face database (created by Aleix Martinez and Robert Benavente in the Computer Vision Center at U.A.B), and USPS handwritten digit database (gathered at the Center of Excellence in Document Analysis and Recognition at SUNY Buffalo) demonstrate the effectiveness of the proposed method.
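A simplified, hedged sketch of the main ingredients follows: cosine similarity in the DCT domain selects nearest-neighbor training samples, and a weighted linear combination whose per-class residual decides the label. The data, sizes, and least-squares weighting are toy stand-ins, not the authors' exact progressive elimination procedure.

import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II matrix (n x n).
    k, i = np.mgrid[0:n, 0:n]
    c = np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0] *= 1 / np.sqrt(2)
    return c * np.sqrt(2 / n)

def dct_cosine(a, b, D):
    # Cosine similarity between two samples measured in the DCT domain.
    fa, fb = D @ a, D @ b
    return float(fa @ fb / (np.linalg.norm(fa) * np.linalg.norm(fb) + 1e-12))

rng = np.random.default_rng(1)
n_feat, n_per_class = 32, 5
train = {c: rng.normal(c, 1.0, (n_per_class, n_feat)) for c in (0, 1, 2)}
test = rng.normal(1, 1.0, n_feat)          # toy test sample drawn near class 1
D = dct_matrix(n_feat)

# Keep the most DCT-similar training samples as the "nearest neighbors".
pool = [(dct_cosine(test, x, D), c, x) for c, xs in train.items() for x in xs]
neighbors = sorted(pool, key=lambda t: t[0], reverse=True)[:6]

# Weighted linear combination of the neighbors approximating the test sample.
X = np.stack([x for _, _, x in neighbors], axis=1)
w, *_ = np.linalg.lstsq(X, test, rcond=None)

# Per-class residual: reconstruct the test sample from that class's neighbors
# only; the smallest residual gives the predicted class.
residuals = {}
for c in train:
    idx = [i for i, (_, ci, _) in enumerate(neighbors) if ci == c]
    residuals[c] = float(np.linalg.norm(test - X[:, idx] @ w[idx]))
print("predicted class:", min(residuals, key=residuals.get))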
Database for volcanic processes and geology of Augustine Volcano, Alaska
McIntire, Jacqueline; Ramsey, David W.; Thoms, Evan; Waitt, Richard B.; Beget, James E.
2012-01-01
This digital release contains information used to produce the geologic map published as Plate 1 in U.S. Geological Survey Professional Paper 1762 (Waitt and Begét, 2009). The main component of this digital release is a geologic map database prepared using geographic information systems (GIS) applications. This release also contains links to files to view or print the map plate, accompanying measured sections, and main report text from Professional Paper 1762. It should be noted that Augustine Volcano erupted in 2006, after the completion of the geologic mapping shown in Professional Paper 1762 and presented in this database. Information on the 2006 eruption can be found in U.S. Geological Survey Professional Paper 1769. For the most up to date information on the status of Alaska volcanoes, please refer to the U.S. Geological Survey Volcano Hazards Program website.
Newspaper archives + text mining = rich sources of historical geo-spatial data
NASA Astrophysics Data System (ADS)
Yzaguirre, A.; Smit, M.; Warren, R.
2016-04-01
Newspaper archives are rich sources of cultural, social, and historical information. These archives, even when digitized, are typically unstructured and organized by date rather than by subject or location, and require substantial manual effort to analyze. The effort of journalists to be accurate and precise means that there is often rich geo-spatial data embedded in the text, alongside text describing events that editors considered to be of sufficient importance to the region or the world to merit column inches. A regional newspaper can add over 100,000 articles to its database each year, and extracting information from this data for even a single country would pose a substantial Big Data challenge. In this paper, we describe a pilot study on the construction of a database of historical flood events (location(s), date, cause, magnitude) to be used in flood assessment projects, for example to calibrate models, estimate frequency, establish high water marks, or plan for future events in contexts ranging from urban planning to climate change adaptation. We then present a vision for extracting and using the rich geospatial data available in unstructured text archives, and suggest future avenues of research.
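A minimal sketch of the extraction idea, run on an invented snippet; a production pipeline would rely on proper named-entity recognition and gazetteers rather than the regular-expression heuristics used here.

import re

article = ("TRURO, N.S. - The Salmon River rose two metres overnight on "
           "April 4, 1962, flooding low-lying streets near the town centre.")

# Dateline heuristic: leading upper-case place name followed by a dash.
place = re.match(r"^([A-Z][A-Z .,']+?),? [-–]", article)
# Simple date pattern: month name, day, four-digit year.
date = re.search(r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)"
                 r"[a-z]*\.? \d{1,2}, \d{4}", article)
# Magnitude cue: a rise expressed in metres.
magnitude = re.search(r"rose ([\w ]+ metres?)", article)

event = {"place": place.group(1).strip(" ,") if place else None,
         "date": date.group(0) if date else None,
         "magnitude": magnitude.group(1) if magnitude else None,
         "flood_cue": "flood" in article.lower()}
print(event)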
The Cipa Database for Saving the Heritage of Syria
NASA Astrophysics Data System (ADS)
Silver, Minna; Rinaudo, Fulvio; Morezzi, Emanuele; Quenda, Francesca; Moretti, Maria Laura
2016-06-01
CIPA is contributing its technical knowledge to saving the heritage of Syria by constructing an open-access database based on the data that CIPA members collected during various projects in Syria in the years before the civil war broke out in 2011. In this way we wish to support the protection and preservation of the environment, sites, monuments, and artefacts, and the memory of a cultural region that has been crucial to the human past and the emergence of civilizations. Apart from the countless human atrocities and losses, damage, destruction, and looting of cultural heritage have taken place on a large scale. CIPA's initiative is one of various international projects that have been set up since the conflict started. The Directorate-General of Antiquities and Museums (DGAM) of Syria, as well as UNESCO with its various sub-organizations, have been central in facing the challenges during the war. Digital data capture, storage, use, and dissemination are at the heart of CIPA's strategies for recording and documenting cultural heritage, in Syria as elsewhere. It goes without saying that for conservation and restoration work, high-quality data providing metric information is of utmost importance.
National Geochronological Database
Revised by Sloan, Jan; Henry, Christopher D.; Hopkins, Melanie; Ludington, Steve; Original database by Zartman, Robert E.; Bush, Charles A.; Abston, Carl
2003-01-01
The National Geochronological Data Base (NGDB) was established by the United States Geological Survey (USGS) to collect and organize published isotopic (also known as radiometric) ages of rocks in the United States. The NGDB (originally known as the Radioactive Age Data Base, RADB) was started in 1974. A committee appointed by the Director of the USGS was given the mission to investigate the feasibility of compiling the published radiometric ages for the United States into a computerized data bank for ready access by the user community. A successful pilot program, which was conducted in 1975 and 1976 for the State of Wyoming, led to a decision to proceed with the compilation of the entire United States. For each dated rock sample reported in published literature, a record containing information on sample location, rock description, analytical data, age, interpretation, and literature citation was constructed and included in the NGDB. The NGDB was originally constructed and maintained on a mainframe computer, and later converted to a Helix Express relational database maintained on an Apple Macintosh desktop computer. The NGDB and a program to search the data files were published and distributed on Compact Disc-Read Only Memory (CD-ROM) in standard ISO 9660 format as USGS Digital Data Series DDS-14 (Zartman and others, 1995). As of May 1994, the NGDB consisted of more than 18,000 records containing over 30,000 individual ages, which is believed to represent approximately one-half the number of ages published for the United States through 1991. Because the organizational unit responsible for maintaining the database was abolished in 1996, and because we wanted to provide the data in more usable formats, we have reformatted the data, checked and edited the information in some records, and provided this online version of the NGDB. This report describes the changes made to the data and formats, and provides instructions for the use of the database in geographic information system (GIS) applications. The data are provided in .mdb (Microsoft Access), .xls (Microsoft Excel), and .txt (tab-separated value) formats. We also provide a single non-relational file that contains a subset of the data for ease of use.
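A minimal sketch of loading the tab-separated (.txt) release for further GIS or statistical work; the file path and column name used here are assumptions for illustration, since the actual NGDB field names are defined in the report itself.

import csv

ages = []
# 'ngdb_subset.txt' and the 'AGE_MA' column are assumed names.
with open("ngdb_subset.txt", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        try:
            ages.append(float(row["AGE_MA"]))
        except (KeyError, ValueError):
            continue   # skip records with missing or non-numeric ages

if ages:
    print(f"{len(ages)} ages ranging from {min(ages):.1f} to {max(ages):.1f} Ma")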
NASA Astrophysics Data System (ADS)
Pedelì, C.
2013-07-01
In order to make the most of outsourced digital documents based on new technologies (e.g., 3D laser scanners, photogrammetry), a new approach was followed and a new ad hoc information system was implemented. The resulting product allows the end user to reuse and manage the digital documents, providing graphic tools and an integrated, purpose-built database to manage the entire documentation and conservation process, from condition assessment through the conservation/restoration work. The system is organised in two main modules: Archaeology and Conservation. This paper focuses on the features and advantages of the second. In particular, it emphasises the module's logical organisation, the possibility of easy mapping using a very precise 3D metric platform, and the benefit of the integrated relational database, which allows different kinds of information to be organised, compared, kept, and managed at different levels. The Conservation module can manage, over time, the conservation process of a site, monument, object, or excavation, together with conservation work in progress. An alternative approach, called OVO by the author of this paper, requires the surveyor to observe and describe the entity by decomposing it into functional components, materials, and construction techniques. Integrated tools such as the "ICOMOS-ISCS Illustrated glossary …" help the user describe pathologies with a unified approach and terminology. The conservation project phase is also strongly supported, to envision future interventions and costs. A final section is devoted to recording the conservation/restoration work already done or in progress. All information areas of the Conservation module are interconnected, allowing the system a complete interchange of graphic and alphanumeric data. The Conservation module itself is connected to the archaeological one to create an interdisciplinary daily tool.
NASA Astrophysics Data System (ADS)
Štolc, Svorad; Bajla, Ivan
2010-01-01
In this paper we describe the basic functions of the Hierarchical Temporal Memory (HTM) network, based on a novel biologically inspired model of the large-scale structure of the mammalian neocortex. The focus of the paper is a systematic exploration of how to optimize important controlling parameters of the HTM model applied to the classification of handwritten digits from the USPS database. The statistical properties of this database are analyzed using a permutation test which employs a randomization distribution of the training and testing data. Based on a notion of the homogeneous usage of input image pixels, a methodology for HTM parameter optimization is proposed. In order to study the effects of two substantial parameters of the architecture: the…
NASA Astrophysics Data System (ADS)
Pasteka, Roman; Zahorec, Pavol; Mikuska, Jan; Szalaiova, Viktoria; Papco, Juraj; Krajnak, Martin; Kusnirak, David; Panisova, Jaroslava; Vajda, Peter; Bielik, Miroslav
2014-05-01
In this contribution, results of the ongoing project "Bouguer anomalies of new generation and the gravimetrical model of Western Carpathians (APVV-0194-10)" are presented. The existing homogenized regional database (212,478 points) was enlarged by approximately 107,500 archived detailed gravity measurements. These added gravity values were measured from 1976 to the present and therefore needed to be unified and reprocessed. Improved positions of more than 8,500 measured points were acquired by digitizing archive maps (we recognized some local errors within particular data sets). Besides the local errors (due to wrong positions, heights, or gravity values of measured points), we found some areas of systematic errors, probably due to gravity measurement or processing errors. Some of these were confirmed, and consequently corrected by field measurements, within the frame of the current project. Special attention is paid to the recalculation of the terrain corrections: we used newly developed software as well as the latest version of the digital terrain model of Slovakia, DMR-3. The main improvement of the new terrain-correction evaluation algorithm is the possibility of calculating the correction at the real gravimeter position and of using a 3D polyhedral-body approximation (accepting the spherical approximation of the Earth's curvature). We carried out several tests involving the introduction of non-standard distant relief effects. A new complete Bouguer anomaly map was constructed and transformed by means of higher-derivative operators (tilt derivatives, TDX, theta derivatives, and the new TDXAS transformation), using a regularization approach. A new and interesting regional lineament, probably of neotectonic character, was recognized in the new map of complete Bouguer anomalies and was also confirmed by in-situ field measurements.
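One of the transforms mentioned above, the tilt-angle family (TDR and TDX), can be sketched on a gridded anomaly as follows; the synthetic grid and the simple FFT-based vertical derivative are illustrative, not the project's regularized implementation.

import numpy as np

def vertical_derivative(g, dx, dy):
    # First vertical derivative via multiplication by |k| in the wavenumber domain.
    ny, nx = g.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    return np.real(np.fft.ifft2(np.fft.fft2(g) * k))

# Synthetic Bouguer anomaly: a single smooth high on a 1 km grid spacing.
x, y = np.meshgrid(np.arange(128.0), np.arange(128.0))
g = 10.0 * np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 15.0 ** 2))

gy, gx = np.gradient(g, 1.0, 1.0)      # horizontal derivatives (rows = y, cols = x)
thdr = np.hypot(gx, gy)                # total horizontal derivative
vdr = vertical_derivative(g, 1.0, 1.0)

tdr = np.arctan2(vdr, thdr)            # tilt derivative
tdx = np.arctan2(thdr, np.abs(vdr))    # TDX transform
print(tdr.shape, float(tdx.max()))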
Concordance of Commercial Data Sources for Neighborhood-Effects Studies
Schootman, Mario
2010-01-01
Growing evidence supports a relationship between neighborhood-level characteristics and important health outcomes. One source of neighborhood data includes commercial databases integrated with geographic information systems to measure availability of certain types of businesses or destinations that may have either favorable or adverse effects on health outcomes; however, the quality of these data sources is generally unknown. This study assessed the concordance of two commercial databases for ascertaining the presence, locations, and characteristics of businesses. Businesses in the St. Louis, Missouri area were selected based on their four-digit Standard Industrial Classification (SIC) codes and classified into 14 business categories. Business listings in the two commercial databases were matched by standardized business name within specified distances. Concordance and coverage measures were calculated using capture–recapture methods for all businesses and by business type, with further stratification by census-tract-level population density, percent below poverty, and racial composition. For matched listings, distance between listings and agreement in four-digit SIC code, sales volume, and employee size were calculated. Overall, the percent agreement was 32% between the databases. Concordance and coverage estimates were lowest for health-care facilities and leisure/entertainment businesses; highest for popular walking destinations, eating places, and alcohol/tobacco establishments; and varied somewhat by population density. The mean distance (SD) between matched listings was 108.2 (179.0) m with varying levels of agreement in four-digit SIC (percent agreement = 84.6%), employee size (weighted kappa = 0.63), and sales volume (weighted kappa = 0.04). Researchers should cautiously interpret findings when using these commercial databases to yield measures of the neighborhood environment. PMID:20480397
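The capture-recapture idea behind the coverage estimates can be sketched as follows; the counts are invented, and the Chapman bias-corrected estimator is used here as one common choice rather than as the study's exact method.

# Invented counts for one business category in two commercial databases.
n_a = 420        # listings in database A
n_b = 385        # listings in database B
matched = 150    # listings matched between A and B by name and distance

# Chapman's bias-corrected capture-recapture estimate of the true number
# of businesses, treating matches as "recaptures".
n_hat = (n_a + 1) * (n_b + 1) / (matched + 1) - 1
coverage_a = n_a / n_hat
coverage_b = n_b / n_hat
percent_agreement = matched / (n_a + n_b - matched)   # matches over the union

print(f"estimated total businesses: {n_hat:.0f}")
print(f"coverage of A: {coverage_a:.2f}, coverage of B: {coverage_b:.2f}")
print(f"percent agreement: {percent_agreement:.1%}")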
AN ASSESSMENT OF GROUND TRUTH VARIABILITY USING A "VIRTUAL FIELD REFERENCE DATABASE"
A "Virtual Field Reference Database (VFRDB)" was developed from field measurment data that included location and time, physical attributes, flora inventory, and digital imagery (camera) documentation foy 1,01I sites in the Neuse River basin, North Carolina. The sampling f...
Completion of the National Land Cover Database (NLCD) 1992-2001 Land Cover Change Retrofit Product
The Multi-Resolution Land Characteristics Consortium has supported the development of two national digital land cover products: the National Land Cover Dataset (NLCD) 1992 and National Land Cover Database (NLCD) 2001. Substantial differences in imagery, legends, and methods betwe...
[Interface interconnection and data integration in implementing of digital operating room].
Feng, Jingyi; Chen, Hua; Liu, Jiquan
2011-10-01
The digital operating room, with highly integrated clinical information, is very important for saving patients' lives and improving the quality of operations. Since equipment in domestic operating rooms has diversified interfaces and nonstandard communication protocols, designing and implementing an integrated data-sharing program for different kinds of diagnostic, monitoring, and treatment equipment becomes a key point in the construction of a digital operating room. This paper addresses interface interconnection and data integration for commonly used clinical equipment from the aspects of hardware interface, interface connection, and communication protocol, and offers a solution for the interconnection and integration of clinical equipment in a heterogeneous environment. Based on this solution, a case of an optimized digital operating room is presented. Compared with international solutions for the digital operating room, the solution proposed in this paper is more economical and effective. Finally, this paper provides a proposal for the platform construction of the digital operating room as well as a viewpoint on the standardization of domestic clinical equipment.
ClearedLeavesDB: an online database of cleared plant leaf images
2014-01-01
Background Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. Description The Cleared Leaf Image Database (ClearedLeavesDB), is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. Conclusions We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985
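An upload through a web API of this kind might look roughly as follows in Python; the endpoint path, field names, and token are assumptions for illustration, since the actual ClearedLeavesDB client application defines its own interface.

import requests

API_URL = "http://clearedleavesdb.org/api/upload"   # assumed endpoint path
payload = {
    "species": "Quercus alba",                      # example metadata
    "collection": "my_lab_cleared_leaves",
    "vein_density_mm_per_mm2": 4.2,                 # example analysis result
}
with open("leaf_0001_cleared.png", "rb") as fh:     # local processed image
    resp = requests.post(API_URL,
                         data=payload,
                         files={"image": fh},
                         headers={"Authorization": "Token <your-api-token>"},
                         timeout=60)
print(resp.status_code)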
ClearedLeavesDB: an online database of cleared plant leaf images.
Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S
2014-03-28
Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB), is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.
The MAO NASU Plate Archive Database. Current Status and Perspectives
NASA Astrophysics Data System (ADS)
Pakuliak, L. K.; Sergeeva, T. P.
2006-04-01
The preliminary online version of the database of the MAO NASU plate archive is built on the relational database management system MySQL. It permits easy supplementing of the database with new collections of astronegatives and provides high flexibility in constructing SQL queries for data search optimization, PHP Basic Authorization-protected access to the administrative interface, and a wide range of search parameters. The current status of the database will be reported, and a brief description of the search engine and of the means of supporting database integrity will be given. Methods and means of data verification and tasks for further development will be discussed.
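The flexible construction of SQL queries from user-supplied search parameters can be sketched as follows; the table and column names are assumptions for illustration, and the %s placeholders follow the MySQL client convention.

def build_plate_query(object_name=None, date_from=None, date_to=None,
                      emulsion=None):
    # Turn optional search parameters into a parameterized SQL statement;
    # 'plates' and its columns are assumed names, not the real schema.
    clauses, params = [], []
    if object_name:
        clauses.append("object_name LIKE %s")
        params.append(f"%{object_name}%")
    if date_from:
        clauses.append("obs_date >= %s")
        params.append(date_from)
    if date_to:
        clauses.append("obs_date <= %s")
        params.append(date_to)
    if emulsion:
        clauses.append("emulsion = %s")
        params.append(emulsion)
    sql = "SELECT plate_id, object_name, obs_date FROM plates"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

print(build_plate_query(object_name="M31", date_from="1976-01-01"))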
47 CFR 74.793 - Digital low power TV and TV translator station protection of broadcast stations.
Code of Federal Regulations, 2010 CFR
2010-10-01
Title 47—Telecommunication; Low Power TV, TV Translator, and TV Booster Stations; § 74.793 Digital low power TV and TV translator station protection of broadcast stations. (a) An application to construct a new digital low power…
A pier-scour database: 2,427 field and laboratory measurements of pier scour
Benedict, Stephen T.; Caldwell, Andral W.
2014-01-01
The U.S. Geological Survey conducted a literature review to identify potential sources of published pier-scour data, and selected data were compiled into a digital spreadsheet called the 2014 USGS Pier-Scour Database (PSDb-2014) consisting of 569 laboratory and 1,858 field measurements. These data encompass a wide range of laboratory and field conditions and represent field data from 23 States within the United States and from 6 other countries. The digital spreadsheet is available on the Internet and offers a valuable resource to engineers and researchers seeking to understand pier-scour relations in the laboratory and field.
NASA Astrophysics Data System (ADS)
Koppers, A. A.; Staudigel, H.; Mills, H.; Keller, M.; Wallace, A.; Bachman, N.; Helly, J.; Helly, M.; Miller, S. P.; Massell Symons, C.
2004-12-01
To bridge the gap between Earth science teachers, librarians, scientists and data archive managers, we have started the ERESE project that will create, archive and make available "Enduring Resources in Earth Science Education" through information technology (IT) portals. In the first phase of this National Science Digital Library (NSDL) project, we are focusing on the development of these ERESE resources for middle and high school teachers to be used in lesson plans with "plate tectonics" and "magnetics" as their main theme. In this presentation, we will show how these new ERESE resources are being generated, how they can be uploaded via online web wizards, how they are archived, how we make them available via the EarthRef.org Digital Archive (ERDA) and Reference Database (ERR), and how they relate to the SIOExplorer database containing data objects for all seagoing cruises carried out by the Scripps Institution of Oceanography. The EarthRef.org web resource uses the vision of a "general description" of the Earth as a geological system to provide an IT infrastructure for the Earth sciences. This emphasizes the marriage of the "scientific process" (and its results) with an educational cyber-infrastructure for teaching Earth sciences at any level, from middle school to college and graduate levels. Eight different databases reside under EarthRef.org, of which ERDA holds any digital object uploaded free of charge by scientists, teachers and students, while the ERR holds more than 80,000 publications. For more than 1,500 of these publications, this latter database makes JPG/PDF images of the abstracts, data tables, methods and appendices available for downloading, together with their digitized contents in Microsoft Word and Excel format. Both holdings are used to store the ERESE objects generated by a group of undergraduate students in the Environmental Systems (ESYS) program at UCSD with an emphasis on the Earth sciences. These students perform library and internet research in order to design and generate these "Enduring Resources in Earth Science Education", which they test by closely interacting with the research faculty at the Scripps Institution of Oceanography. Typical ERESE resources include diagrams, model cartoons, maps, data sets for analysis, and glossary items and essays that explain particular Earth science concepts, ready to be used in the classroom.
Katzman, G L
2001-03-01
The goal of the project was to create a method by which an in-house digital teaching file could be constructed that was simple, inexpensive, independent of hypertext markup language (HTML) restrictions, and appears identical on multiple platforms. To accomplish this, Microsoft PowerPoint and Adobe Acrobat were used in succession to assemble digital teaching files in the Acrobat portable document file format. They were then verified to appear identically on computers running Windows, Macintosh Operating Systems (OS), and the Silicon Graphics Unix-based OS as either a free-standing file using Acrobat Reader software or from within a browser window using the Acrobat browser plug-in. This latter display method yields a file viewed through a browser window, yet remains independent of underlying HTML restrictions, which may confer an advantage over simple HTML teaching file construction. Thus, a hybrid of HTML-distributed Adobe Acrobat generated WWW documents may be a viable alternative for digital teaching file construction and distribution.
Global GIS database; digital atlas of Central and South America
Hearn,, Paul P.; Hare, T.; Schruben, P.; Sherrill, D.; LaMar, C.; Tsushima, P.
2000-01-01
This CD-ROM contains a digital atlas of the countries of Central and South America. This atlas is part of a global database compiled from USGS and other data sources at the nominal scale of 1:1 million and is intended to be used as a regional-scale reference and analytical tool by government officials, researchers, the private sector, and the general public. The atlas includes free GIS software or may also be used with ESRI's ArcView software. Customized ArcView tools, specifically designed to make the atlas easier to use, are also included. The atlas contains the following datasets: country political boundaries, digital shaded relief map, elevation, slope, hydrology, locations of cities and towns, airfields, roads, railroads, utility lines, population density, geology, ecological regions, historical seismicity, volcanoes, ore deposits, oil and gas fields, climate data, landcover, vegetation index, and lights at night.
ERIC Educational Resources Information Center
Brooks, Sam; Dorst, Thomas J.
2002-01-01
Discusses the role of consortia in academic libraries, specifically the Illinois Digital Academic Library (IDAL), and describes a study conducted by the IDAL that investigated issues surrounding full text database research including stability of content, vendor communication, embargo periods, publisher concerns, quality of content, linking and…
Local Places, Global Connections: Libraries in the Digital Age. What's Going On Series.
ERIC Educational Resources Information Center
Benton Foundation, Washington, DC.
Libraries have long been pivotal community institutions--public spaces where people can come together to learn, reflect, and interact. Today, information is rapidly spreading beyond books and journals to digital government archives, business databases, electronic sound and image collections, and the flow of electronic impulses over computer…
Fact or Fiction? Libraries Can Thrive in the Digital Age
ERIC Educational Resources Information Center
Harris, Christopher
2014-01-01
Today's school library uses an increasing number of digital resources to supplement a print collection that is moving more toward fiction and literary non-fiction. Supplemental resources, including streaming video, online resources, subscription databases, audiobooks, e-books, and even games, round out the new collections. Despite the best…
NASA Astrophysics Data System (ADS)
Bürmen, Miran; Usenik, Peter; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan
2011-03-01
Dental caries is a disease characterized by demineralization of enamel crystals leading to the penetration of bacteria into the dentin and pulp. If left untreated, the disease can lead to pain, infection and tooth loss. Early detection of enamel demineralization resulting in increased enamel porosity, commonly known as white spots, is a difficult diagnostic task. Several papers have reported near infrared (NIR) spectroscopy to be a potentially useful noninvasive technique for early detection of caries lesions. However, the studies conducted were mostly qualitative and did not include a critical assessment of the spectral variability of sound and carious dental tissues or the influence of water content. Such an assessment is essential for the development and validation of reliable qualitative and especially quantitative diagnostic tools based on NIR spectroscopy. In order to characterize the described spectral variability, a standardized diffuse reflectance hyper-spectral database was constructed by imaging 12 extracted human teeth with natural lesions of various degrees in the spectral range from 900 to 1700 nm with a spectral resolution of 10 nm. Additionally, all the teeth were imaged with a digital color camera. The influence of water content on the acquired spectra was characterized by monitoring the teeth during the drying process. The images were assessed by an expert, thereby obtaining the gold standard. By analyzing the acquired spectra we were able to accurately model the spectral variability of the sound dental tissues and identify the advantages and limitations of NIR hyper-spectral imaging.
The Identity Mapping Project: Demographic differences in patterns of distributed identity.
Gilbert, Richard L; Dionisio, John David N; Forney, Andrew; Dorin, Philip
2015-01-01
The advent of cloud computing and a multi-platform digital environment is giving rise to a new phase of human identity called "The Distributed Self." In this conception, aspects of the self are distributed into a variety of 2D and 3D digital personas with the capacity to reflect any number of combinations of now malleable personality traits. In this way, the source of human identity remains internal and embodied, but the expression or enactment of the self becomes increasingly external, disembodied, and distributed on demand. The Identity Mapping Project (IMP) is an interdisciplinary collaboration between psychology and computer Science designed to empirically investigate the development of distributed forms of identity. Methodologically, it collects a large database of "identity maps" - computerized graphical representations of how active someone is online and how their identity is expressed and distributed across 7 core digital domains: email, blogs/personal websites, social networks, online forums, online dating sites, character based digital games, and virtual worlds. The current paper reports on gender and age differences in online identity based on an initial database of distributed identity profiles.
Sackstein, M
2006-10-01
Over the last five years digital photography has become ubiquitous. For the family photo album, a 4 or 5 megapixel camera costing about 2000 NIS will produce satisfactory results for most people. However, for intra-oral photography the common wisdom holds that only professional photographic equipment is up to the task. Such equipment typically costs around 12,000 NIS and includes the camera body, an attachable macro lens and a ringflash. The following article challenges this conception. Although professional equipment does produce the most exemplary results, a highly effective database of clinical pictures can be compiled even with a "non-professional" digital camera. Since the year 2002, my clinical work has been routinely documented with digital cameras of the Nikon CoolPix series. The advantages are that these digicams are economical both in price and in size and allow easy transport and operation when compared to their expensive and bulky professional counterparts. The details of how to use a non-professional digicam to produce and maintain an effective clinical picture database, for documentation, monitoring, demonstration and professional fulfillment, are described below.
Burisch, Johan; Cukovic-Cavka, Silvija; Kaimakliotis, Ioannis; Shonová, Olga; Andersen, Vibeke; Dahlerup, Jens F; Elkjaer, Margarita; Langholz, Ebbe; Pedersen, Natalia; Salupere, Riina; Kolho, Kaija-Leena; Manninen, Pia; Lakatos, Peter Laszlo; Shuhaibar, Mary; Odes, Selwyn; Martinato, Matteo; Mihu, Ion; Magro, Fernando; Belousova, Elena; Fernandez, Alberto; Almer, Sven; Halfvarson, Jonas; Hart, Ailsa; Munkholm, Pia
2011-08-01
The EpiCom study investigates a possible East-West gradient in the incidence of IBD in Europe and its association with environmental factors. A secured web-based database is used to facilitate and centralize data registration. The aim was to construct and validate a web-based inception cohort database available in both English and Russian. The EpiCom database has been constructed in collaboration with all 34 participating centers. The database was translated into Russian using forward translation; patient questionnaires were translated by simplified forward-backward translation. Data entry requires fulfillment of international diagnostic criteria and covers disease activity, medical therapy, quality of life, work productivity and activity impairment, outcome of pregnancy, surgery, cancer and death. Data are secured by the WinLog3 System, developed in cooperation with the Danish Data Protection Agency. Validation of the database has been performed in two consecutive rounds, each followed by corrections in accordance with comments. The EpiCom database fulfills the requirements of the participating countries' local data security agencies by being stored at a single location. The database was found overall to be "good" or "very good" by 81% of the participants after the second validation round, and the general applicability of the database was evaluated as "good" or "very good" by 77%. In the inclusion period January 1st to December 31st, 2010, 1,336 IBD patients were included in the database. A user-friendly, tailor-made and secure web-based inception cohort database has been successfully constructed, facilitating remote data input. The incidence of IBD in 23 European countries can be found at www.epicom-ecco.eu. Copyright © 2011 European Crohn's and Colitis Organisation. All rights reserved.
APPLICATION OF A "VIRTUAL FIELD REFERENCE DATABASE" TO ASSESS LAND-COVER MAP ACCURACIES
An accuracy assessment was performed for the Neuse River Basin, NC land-cover/use
(LCLU) mapping results using a "Virtual Field Reference Database (VFRDB)". The VFRDB was developed using field measurement and digital imagery (camera) data collected at 1,409 sites over a perio...
Guidelines for establishing and maintaining construction quality databases.
DOT National Transportation Integrated Search
2006-11-01
The main objective of this study was to develop and present guidelines for State highway agencies (SHAs) in establishing and maintaining database systems geared towards construction quality issues for asphalt and concrete paving projects. To accompli...
Access control based on attribute certificates for medical intranet applications.
Mavridis, I; Georgiadis, C; Pangalos, G; Khair, M
2001-01-01
Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which recognize an entity or its attributes) can be used to control access in clinical intranet applications. To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Our proposed access control system consists of two phases: the ways users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy.
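The abstract describes an access-control flow built from three certificate types. The toy model below illustrates how such a decision might combine them: an identity certificate authenticates the user, an attribute certificate supplies certified roles, and access-rule certificates carry the policy. This is a hedged sketch of the general idea, not the DIMEDAC implementation; all class and field names are invented.

```python
# Illustrative (not DIMEDAC) model of an access decision built from identity,
# attribute, and access-rule certificates.
from dataclasses import dataclass

@dataclass
class IdentityCertificate:
    subject: str
    verified: bool             # result of signature/path validation (assumed done)

@dataclass
class AttributeCertificate:
    subject: str
    roles: frozenset           # e.g. {"physician", "ward-3"}

@dataclass
class AccessRuleCertificate:
    resource: str
    action: str
    required_roles: frozenset  # roles allowed to perform the action

def access_decision(identity, attributes, rules, resource, action):
    """Grant access only if the user is authenticated and some access rule
    for (resource, action) is satisfied by the user's certified roles."""
    if not identity.verified or identity.subject != attributes.subject:
        return False
    for rule in rules:
        if rule.resource == resource and rule.action == action:
            if rule.required_roles <= attributes.roles:
                return True
    return False

if __name__ == "__main__":
    idc = IdentityCertificate("dr.adams", verified=True)
    atc = AttributeCertificate("dr.adams", frozenset({"physician"}))
    policy = [AccessRuleCertificate("patient-record", "read", frozenset({"physician"}))]
    print(access_decision(idc, atc, policy, "patient-record", "read"))  # True
```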
An Integrated Bathymetric and Topographic Digital Terrain Model of the Canadian Arctic Archipelago
NASA Astrophysics Data System (ADS)
Alm, G.; Macnab, R.; Jakobsson, M.; Kleman, J.; McCracken, M.
2002-12-01
Currently, the International Bathymetric Chart of the Arctic Ocean (IBCAO) [Jakobsson et al. 2000], contains the most up-to-date digital bathymetric model of the entire Canadian Arctic Archipelago. IBCAO is a seamless bathymetric/topographic Digital Terrain Model (DTM) that incorporates three primary data sets: all available bathymetric data at the time of compilation; the US Geological Survey GTOPO30 topographic data; and the World Vector Shoreline for coastline representation. The horizontal grid cell size is 2.5 x 2.5 km on a Polar Stereographic projection, which is adequate for regional visualization and analysis, but which may not be sufficient for certain geoscientific and oceanographic applications. However, the database that was constructed during the IBCAO project holds bathymetric data of a high quality throughout most of the Canadian Arctic Archipelago, justifying a compilation resolution that is better than 2.5 x 2.5 km. This data is primarily from historical hydrographic surveys that were carried out by the Canadian Hydrographic Survey (CHS). The construction of a higher resolution bathymetry/topography DTM of the Canadian Arctic Archipelago (complete with an error estimation of interpolated grid cells) requires a consideration of historical metadata which contains detailed descriptions of horizontal and vertical datums, positioning systems, and the depth sounding systems that were deployed during individual surveys. A significant portion of this metadata does not exist in digital form; it was not available during the IBCAO compilation, although due to the relatively low resolution of the original DTM (2.5 x 2.5 km), its absence was considered a lesser problem. We have performed "data detective" work and have extracted some of the more crucial metadata from CHS archives and are thus able to present a preliminary version of a seamless Digital Terrain Model of the Canadian Arctic Archipelago. This represents a significant improvement over the original IBCAO DTM in this area. The use of a merged seamless bathymetry/topography model substantially facilitates the overlay and incorporation of other spatially referenced geological and geophysical datasets. For example, one intended use of the model is to merge the results from the mapping of regional glacial morphology features, in order to further address the glacial history of the region. Jakobsson, M., Cherkis, N., Woodward, J., Coakley, B., and Macnab, R., 2000, A new grid of Arctic bathymetry: A significant resource for scientists and mapmakers, EOS Transactions, American Geophysical Union, v. 81, no. 9, p. 89, 93, 96.
The place-value of a digit in multi-digit numbers is processed automatically.
Kallai, Arava Y; Tzelgov, Joseph
2012-09-01
The automatic processing of the place-value of digits in a multi-digit number was investigated in 4 experiments. Experiment 1 and two control experiments employed a numerical comparison task in which the place-value of a non-zero digit was varied in a string composed of zeros. Experiment 2 employed a physical comparison task in which strings of digits varied in their physical sizes. In both types of tasks, the place-value of the non-zero digit in the string was irrelevant to the task performed. Interference of the place-value information was found in both tasks. When the non-zero digit occupied a lower place-value, it was recognized slower as a larger digit or as written in a larger font size. We concluded that place-value in a multi-digit number is processed automatically. These results support the notion of a decomposed representation of multi-digit numbers in memory. PsycINFO Database Record (c) 2012 APA, all rights reserved.
Parallel database search and prime factorization with magnonic holographic memory devices
NASA Astrophysics Data System (ADS)
Khitun, Alexander
2015-12-01
In this work, we describe the capabilities of Magnonic Holographic Memory (MHM) for parallel database search and prime factorization. MHM is a type of holographic device, which utilizes spin waves for data transfer and processing. Its operation is based on the correlation between the phases and the amplitudes of the input spin waves and the output inductive voltage. The input of MHM is provided by the phased array of spin wave generating elements, allowing the production of phase patterns of arbitrary form. The latter makes it possible to code logic states into the phases of propagating waves and exploit wave superposition for parallel data processing. We present the results of numerical modeling illustrating parallel database search and prime factorization. The results of numerical simulations on the database search are in agreement with the available experimental data. The use of classical wave interference may result in a significant speedup over conventional digital logic circuits in special-task data processing (e.g., a √n speedup in database search). Potentially, magnonic holographic devices can be implemented as complementary logic units to digital processors. Physical limitations and technological constraints of the spin wave approach are also discussed.
[Research report of experimental database establishment of digitized virtual Chinese No.1 female].
Zhong, Shi-zhen; Yuan, Lin; Tang, Lei; Huang, Wen-hua; Dai, Jing-xing; Li, Jian-yi; Liu, Chang; Wang, Xing-hai; Li, Hua; Luo, Shu-qian; Qin, Dulie; Zeng, Shao-qun; Wu, Tao; Zhang, Mei-chao; Wu, Kun-cheng; Jiao, Pei-feng; Lu, Yun-tao; Chen, Hao; Li, Pei-liang; Gao, Yuan; Wang, Tong; Fan, Ji-hong
2003-03-01
To establish the digitized virtual Chinese No.1 female (VCH-F1) image database. A 19-year-old female cadaver was scanned by CT and MRI, and perfused with red filling material through the femoral artery before freezing and embedding. The whole body was cut with a JZ1500A vertical milling machine at 0.2 mm intervals. All images were captured with a Fuji FinePix S2 Pro camera. The body index of VCH-F1 was 94%. We cut 8,556 sections of the whole body; each image was 17.5 MB in size and the whole database reached 149.7 GB. In total, 6 versions of the database exist for different applications. Compared with other databases, VCH-F1 provides a good representation of the Chinese body shape, and the colored filling material in the blood vessels provides enough information for future registration and segmentation. Vertical embedding and cutting helped to retain normal human physiological posture, and the image quality and operation efficiency were improved by using various techniques such as one-time freezing and fixation, a double-temperature icehouse, a large-diameter milling disc and whole-body cutting.
Constructing a Geology Ontology Using a Relational Database
NASA Astrophysics Data System (ADS)
Hou, W.; Yang, L.; Yin, S.; Ye, J.; Clarke, K.
2013-12-01
In the geology community, the creation of a common geology ontology has become a useful means to solve problems of data integration, knowledge transformation and the interoperation of multi-source, heterogeneous and multiple scale geological data. Currently, human-computer interaction methods and relational database-based methods are the primary ontology construction methods. Some human-computer interaction methods such as the Geo-rule based method, the ontology life cycle method and the module design method have been proposed for applied geological ontologies. Essentially, the relational database-based method is reverse engineering of semantic information abstracted from an existing database. The key is to construct rules for the transformation of database entities into the ontology. Relative to the human-computer interaction method, relational database-based methods can use existing resources and the stated semantic relationships among geological entities. However, two problems challenge the development and application. One is the transformation of multiple inheritances and nested relationships and their representation in an ontology. The other is that most of these methods do not measure the semantic retention of the transformation process. In this study, we focused on constructing a rule set to convert the semantics in a geological database into a geological ontology. According to the relational schema of a geological database, a conversion approach is presented to convert a geological spatial database to an OWL-based geological ontology, which is based on identifying semantics such as entities, relationships, inheritance relationships, nested relationships and cluster relationships. The semantic integrity of the transformation was verified using an inverse mapping process. In the geological ontology, inheritance and union operations between superclasses and subclasses were used to represent the nested relationships in a geochronology and the multiple-inheritance relationships. Based on a Quaternary database of downtown Foshan city, Guangdong Province, in southern China, a geological ontology was constructed using the proposed method. To measure how well semantics were maintained in the conversion process and its results, an inverse mapping from the ontology to a relational database was tested based on a proposed conversion rule. The comparison of schema and entities and the reduction of tables between the inverse database and the original database illustrated that the proposed method retains the semantic information well during the conversion process. An application for abstracting sandstone information showed that semantic relationships among concepts in the geological database were successfully reorganized in the constructed ontology. Key words: geological ontology; geological spatial database; multiple inheritance; OWL Acknowledgement: This research is jointly funded by the Specialized Research Fund for the Doctoral Program of Higher Education of China (RFDP) (20100171120001), NSFC (41102207) and the Fundamental Research Funds for the Central Universities (12lgpy19).
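To make the core idea of the relational-to-ontology conversion concrete, the sketch below turns each row of a small (hypothetical) geological-unit table into an OWL class and maps a parent_id column to an rdfs:subClassOf link. The table layout and namespace are assumptions for illustration; the paper's actual rule set also handles nested, multiple-inheritance and cluster relationships.

```python
# Toy relational-to-OWL conversion: rows become OWL classes, parent_id becomes
# rdfs:subClassOf. Schema and URIs are invented for illustration.
import sqlite3
from rdflib import Graph, Namespace, RDF, RDFS, OWL, Literal

GEO = Namespace("http://example.org/geology#")   # hypothetical namespace

def table_to_ontology(conn):
    g = Graph()
    g.bind("geo", GEO)
    rows = conn.execute(
        "SELECT unit_id, unit_name, parent_id FROM geologic_unit").fetchall()
    for unit_id, unit_name, parent_id in rows:
        cls = GEO[f"Unit_{unit_id}"]
        g.add((cls, RDF.type, OWL.Class))
        g.add((cls, RDFS.label, Literal(unit_name)))
        if parent_id is not None:                 # inheritance read from the schema
            g.add((cls, RDFS.subClassOf, GEO[f"Unit_{parent_id}"]))
    return g

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE geologic_unit (unit_id INT, unit_name TEXT, parent_id INT)")
    conn.executemany("INSERT INTO geologic_unit VALUES (?, ?, ?)",
                     [(1, "Quaternary deposits", None), (2, "Alluvial sand", 1)])
    print(table_to_ontology(conn).serialize(format="turtle"))
```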
Digital mapping techniques '06 - Workshop proceedings
Soller, David R.
2007-01-01
The Digital Mapping Techniques '06 (DMT'06) workshop was attended by more than 110 technical experts from 51 agencies, universities, and private companies, including representatives from 27 state geological surveys (see Appendix A of these Proceedings). This workshop was similar in nature to the previous nine meetings, which were held in Lawrence, Kansas (Soller, 1997), Champaign, Illinois (Soller, 1998), Madison, Wisconsin (Soller, 1999), Lexington, Kentucky (Soller, 2000), Tuscaloosa, Alabama (Soller, 2001), Salt Lake City, Utah (Soller, 2002), Millersville, Pennsylvania (Soller, 2003), Portland, Oregon (Soller, 2004), and Baton Rouge, Louisiana (Soller, 2005). This year's meeting was hosted by the Ohio Geological Survey, from June 11-14, 2006, on the Ohio State University campus in Columbus, Ohio. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, the latter of which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database - and for the State and Federal geological surveys - to provide more high-quality digital maps to the public. At the 2006 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, "publishing" includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.
ERIC Educational Resources Information Center
Lancor, Rachael; Lancor, Brian
2014-01-01
In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…
Digital Media and "Girling" at an Elite Girls' School
ERIC Educational Resources Information Center
Charles, Claire
2007-01-01
In this article, I draw on Judith Butler's notion of performativity to investigate the role of digital technologies in processes of gendered subjectification (or "girling") in elite girls' education. Elite girls' schooling is a site where the potential of digital technologies in mediating student-led constructions and explorations of…
The Profiles in Science Digital Library: Behind the Scenes.
Gallagher, Marie E; Moffatt, Christie
2012-01-01
This demonstration shows the Profiles in Science ® digital library. Profiles in Science contains digitized selections from the personal manuscript collections of prominent biomedical researchers, medical practitioners, and those fostering science and health. The Profiles in Science Web site is the delivery mechanism for content derived from the digital library system. The system is designed according to our basic principles for digital library development [1]. The digital library includes the rules and software used for digitizing items, creating and editing database records and performing quality control as well as serving the digital content to the public. Among the types of data managed by the digital library are detailed item-level, collection-level and cross-collection metadata, digitized photographs, papers, audio clips, movies, born-digital electronic files, optical character recognized (OCR) text, and annotations (see Figure 1). The digital library also tracks the status of each item, including digitization quality, sensitivity of content, and copyright. Only items satisfying all required criteria are released to the public through the World Wide Web. External factors have influenced all aspects of the digital library's infrastructure.
Stochastic Downscaling of Digital Elevation Models
NASA Astrophysics Data System (ADS)
Rasera, Luiz Gustavo; Mariethoz, Gregoire; Lane, Stuart N.
2016-04-01
High-resolution digital elevation models (HR-DEMs) are extremely important for the understanding of small-scale geomorphic processes in Alpine environments. In the last decade, remote sensing techniques have experienced a major technological evolution, enabling fast and precise acquisition of HR-DEMs. However, sensors designed to measure elevation data still feature different spatial resolution and coverage capabilities. Terrestrial altimetry allows the acquisition of HR-DEMs with centimeter to millimeter-level precision, but only within small spatial extents and often with dead ground problems. Conversely, satellite radiometric sensors are able to gather elevation measurements over large areas but with limited spatial resolution. In the present study, we propose an algorithm to downscale low-resolution satellite-based DEMs using topographic patterns extracted from HR-DEMs derived for example from ground-based and airborne altimetry. The method consists of a multiple-point geostatistical simulation technique able to generate high-resolution elevation data from low-resolution digital elevation models (LR-DEMs). Initially, two collocated DEMs with different spatial resolutions serve as an input to construct a database of topographic patterns, which is also used to infer the statistical relationships between the two scales. High-resolution elevation patterns are then retrieved from the database to downscale a LR-DEM through a stochastic simulation process. The output of the simulations are multiple equally probable DEMs with higher spatial resolution that also depict the large-scale geomorphic structures present in the original LR-DEM. As these multiple models reflect the uncertainty related to the downscaling, they can be employed to quantify the uncertainty of phenomena that are dependent on fine topography, such as catchment hydrological processes. The proposed methodology is illustrated for a case study in the Swiss Alps. A swissALTI3D HR-DEM (with 5 m resolution) and a SRTM-derived LR-DEM from the Western Alps are used to downscale a SRTM-based LR-DEM from the eastern part of the Alps. The results show that the method is capable of generating multiple high-resolution synthetic DEMs that reproduce the spatial structure and statistics of the original DEM.
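The core of the method is a database of coarse-to-fine patterns learned where both resolutions exist, from which fine-scale blocks are sampled stochastically for a new coarse grid. The toy numpy sketch below illustrates that idea with a 3x3 coarse neighbourhood and k-nearest-pattern sampling; array names, scale factor and the synthetic data are assumptions, and the real multiple-point geostatistical algorithm is considerably more sophisticated.

```python
# Toy multiple-point pattern-database downscaling: learn (coarse 3x3 patch ->
# fine block) pairs from a training DEM pair, then synthesise one stochastic
# fine-scale realisation for a new coarse DEM.
import numpy as np

def build_pattern_db(lr_train, hr_train, factor):
    """Collect (flattened 3x3 coarse neighbourhood, fine block) pairs."""
    lr_pad = np.pad(lr_train, 1, mode="edge")
    patterns, blocks = [], []
    for i in range(lr_train.shape[0]):
        for j in range(lr_train.shape[1]):
            patterns.append(lr_pad[i:i+3, j:j+3].ravel())
            blocks.append(hr_train[i*factor:(i+1)*factor, j*factor:(j+1)*factor])
    return np.array(patterns), np.array(blocks)

def downscale(lr_target, patterns, blocks, factor, k=5, rng=None):
    """One stochastic realisation of a fine grid for the target coarse DEM."""
    rng = rng or np.random.default_rng()
    lr_pad = np.pad(lr_target, 1, mode="edge")
    out = np.empty((lr_target.shape[0]*factor, lr_target.shape[1]*factor))
    for i in range(lr_target.shape[0]):
        for j in range(lr_target.shape[1]):
            query = lr_pad[i:i+3, j:j+3].ravel()
            dist = np.sum((patterns - query) ** 2, axis=1)
            pick = rng.choice(np.argsort(dist)[:k])   # sample among k nearest patterns
            out[i*factor:(i+1)*factor, j*factor:(j+1)*factor] = blocks[pick]
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hr_train = rng.normal(size=(40, 40)).cumsum(0).cumsum(1)          # synthetic terrain
    lr_train = hr_train.reshape(10, 4, 10, 4).mean(axis=(1, 3))       # 4x coarsening
    lr_target = lr_train + rng.normal(scale=0.1, size=lr_train.shape)
    patterns, blocks = build_pattern_db(lr_train, hr_train, factor=4)
    realisation = downscale(lr_target, patterns, blocks, factor=4, rng=rng)
    print(realisation.shape)   # (40, 40)
```

Repeating the downscale call with different random seeds gives the multiple equally probable realisations the abstract mentions, whose spread can be used as an uncertainty estimate.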
NASA Astrophysics Data System (ADS)
Dong, Huaimin; Sun, Jianmeng; Lin, Zhenzhou; Fang, Hui; Li, Yafen; Cui, Likai; Yan, Weichao
2018-02-01
Natural gas hydrate is being considered as an alternative energy source for sustainable development and has become a focus of research throughout the world. In this paper, based on CT scanning images of hydrate reservoir rocks, combined with the microscopic distribution of hydrate, a diffusion limited aggregation (DLA) model was used to construct 3D hydrate digital rocks of different distribution types, and the finite-element method was used to simulate their electrical characteristics in order to study the influence of different hydrate distribution types, hydrate saturation and formation water salinity on electrical properties. The results show that the hydrate digital rocks constructed using the DLA model can be used to characterize the microscopic distribution of different types of hydrates. Under the same conditions, the resistivity of the adhesive hydrate digital rock is higher than that of the cemented and scattered type digital rocks, and the resistivity of the scattered hydrate digital rock is the smallest among the three types. Moreover, the difference in the resistivity of the different types of hydrate digital rocks increases with an increase in hydrate saturation, especially when the saturation is larger than 55%, and the rate of increase of each of the hydrate types is quite different. Similarly, the resistivity of the three hydrate types decreases with an increase in formation water salinity. The constructed single-distribution hydrate digital rocks, combined with the laws relating microscopic distribution and saturation to the electrical properties, can effectively improve the accuracy of log-based identification of hydrate reservoirs and are of great significance for the estimation of hydrate reserves.
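For readers unfamiliar with the DLA growth model the paper uses to place hydrate in the digital rock, the sketch below is a tiny 2D version: random walkers stick to a growing aggregate when they touch it. The grid size and particle count are arbitrary, and a real pore-scale model would be 3D and would grow hydrate only inside the CT-resolved pore network.

```python
# Tiny 2D diffusion-limited aggregation (DLA) sketch; parameters are arbitrary.
import numpy as np

def dla(grid_size=61, n_particles=200, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.zeros((grid_size, grid_size), dtype=bool)
    c = grid_size // 2
    grid[c, c] = True                                  # seed particle
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        x, y = rng.integers(0, grid_size, size=2)      # launch a random walker
        while True:
            dx, dy = moves[rng.integers(4)]
            x = (x + dx) % grid_size                   # periodic random walk
            y = (y + dy) % grid_size
            # stick when any 4-neighbour already belongs to the aggregate
            if (grid[(x + 1) % grid_size, y] or grid[(x - 1) % grid_size, y] or
                    grid[x, (y + 1) % grid_size] or grid[x, (y - 1) % grid_size]):
                grid[x, y] = True
                break
    return grid

if __name__ == "__main__":
    aggregate = dla()
    print("hydrate-occupied cells:", int(aggregate.sum()))
```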
NYC Reservoirs Watershed Areas (HUC 12)
This NYC Reservoirs Watershed Areas (HUC 12) GIS layer was derived from the 12-Digit National Watershed Boundary Database (WBD) at 1:24,000 for EPA Region 2 and Surrounding States. HUC 12 polygons were selected from the source based on interactively comparing these HUC 12s in our GIS with images of the New York City's Water Supply System Map found at http://www.nyc.gov/html/dep/html/drinking_water/wsmaps_wide.shtml. The 12 digit Hydrologic Units (HUCs) for EPA Region 2 and surrounding states (Northeastern states, parts of the Great Lakes, Puerto Rico and the USVI) are a subset of the National Watershed Boundary Database (WBD), downloaded from the Natural Resources Conservation Service (NRCS) Geospatial Gateway and imported into the EPA Region 2 Oracle/SDE database. This layer reflects 2009 updates to the WBD that included new boundary data for New York and New Jersey.
Dangers of Noncritical Use of Historical Plague Data
Roosen, Joris
2018-01-01
Researchers have published several articles based on historical data sets of plague epidemics, drawing on impressive digital databases that contain thousands of recorded outbreaks across Europe over the past several centuries. Through the digitization of preexisting data sets, scholars have unprecedented access to the historical record of plague occurrences. However, although these databases offer new research opportunities, noncritical use and reproduction of preexisting data sets can also limit our understanding of how infectious diseases evolved. Many scholars have performed investigations using Jean-Noël Biraben's data, which contains information on mentions of plague from various kinds of sources, many of which were not cited. When scholars fail to apply source criticism or do not reflect on the content of the data they use, the reliability of their results becomes highly questionable. Researchers using these databases going forward need to verify and restrict content spatially and temporally, and historians should be encouraged to compile the work.
Digital Management and Curation of the National Rock and Ore Collections at NMNH, Smithsonian
NASA Astrophysics Data System (ADS)
Cottrell, E.; Andrews, B.; Sorensen, S. S.; Hale, L. J.
2011-12-01
The National Museum of Natural History, Smithsonian Institution, is home to the world's largest curated rock collection. The collection houses 160,680 physical rock and ore specimen lots ("samples"), all of which already have a digital record that can be accessed by the public through a searchable web interface (http://collections.mnh.si.edu/search/ms/). In addition, there are 66 accessions pending that when catalogued will add approximately 60,000 specimen lots. NMNH's collections are digitally managed on the KE EMu° platform which has emerged as the premier system for managing collections in natural history museums worldwide. In 2010 the Smithsonian released an ambitious 5 year Digitization Strategic Plan. In Mineral Sciences, new digitization efforts in the next five years will focus on integrating various digital resources for volcanic specimens. EMu sample records will link to the corresponding records for physical eruption information housed within the database of Smithsonian's Global Volcanism Program (GVP). Linkages are also planned between our digital records and geochemical databases (like EarthChem or PetDB) maintained by third parties. We anticipate that these linkages will increase the use of NMNH collections as well as engender new scholarly directions for research. Another large project the museum is currently undertaking involves the integration of the functionality of in-house designed Transaction Management software with the EMu database. This will allow access to the details (borrower, quantity, date, and purpose) of all loans of a given specimen through its catalogue record. We hope this will enable cross-referencing and fertilization of research ideas while avoiding duplicate efforts. While these digitization efforts are critical, we propose that the greatest challenge to sample curation is not posed by digitization and that a global sample registry alone will not ensure that samples are available for reuse. We suggest instead that the ability of the Earth science community to identify and preserve important collections and make them available for future study is limited by personnel and space resources from the level of the individual PI to the level of national facilities. Moreover, when it comes to specimen "estate planning," the cultural attitudes of scientists, institutions, and funding agencies are often inadequate to provide for long-term specimen curation - even if specimen discovery is enabled by digital registry. Timely access to curated samples requires that adequate resources be devoted to the physical care of specimens (facilities) and to the personnel costs associated with curation - from the conservation, storage, and inventory management of specimens, to the dispersal of samples for research, education, and exhibition.
Historical geoscientific collections - requirements on digital cataloging and problems
NASA Astrophysics Data System (ADS)
Ehling, A.
2011-12-01
The Federal Institute for Geosciences and Natural Resources maintains comprehensive geoscientific collections: the historical collections of the Prussian Geological Survey in Berlin (19th and 20th century; about 2 million specimens) and the geoscientific collections of the 20th century in Hannover (about 800,000 specimens). Nowadays, when financial support is strictly bound to efficiency and profitability on the one hand, and when the use of the web for research is soaring (and among young people nearly exclusive) on the other, it is mandatory to provide information about the available stock of specimens on the web. Digital cataloging has been carried out for 20 years: up to now about 40% of the stock has been documented in 20 Access databases. The experience of 20 years of digital cataloging, as well as contact with professional users, allows us to formulate the requirements on a modern digital database together with the associated problems. The main problems are the different kinds of specimens (minerals, rocks, fossils, drill cores) with diverging descriptions; obsolete names of minerals, rocks and geographical sites; generations of various inventory numbers; and inhomogeneous data (in quantity and quality). From this arise requirements for substantial, well-educated manpower on the one hand and an intelligent digital solution on the other: it should follow an internationally usable standard while taking all the described local problems into account.
Bigdeli, Shoaleh; Kaufman, David
2017-01-01
Background: The application of digital educational games in health professions education is on expansion and game-based education usage is increasing. Methods: Diverse databases were searched and the related papers were reviewed. Results: Considering the growing popularity of educational games in medical education, we attempted to classify their benefits, flaws, and engaging factors. Conclusion: Advantages, disadvantages, and engagement factors of educational digital games used for health professions education must be the focus of attention in designing games for health professions discipline. PMID:29951418
Bolm, Karen S.
2002-01-01
The map area is located in southeastern Arizona. This report describes the map units, the methods used to convert the geologic map data into a digital format, and the ArcInfo GIS file structures and relationships; and it explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet. See figures 2 and 3 for page-size versions of the map compilation.
ERIC Educational Resources Information Center
O'Neill, Edward T.; Lavoie, Brian F.; Bennett, Rick; Staples, Thornton; Wayland, Ross; Payette, Sandra; Dekkers, Makx; Weibel, Stuart; Searle, Sam; Thompson, Dave; Rudner, Lawrence M.
2003-01-01
Includes five articles that examine key trends in the development of the public Web: size and growth, internationalization, and metadata usage; Flexible Extensible Digital Object and Repository Architecture (Fedora) for use in digital libraries; developments in the Dublin Core Metadata Initiative (DCMI); the National Library of New Zealand Te Puna…
The influence of lumber grade on machine productivity in the rough mill
Philip H. Steele; Jan Wiedenbeck; Rubin Shmulsky; Anura Perera
1999-01-01
Lumber grade effect on hardwood-part processing time was investigated with a digitally described lumber database in conjunction with a crosscut-first rough mill yield optimization simulator. In this study, the digital lumber sample was subdivided into five hardwood lumber grades. Three cutting bills with varying degrees of difficulty were cut. The three cutting...
ERIC Educational Resources Information Center
Xia, Wei
2003-01-01
Provides an overview of research conducted at Victoria University of Wellington regarding differing perceptions and expectations of user communities and librarians related to the usability of digital services. Considers access to services, currency of information on the Web site, the online public access catalog, databases, electronic journals,…
Digital hand atlas for web-based bone age assessment: system design and implementation
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente
2000-04-01
A frequently used assessment method of skeletal age is atlas matching by a radiological examination of a hand image against a small set of Greulich-Pyle patterns of normal standards. The method, however, can lead to significant deviation in age assessment due to the variety of observers with different levels of training. The Greulich-Pyle atlas, based on middle and upper class white populations in the 1950s, is also not fully applicable for children of today, especially regarding the standard development in other racial groups. In this paper, we present our system design and initial implementation of a digital hand atlas and computer-aided diagnostic (CAD) system for Web-based bone age assessment. The digital atlas will remove the disadvantages of the currently out-of-date one and allow the bone age assessment to be computerized and done conveniently via the Web. The system consists of a hand atlas database, a CAD module and a Java-based Web user interface. The atlas database is based on a large set of clinically normal hand images of diverse ethnic groups. The Java-based Web user interface allows users to interact with the hand image database from browsers. Users can use a Web browser to push a clinical hand image to the CAD server for a bone age assessment. Quantitative features of the examined image, which reflect skeletal maturity, are then extracted and compared with patterns from the atlas database to assess the bone age.
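The final matching step the abstract outlines, comparing extracted features against stored atlas patterns, can be pictured as a nearest-pattern lookup in feature space. The sketch below illustrates that idea only; the feature names, atlas values and nearest-neighbour rule are invented stand-ins for the system's actual CAD features and matching logic.

```python
# Toy bone-age matching: return the reference age whose (hypothetical) atlas
# feature pattern is closest to the features extracted from the examined image.
import numpy as np

# Hypothetical atlas: mean feature vectors for a few reference ages.
ATLAS = {
    6.0:  np.array([0.42, 0.38, 0.31]),
    8.0:  np.array([0.55, 0.50, 0.44]),
    10.0: np.array([0.68, 0.63, 0.58]),
    12.0: np.array([0.81, 0.77, 0.72]),
}

def assess_bone_age(features):
    """Nearest atlas pattern in Euclidean feature space gives the assessed age."""
    ages = sorted(ATLAS)
    distances = [np.linalg.norm(features - ATLAS[a]) for a in ages]
    return ages[int(np.argmin(distances))]

if __name__ == "__main__":
    extracted = np.array([0.66, 0.61, 0.55])   # features from the CAD module (made up)
    print("Assessed bone age:", assess_bone_age(extracted), "years")
```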
SPACEWAY: Providing affordable and versatile communication solutions
NASA Astrophysics Data System (ADS)
Fitzpatrick, E. J.
1995-08-01
By the end of this decade, Hughes' SPACEWAY network will provide the first interactive 'bandwidth on demand' communication services for a variety of applications. High quality digital voice, interactive video, global access to multimedia databases, and transborder workgroup computing will make SPACEWAY an essential component of the computer-based workplace of the 21st century. With relatively few satellites to construct, insure, and launch -- plus extensive use of cost-effective, tightly focused spot beams on the world's most populated areas -- the high capacity SPACEWAY system can pass its significant cost savings onto its customers. The SPACEWAY network is different from other proposed global networks in that its geostationary orbit location makes it a truly market driven system: each satellite will make available extensive telecom services to hundreds of millions of people within the continuous view of that satellite, providing immediate capacity within a specific region of the world.
Manikin families representing obese airline passengers in the US.
Park, Hanjun; Park, Woojin; Kim, Yongkang
2014-01-01
Aircraft passenger spaces designed without proper anthropometric analyses can create serious problems for obese passengers, including: possible denial of boarding, excessive body pressures and contact stresses, postural fixity and related health hazards, and increased risks of emergency evacuation failure. In order to help address the obese passenger's accommodation issues, this study developed male and female manikin families that represent obese US airline passengers. Anthropometric data of obese individuals obtained from the CAESAR anthropometric database were analyzed through PCA-based factor analyses. For each gender, a 99% enclosure cuboid was constructed, and a small set of manikins was defined on the basis of each enclosure cuboid. Digital human models (articulated human figures) representing the manikins were created using a human CAD software program. The manikin families were utilized to develop design recommendations for selected aircraft seat dimensions. The manikin families presented in this study would greatly facilitate anthropometrically accommodating large airline passengers.
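The PCA-based boundary-manikin idea in the abstract can be sketched briefly: project the anthropometric measurements onto the leading principal components, take a 99% enclosure box in component space, and map its corners back to body dimensions to define a small manikin family. The numpy sketch below uses synthetic data and per-axis quantile bounds as assumptions; the study itself used CAESAR measurements of obese US passengers and its own enclosure construction.

```python
# Sketch of PCA boundary manikins: cuboid corners in component space mapped
# back to body measurements. Data and bound construction are illustrative.
import numpy as np
from itertools import product

def boundary_manikins(measurements, n_components=3, coverage=0.99):
    """measurements: (n_subjects, n_dimensions) array of body measurements."""
    mean = measurements.mean(axis=0)
    centered = measurements - mean
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axes = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]   # principal axes
    scores = centered @ axes
    lo = np.quantile(scores, (1 - coverage) / 2, axis=0)          # enclosure bounds
    hi = np.quantile(scores, 1 - (1 - coverage) / 2, axis=0)
    corners = np.array(list(product(*zip(lo, hi))))               # 2**k cuboid corners
    return corners @ axes.T + mean                                # back to measurements

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    fake_data = rng.normal([1750, 1100, 450], [70, 120, 40], size=(500, 3))  # stand-in
    family = boundary_manikins(fake_data)
    print(family.round(0))   # 8 boundary manikins in measurement units
```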
Database of the Geology and Thermal Activity of Norris Geyser Basin, Yellowstone National Park
Flynn, Kathryn; Graham Wall, Brita; White, Donald E.; Hutchinson, Roderick A.; Keith, Terry E.C.; Clor, Laura; Robinson, Joel E.
2008-01-01
This dataset contains contacts, geologic units and map boundaries from Plate 1 of USGS Professional Paper 1456, 'The Geology and Remarkable Thermal Activity of Norris Geyser Basin, Yellowstone National Park, Wyoming.' The features are contained in the Annotation, basins_poly, contours, geology_arc, geology_poly, point_features, and stream_arc feature classes as well as a table of geologic units and their descriptions. This dataset was constructed to produce a digital geologic map as a basis for studying hydrothermal processes in Norris Geyser Basin. The original map does not contain registration tic marks. To create the geodatabase, the original scanned map was georegistered to USGS aerial photographs of the Norris Junction quadrangle collected in 1994. Manmade objects, i.e. roads, parking lots, and the visitor center, along with stream junctions and other hydrographic features, were used for registration.
Conflation and integration of archived geologic maps and associated uncertainties
Shoberg, Thomas G.
2016-01-01
Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.
SPACEWAY: Providing affordable and versatile communication solutions
NASA Technical Reports Server (NTRS)
Fitzpatrick, E. J.
1995-01-01
By the end of this decade, Hughes' SPACEWAY network will provide the first interactive 'bandwidth on demand' communication services for a variety of applications. High quality digital voice, interactive video, global access to multimedia databases, and transborder workgroup computing will make SPACEWAY an essential component of the computer-based workplace of the 21st century. With relatively few satellites to construct, insure, and launch -- plus extensive use of cost-effective, tightly focused spot beams on the world's most populated areas -- the high capacity SPACEWAY system can pass its significant cost savings onto its customers. The SPACEWAY network is different from other proposed global networks in that its geostationary orbit location makes it a truly market driven system: each satellite will make available extensive telecom services to hundreds of millions of people within the continuous view of that satellite, providing immediate capacity within a specific region of the world.
Analog-to-digital clinical data collection on networked workstations with graphic user interface.
Lunt, D
1991-02-01
An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.
A robust dataset-agnostic heart disease classifier from Phonocardiogram.
Banerjee, Rohan; Dutta Choudhury, Anirban; Deshpande, Parijat; Bhattacharya, Sakyajit; Pal, Arpan; Mandana, K M
2017-07-01
Automatic classification of normal and abnormal heart sounds is a popular area of research. However, building a robust algorithm unaffected by signal quality and patient demography is a challenge. In this paper we have analysed a wide range of phonocardiogram (PCG) features in the time and frequency domains, along with morphological and statistical features, to construct a robust and discriminative feature set for dataset-agnostic classification of normal subjects and cardiac patients. The large, open-access database made available in the PhysioNet 2016 challenge was used for feature selection, internal validation and creation of training models. A second dataset of 41 PCG segments, collected at an Indian hospital using our in-house smartphone-based digital stethoscope, was used for performance evaluation. Our proposed methodology yielded sensitivity and specificity scores of 0.76 and 0.75 respectively on the test dataset in classifying cardiovascular diseases. The methodology also outperformed three popular prior art approaches, when applied on the same dataset.
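The general shape of such a pipeline is: compute time- and frequency-domain descriptors per PCG segment, then train a classifier on a labelled set. The sketch below illustrates that style with a handful of generic features and a random forest; the feature choices, sampling rate and classifier are assumptions for illustration and are not the authors' exact feature list or model.

```python
# Illustrative PCG feature extraction + classifier training (not the paper's exact method).
import numpy as np
from scipy import signal
from sklearn.ensemble import RandomForestClassifier

def pcg_features(x, fs=2000):
    """A few simple time/frequency descriptors of one PCG segment."""
    freqs, psd = signal.welch(x, fs=fs, nperseg=min(1024, len(x)))
    centroid = np.sum(freqs * psd) / np.sum(psd)              # spectral centroid
    bandwidth = np.sqrt(np.sum(((freqs - centroid) ** 2) * psd) / np.sum(psd))
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2            # zero-crossing rate
    return np.array([x.std(), np.mean(np.abs(x)), zcr, centroid, bandwidth])

def train_classifier(segments, labels, fs=2000):
    X = np.vstack([pcg_features(s, fs) for s in segments])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = [rng.normal(size=4000) for _ in range(20)]                        # stand-ins
    abnormal = [rng.normal(size=4000) * np.linspace(1, 3, 4000) for _ in range(20)]
    clf = train_classifier(normal + abnormal, [0] * 20 + [1] * 20)
    print(clf.predict(np.vstack([pcg_features(s) for s in normal[:3]])))
```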
Incorporating the APS Catalog of the POSS I and Image Archive in ADS
NASA Technical Reports Server (NTRS)
Humphreys, Roberta M.
1998-01-01
The primary purpose of this contract was to develop the software to both create and access an on-line database of images from digital scans of the Palomar Sky Survey. This required modifying our DBMS (called Star Base) to create an image database from the actual raw pixel data from the scans. The digitized images are processed into a set of coordinate-reference index and pixel files that are stored in run-length files, thus achieving an efficient lossless compression. For efficiency and ease of referencing, each digitized POSS I plate is then divided into 900 subplates. Our custom DBMS maps each query into the corresponding POSS plate(s) and subplate(s). All images from the appropriate subplates are retrieved from disk with byte-offsets taken from the index files. These are assembled on-the-fly into a GIF image file for browser display, and a FITS format image file for retrieval. The FITS images have a pixel size of 0.33 arcseconds. The FITS header contains astrometric and photometric information. This method keeps the disk requirements manageable while allowing for future improvements. When complete, the APS Image Database will contain over 130 Gb of data. A set of Web query forms is available on-line, as well as an on-line tutorial and documentation. The database is distributed to the Internet by a high-speed SGI server and a high-bandwidth disk system. URL is http://aps.umn.edu/IDB/. The image database software is written in perl and C and has been compiled on SGI computers under IRIX 5.3. A copy of the written documentation is included and the software is on the accompanying exabyte tape.
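The byte-offset retrieval described above amounts to an index lookup followed by a seek and read into the packed pixel file. The sketch below shows that pattern in miniature; the file layout, record header, and plate/subplate/object identifiers are all invented for illustration and are not the APS index format.

```python
# Toy byte-offset retrieval: an in-memory index maps (plate, subplate, object)
# to (offset, length) in a packed pixel file. Layout is invented for illustration.
import struct

def read_cutout(pixel_path, index, plate, subplate, obj_id):
    """index maps (plate, subplate, obj_id) -> (byte_offset, n_bytes)."""
    offset, n_bytes = index[(plate, subplate, obj_id)]
    with open(pixel_path, "rb") as fh:
        fh.seek(offset)
        raw = fh.read(n_bytes)
    width, height = struct.unpack("<II", raw[:8])   # tiny invented record header
    return width, height, raw[8:]

if __name__ == "__main__":
    # build a fake one-object pixel file and its in-memory index
    pixels = bytes(range(12))                        # 4x3 "image"
    record = struct.pack("<II", 4, 3) + pixels
    with open("subplate.pix", "wb") as fh:
        fh.write(b"HEADER--")                        # 8 filler bytes before the record
        fh.write(record)
    index = {("P0323", 451, 17): (8, len(record))}   # invented identifiers
    print(read_cutout("subplate.pix", index, "P0323", 451, 17))
```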
NASA Astrophysics Data System (ADS)
Slater, T. F.; Tatge, C. B.; Ratcliff, M.; Slater, S. J.
2016-12-01
Dedicated sky watchers through the centuries have long sought the best teaching methods to efficiently and effectively transfer vast amounts of accumulated star knowledge to the next generation of sky watchers. Although detailed maps specifying the names and locations of stars have been carefully displayed on spherical globes for thousands of years, it is the 1923 installation of a Zeiss-made, large, mechanical star projector in Munich that is often cited as the first modern projection planetarium for teaching astronomy. In the 1930s, impressive planetariums were installed in Chicago, Los Angeles, and New York, which in turn served as catalysts for additional planetarium construction. Planetarium construction increased rapidly in the United States due to federal funding to schools and museums through the 1958 US National Defense Education Act, and the US went from one planetarium in 1930, to six in 1940, to about 100 in 1960, increasing to 200 in 1963, 450 by 1967 (even before humans had landed on the Moon), and more than 1,000 by 1975. Today, nearly 3,000 permanent planetarium facilities are available to show the stars and heavenly motions to children and adults alike across the world, with perhaps another thousand portable planetariums adding to the available teaching venues. Simultaneous with their construction, discipline-based astronomy education researchers have been trying to better understand, and ultimately improve, how people learn astronomy in the planetarium. A systematic analysis of planetarium education research articles, dissertations, and theses found in the recently constructed, community-wide, international Study of Astronomical Reasoning iSTAR database at istardatabase.org reveals that many of the systematic studies conducted in the 1960s and 1970s using domes served by servo-mechanical star projectors have been reproduced in recent decades in theaters using digital video projection, showing nearly the same results: student-passive, information-download lectures are largely ineffective at enhancing student learning and student attitudes toward science, whether they occur in a traditional classroom or a multimedia planetarium theater.
Ibbotson, Paul
2013-01-01
We use the Google Ngram database, a corpus of 5,195,769 digitized books containing ~4% of all books ever published, to test three ideas that are hypothesized to account for linguistic generalizations: verbal semantics, pre-emption, and skew. Using 828,813 tokens of un-forms as a test case for these mechanisms, we found that verbal semantics was a good predictor of the frequency of un-forms in the English language over the past 200 years, both in terms of how the frequency changed over time and in terms of their frequency rank. We did not find strong evidence for direct competition between un-forms and their top pre-emptors; however, the skew of the un-construction's competitors was inversely correlated with the acceptability of the un-form. We suggest a cognitive explanation for this, namely, that the more skewed the set of relevant pre-emptors, the more easily it is retrieved from memory. This suggests that it is not just the frequency of pre-emptive forms that must be taken into account when trying to explain usage patterns, but their skew as well. PMID:24399991
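The reported inverse relationship between competitor skew and un-form acceptability is, in effect, a correlation between corpus-derived measures. The sketch below shows how such a skew measure and a rank correlation could be computed; the toy counts and ratings are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

def skew_of_competitors(counts):
    """One simple skew measure: the share held by the most frequent pre-emptor."""
    counts = np.asarray(counts, dtype=float)
    return counts.max() / counts.sum()

# Hypothetical competitor-count sets for five un-forms, and acceptability judgments for them.
competitor_counts = [[900, 50, 30], [400, 350, 300], [700, 200, 80], [320, 300, 290], [980, 10, 5]]
acceptability = [2.1, 4.5, 3.0, 4.8, 1.6]             # e.g. on a 1-5 judgment scale

skews = [skew_of_competitors(c) for c in competitor_counts]
rho, p = stats.spearmanr(skews, acceptability)         # an inverse correlation would give rho < 0
print(round(rho, 2), round(p, 3))
```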
DOT National Transportation Integrated Search
2002-05-01
Knowledge of surface and subsurface geology is fundamental to the planning and development of new or modified transportation systems. Toward this end, we have compiled a model GIS database consisting of important geologic, cartographic, environment...
Back to the Scriptorium: Database Marketplace 2009
ERIC Educational Resources Information Center
Tenopir, Carol; Baker, Gayle; Grogg, Jill E.
2009-01-01
The 2009 database marketplace is bounded by two extremes: massive digitization projects to increase access, and retrenchment owing to budget worries. Picture medieval monks hunched over their desks in the scriptorium as they labor to copy manuscripts. A 21st-century version of this activity is being repeated daily in the world's libraries and…
"There's so Much Data": Exploring the Realities of Data-Based School Governance
ERIC Educational Resources Information Center
Selwyn, Neil
2016-01-01
Educational governance is commonly predicated around the generation, collation and processing of data through digital technologies. Drawing upon an empirical study of two Australian secondary schools, this paper explores the different forms of data-based governance that are being enacted by school leaders, managers, administrators and teachers.…
Computerization of the Arkansas Fishes Database
Henry W. Robison; L. Gayle Henderson; Melvin L. Warren; Janet S. Rader
2004-01-01
Until recently, distributional data for the fishes of Arkansas existed in the form of museum records, field notebooks of various ichthyologists, and published fish survey data, none of which was in a digital format. In 1995, a relational database system was used to design a PC-platform data entry module for the capture of information on...
Sentence-Based Metadata: An Approach and Tool for Viewing Database Designs.
ERIC Educational Resources Information Center
Boyle, John M.; Gunge, Jakob; Bryden, John; Librowski, Kaz; Hanna, Hsin-Yi
2002-01-01
Describes MARS (Museum Archive Retrieval System), a research tool which enables organizations to exchange digital images and documents by means of a common thesaurus structure, and merge the descriptive data and metadata of their collections. Highlights include theoretical basis; searching the MARS database; and examples in European museums.…
Carlson, Mary H.; Zientek, Michael L.; Causey, J. Douglas; Kayser, Helen Z.; Spanski, Gregory T.; Wilson, Anna B.; Van Gosen, Bradley S.; Trautwein, Charles M.
2007-01-01
This report compiles selected results from 13 U.S. Geological Survey (USGS) mineral resource assessment studies conducted in Idaho and Montana into consistent spatial databases that can be used in a geographic information system. The 183 spatial databases represent areas of mineral potential delineated in these studies and include attributes on mineral deposit type, level of mineral potential, certainty, and a reference. The assessments were conducted for five 1° x 2° quadrangles (Butte, Challis, Choteau, Dillon, and Wallace), several U.S. Forest Service (USFS) National Forests (including Challis, Custer, Gallatin, Helena, and Payette), and one Bureau of Land Management (BLM) Resource Area (Dillon). The data contained in the spatial databases are based on published information: no new interpretations are made. This digital compilation is part of an ongoing effort to provide mineral resource information formatted for use in spatial analysis. In particular, this is one of several reports prepared to address USFS needs for science information as forest management plans are revised in the Northern Rocky Mountains.
Ethical implications of digital images for teaching and learning purposes: an integrative review.
Kornhaber, Rachel; Betihavas, Vasiliki; Baber, Rodney J
2015-01-01
Digital photography has simplified the process of capturing and utilizing medical images. The process of taking high-quality digital photographs has been recognized as efficient, timely, and cost-effective. In particular, the evolution of smartphone and comparable technologies has become a vital component in the teaching and learning of health care professionals. However, ethical standards in relation to digital photography for teaching and learning have not always been of the highest standard. The inappropriate utilization of digital images within the health care setting has the capacity to compromise patient confidentiality and increase the risk of litigation. Therefore, the aim of this review was to investigate the literature concerning the ethical implications for health professionals utilizing digital photography for teaching and learning. A literature search was conducted utilizing five electronic databases, PubMed, Embase (Excerpta Medica Database), Cumulative Index to Nursing and Allied Health Literature, Educational Resources Information Center, and Scopus, limited to the English language. Studies that endeavored to evaluate the ethical implications of digital photography for teaching and learning purposes in the health care setting were included. The search strategy identified 514 papers, of which nine were retrieved for full review. Four papers were excluded based on the inclusion criteria, leaving five papers for final analysis. Three key themes were developed: knowledge deficit, consent and beyond, and standards driving scope of practice. The assimilation of evidence in this review suggests that there is value for health professionals in utilizing digital photography for teaching purposes in health education. However, there is limited understanding of the processes of obtaining, storing, and using such media for teaching purposes. Disparity was also highlighted in relation to the identification and development of policy and guidelines in clinical practice. Therefore, the implementation of policy to guide practice requires further research.
A Hybrid Approach using Collaborative filtering and Content based Filtering for Recommender System
NASA Astrophysics Data System (ADS)
Geetha, G.; Safa, M.; Fancy, C.; Saranya, D.
2018-04-01
In today's digital world, it has become an irksome task to find content to one's liking amid the endless variety being consumed, such as books, videos, articles, and movies. At the same time, digital content providers increasingly want to engage as many users on their services as possible for as long as possible. This gave rise to recommender systems, wherein content providers recommend content according to users' tastes and preferences. In this paper we propose a movie recommendation system. Movie recommendation is important in our social life because it can suggest a set of movies to users based on their interests or on the popularity of the movies. The proposed system is able to recommend movies to new users as well as to existing users. It mines movie databases to collect the information required for recommendation, such as popularity and attractiveness. We use content-based filtering and collaborative filtering, as well as hybrid filtering, which combines the results of these two techniques, to construct a system that provides more precise movie recommendations.
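As a rough illustration of the hybrid idea described above, the sketch below blends a content-based score (similarity between a user's genre profile and a movie's genre vector) with a collaborative score (ratings by other users, weighted by their similarity to the target user). The data structures and the mixing weight are assumptions for illustration, not the authors' system.

```python
import numpy as np

def cosine(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def content_score(user_profile, movie_features):
    """Content-based: similarity between a user's genre profile and a movie's genre vector."""
    return cosine(user_profile, movie_features)

def collaborative_score(ratings, user_id, movie_id):
    """Collaborative: other users' ratings of the movie, weighted by user-user similarity."""
    target = ratings[user_id]                  # dict {movie_id: rating} for the target user
    num = den = 0.0
    for other, their_ratings in ratings.items():
        if other == user_id or movie_id not in their_ratings:
            continue
        common = [m for m in target if m in their_ratings]
        if not common:
            continue
        sim = cosine(np.array([target[m] for m in common]),
                     np.array([their_ratings[m] for m in common]))
        num += sim * their_ratings[movie_id]
        den += abs(sim)
    return num / den if den else 0.0

def hybrid_score(user_profile, movie_features, ratings, user_id, movie_id, alpha=0.5):
    """Blend the two signals; alpha is an assumed mixing weight."""
    return (alpha * content_score(user_profile, movie_features)
            + (1 - alpha) * collaborative_score(ratings, user_id, movie_id))
```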
Harp, E.L.; Reid, M.E.; McKenna, J.P.; Michael, J.A.
2009-01-01
Loss of life and property caused by landslides triggered by extreme rainfall events demonstrates the need for landslide-hazard assessment in developing countries where recovery from such events often exceeds the country's resources. Mapping landslide hazards in developing countries where the need for landslide-hazard mitigation is great but the resources are few is a challenging, but not intractable problem. The minimum requirements for constructing a physically based landslide-hazard map from a landslide-triggering storm, using the simple methods we discuss, are: (1) an accurate mapped landslide inventory, (2) a slope map derived from a digital elevation model (DEM) or topographic map, and (3) material strength properties of the slopes involved. Provided that the landslide distribution from a triggering event can be documented and mapped, it is often possible to glean enough topographic and geologic information from existing databases to produce a reliable map that depicts landslide hazards from an extreme event. Most areas of the world have enough topographic information to provide digital elevation models from which to construct slope maps. In the likely event that engineering properties of slope materials are not available, reasonable estimates can be made with detailed field examination by engineering geologists or geotechnical engineers. Resulting landslide hazard maps can be used as tools to guide relocation and redevelopment, or, more likely, temporary relocation efforts during severe storm events such as hurricanes/typhoons to minimize loss of life and property. We illustrate these methods in two case studies of lethal landslides in developing countries: Tegucigalpa, Honduras (during Hurricane Mitch in 1998) and the Chuuk Islands, Micronesia (during Typhoon Chata'an in 2002).
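One of the three minimum inputs listed above is a slope map derived from a digital elevation model. A minimal sketch of that derivation is shown below; the finite-difference scheme and grid spacing are illustrative assumptions rather than the authors' procedure.

```python
import numpy as np

def slope_map(dem, cell_size=30.0):
    """Slope in degrees from a DEM grid, using finite differences.

    dem: 2-D array of elevations (m); cell_size: assumed grid spacing (m).
    """
    dz_dy, dz_dx = np.gradient(dem, cell_size)               # rise over run along each axis
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))     # steepest-descent slope angle

# Example: a plane rising 1 m per 1 m cell eastward has a uniform slope of 45 degrees.
dem = np.tile(np.arange(5, dtype=float), (5, 1))
print(slope_map(dem, cell_size=1.0))
```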
Pharmacology Students' Perceptions of Creating Multimodal Digital Explanations
ERIC Educational Resources Information Center
Nielsen, W.; Hoban G.; Hyland, C. J. T.
2017-01-01
Students can now digitally construct their own representations of scientific concepts using a variety of modes including writing, diagrams, 2-D and 3-D models, images or speech, all of which communicate meaning. In this study, final-year chemistry students studying a pharmacology subject created a "blended media" digital product as an assignment…
Development of Digital Instruction for Environment for Global Warming Alleviation
ERIC Educational Resources Information Center
Praneetham, Chuleewan; Thathong, Kongsak
2016-01-01
Technological education and instruction are widely used in the present education trend. Use of digital instruction for an environmental subject can encourage students in learning and raise their awareness of and attitudes toward environmental issues. The purposes of this research were: 1) to construct and develop the digital instruction for environment for…
ERIC Educational Resources Information Center
Figg, Candace; McCartney, Robin
2010-01-01
University researchers, teacher candidates, language and technology instructors, student learners, and families from diverse backgrounds partnered in an invitational teaching/learning experience--middle school student learners teaching their VIPs (very important persons) how to create stories and construct digital movies with reference to their…
Digital Print Concepts: Conceptualizing a Modern Framework for Measuring Emerging Knowledge
ERIC Educational Resources Information Center
Javorsky, Kristin H.
2014-01-01
This dissertation sought to produce and empirically test a theoretical model for the literacy construct of print concepts that would take into account the unique affordances of digital picture books for emergent readers. The author used an exploratory study of twenty randomly selected digital story applications to identify print conventions, text…
ERIC Educational Resources Information Center
Kajder, Sara; Bull, Glen; Albaugh, Susan
2005-01-01
A digital story consists of a series of still images combined with a narrated soundtrack to tell a story. This document contains a sequence of seven steps for digital storytelling based on a two-year project in Curry School's Center for Technology and Teacher Education at the University of Virginia. The strategies outlined offer a starting point…
PrimaryAccess: Creating Digital Documentaries in the Social Studies Classroom
ERIC Educational Resources Information Center
Ferster, Bill; Hammond, Tom; Bull, Glen
2006-01-01
The authors have been working with teachers who are drawing upon various online resources to construct digital documentaries that explore some facet of history, such as Japanese American internment or the liberation of Western Europe during World War II. These activities build on students' interest in digital media, recognizing a fundamental shift…
Analysis and Development of a Web-Enabled Planning and Scheduling Database Application
2013-09-01
establishes an entity-relationship diagram for the desired process, constructs an operable database using MySQL, and provides a web-enabled interface for the population of... Keywords: development, design, process re-engineering, reengineering, MySQL, structured query language (SQL), myPHPadmin.
1988-09-01
LOC_CITY C 15 Name of local city. LOC_STATE C 2 Name of local state as a two-letter abbreviation. LOC_ZIP C 10 Local ZIP code; five or nine digits. LOC_PHONE C 15 ... record: 10. Database dictionary for C:\BASE\PAS1E.MF (Field Name, Type, Width, Decimal, Comments): PASCODE C 2 Third and fourth digits of PAS code. ON C 3 ... Version: 3.01, Date: 09/01/88, Time: 21:34. Report library: C:\BASE\GPO.RP1, Date: 08/28/88, Time: 11:32. Print options: Number of copies: 1. Starting...
Huen, Jenny My; Lai, Eliza Sy; Shum, Angie Ky; So, Sam Wk; Chan, Melissa Ky; Wong, Paul Wc; Law, Y W; Yip, Paul Sf
2016-10-07
Digital game-based learning (DGBL) makes use of the entertaining power of digital games for educational purposes. Effectiveness assessment of DGBL programs has been underexplored and no attempt has been made to simultaneously model both important components of DGBL: learning attainment (ie, the educational purpose of DGBL) and engagement of users (ie, the entertaining power of DGBL) in evaluating program effectiveness. This study aimed to describe and evaluate an Internet-based DGBL program, Professor Gooley and the Flame of Mind, which promotes mental health among adolescents using a positive youth development approach. In particular, we investigated whether user engagement in the DGBL program could enhance their attainment on each of the learning constructs per DGBL module and subsequently enhance their mental health as measured by psychological well-being. Users were assessed on their attainment on each learning construct, psychological well-being, and engagement in each of the modules. One structural equation model was constructed for each DGBL module to model the effect of users' engagement and attainment on the learning construct on their psychological well-being. Of the 498 secondary school students who registered and participated starting from the first module of the DGBL program, 192 completed all 8 modules. Results from structural equation modeling suggested that a higher extent of engagement in the program activities facilitated users' attainment on the learning constructs in most of the modules and in turn enhanced their psychological well-being, after controlling for users' initial psychological well-being and initial attainment on the constructs. This study provided evidence that an Internet intervention for mental health, implemented with the technologies and digital innovations of DGBL, can enhance youth mental health. Structural equation modeling is a promising approach for evaluating the effectiveness of DGBL programs.
Clauson, Kevin A; Polen, Hyla H; Marsh, Wallace A
2007-12-01
To evaluate personal digital assistant (PDA) drug information databases used to support clinical decision-making, and to compare the performance of PDA databases with their online versions. Prospective evaluation with descriptive analysis. Five drug information databases available for PDAs and online were evaluated according to their scope (inclusion of correct answers), completeness (on a 3-point scale), and ease of use; 158 question-answer pairs across 15 weighted categories of drug information essential to health care professionals were used to evaluate these databases. An overall composite score integrating these three measures was then calculated. Scores for the PDA databases and for each PDA-online pair were compared. Among the PDA databases, composite rankings, from highest to lowest, were as follows: Lexi-Drugs, Clinical Pharmacology OnHand, Epocrates Rx Pro, mobileMicromedex (now called Thomson Clinical Xpert), and Epocrates Rx free version. When we compared database pairs, online databases that had greater scope than their PDA counterparts were Clinical Pharmacology (137 vs 100 answers, p<0.001), Micromedex (132 vs 96 answers, p<0.001), Lexi-Comp Online (131 vs 119 answers, p<0.001), and Epocrates Online Premium (103 vs 98 answers, p=0.001). Only Micromedex online was more complete than its PDA version (p=0.008). Regarding ease of use, the Lexi-Drugs PDA database was superior to Lexi-Comp Online (p<0.001); however, Epocrates Online Premium, Epocrates Online Free, and Micromedex online were easier to use than their PDA counterparts (p<0.001). In terms of composite scores, only the online versions of Clinical Pharmacology and Micromedex demonstrated superiority over their PDA versions (p>0.01). Online and PDA drug information databases assist practitioners in improving their clinical decision-making. Lexi-Drugs performed significantly better than all of the other PDA databases evaluated. No PDA database demonstrated superiority to its online counterpart; however, the online versions of Clinical Pharmacology and Micromedex were superior to their PDA versions in answering questions.
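The composite ranking described above integrates scope, completeness, and ease of use across weighted categories of drug-information questions. The sketch below shows one way such a weighted composite could be computed; the weights, scales, and normalization are illustrative assumptions, not the published scoring rules.

```python
def composite_score(scope_hits, total_questions, completeness, ease_of_use,
                    weights=(0.5, 0.3, 0.2)):
    """Illustrative composite of three measures, each normalized to the 0-1 range.

    scope_hits      : number of questions with a correct answer found
    total_questions : number of questions posed (e.g., 158)
    completeness    : mean completeness on the 3-point scale (1-3)
    ease_of_use     : mean ease-of-use rating, here assumed on a 1-5 scale
    """
    scope = scope_hits / total_questions
    complete = (completeness - 1) / 2           # map 1-3 onto 0-1
    ease = (ease_of_use - 1) / 4                # map 1-5 onto 0-1
    w_scope, w_complete, w_ease = weights
    return w_scope * scope + w_complete * complete + w_ease * ease

# e.g. a database answering 119 of 158 questions, mean completeness 2.4, mean ease 4.1
print(round(composite_score(119, 158, 2.4, 4.1), 3))
```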
A case study for a digital seabed database: Bohai Sea engineering geology database
NASA Astrophysics Data System (ADS)
Tianyun, Su; Shikui, Zhai; Baohua, Liu; Ruicai, Liang; Yanpeng, Zheng; Yong, Wang
2006-07-01
This paper discusses the design of the ORACLE-based Bohai Sea engineering geology database, covering requirement analysis, conceptual structure design, logical structure design, physical structure design, and security design. In the study, we used the object-oriented Unified Modeling Language (UML) to model the conceptual structure of the database and used the powerful data-management functions provided by the object-relational database ORACLE to organize and manage the storage space and improve its security. By these means, the database provides rapid and highly effective data storage, maintenance, and query performance to satisfy the application requirements of the Bohai Sea Oilfield Paradigm Area Information System.
Cheng, Ching-Wu; Leu, Sou-Sen; Cheng, Ying-Mei; Wu, Tsung-Chih; Lin, Chen-Chung
2012-09-01
Construction accident research involves the systematic sorting, classification, and encoding of comprehensive databases of injuries and fatalities. The present study explores the causes and distribution of occupational accidents in the Taiwan construction industry by analyzing such a database using the data mining method known as classification and regression tree (CART). Utilizing a database of 1542 accident cases during the period 2000-2009, the study seeks to establish potential cause-and-effect relationships regarding serious occupational accidents in the industry. The results of this study show that the occurrence rules for falls and collapses in both public and private project construction industries serve as key factors to predict the occurrence of occupational injuries. The results of the study provide a framework for improving the safety practices and training programs that are essential to protecting construction workers from occasional or unexpected accidents. Copyright © 2011 Elsevier Ltd. All rights reserved.
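A minimal sketch of the CART approach named above, using scikit-learn's decision tree on a toy accident table; the column names and categorical coding are assumptions for illustration, not the study's database schema.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy encoding of accident records: [accident_type, project_type, worker_age_band]
# accident_type: 0 = fall, 1 = collapse, 2 = other; project_type: 0 = public, 1 = private
X = np.array([[0, 0, 2], [0, 1, 1], [1, 0, 3], [2, 1, 2], [1, 1, 1], [2, 0, 3]])
y = np.array([1, 1, 1, 0, 1, 0])     # 1 = serious injury/fatality, 0 = less severe outcome

tree = DecisionTreeClassifier(max_depth=3, random_state=0)   # CART builds binary splits
tree.fit(X, y)
print(export_text(tree, feature_names=["accident_type", "project_type", "age_band"]))
```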
Access Control based on Attribute Certificates for Medical Intranet Applications
Georgiadis, Christos; Pangalos, George; Khair, Marie
2001-01-01
Background: Clinical information systems frequently use intranet and Internet technologies. However these technologies have emphasized sharing and not security, despite the sensitive and private nature of much health information. Digital certificates (electronic documents which recognize an entity or its attributes) can be used to control access in clinical intranet applications. Objectives: To outline the need for access control in distributed clinical database systems, to describe the use of digital certificates and security policies, and to propose the architecture for a system using digital certificates, cryptography and security policy to control access to clinical intranet applications. Methods: We have previously developed a security policy, DIMEDAC (Distributed Medical Database Access Control), which is compatible with emerging public key and privilege management infrastructure. In our implementation approach we propose the use of digital certificates, to be used in conjunction with DIMEDAC. Results: Our proposed access control system consists of two phases: the ways users gain their security credentials; and how these credentials are used to access medical data. Three types of digital certificates are used: identity certificates for authentication; attribute certificates for authorization; and access-rule certificates for propagation of access control policy. Once a user is identified and authenticated, subsequent access decisions are based on a combination of identity and attribute certificates, with access-rule certificates providing the policy framework. Conclusions: Access control in clinical intranet applications can be successfully and securely managed through the use of digital certificates and the DIMEDAC security policy. PMID:11720951
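The three certificate types and the resulting access decision can be pictured with the small data-structure sketch below; the class fields and the deny-by-default rule are illustrative assumptions, not the DIMEDAC specification.

```python
from dataclasses import dataclass, field

@dataclass
class IdentityCertificate:          # authentication: who the user is
    subject: str
    verified: bool

@dataclass
class AttributeCertificate:         # authorization: attributes/roles bound to the subject
    subject: str
    attributes: set = field(default_factory=set)

@dataclass
class AccessRuleCertificate:        # policy propagation: attributes required per resource
    resource: str
    required_attributes: set = field(default_factory=set)

def access_decision(identity, attribute_cert, rules, resource):
    """Grant access only if the user is authenticated, the certificates name the same
    subject, and the user's attributes satisfy the policy for the requested resource."""
    if not identity.verified or identity.subject != attribute_cert.subject:
        return False
    rule = next((r for r in rules if r.resource == resource), None)
    if rule is None:
        return False                            # no published rule: deny by default
    return rule.required_attributes <= attribute_cert.attributes
```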
Construction of the iSTAR international Study of Astronomical Reasoning Database
NASA Astrophysics Data System (ADS)
Slater, S. J.; Tatge, C. B.; Slater, T. F.; Bretones, P. S.; Schleigh, S.
2016-12-01
Perhaps more than any other science discipline-based education research field, the scholarly literature base describing and documenting astronomy education research is highly fragmented and widely dispersed across numerous journals. The resulting wide diversity of journals that publish astronomy education research presents an arduous challenge for scholars trying to best understand what work has been done and what work still needs to be done. Moreover, a vast amount of education research on the teaching and learning of astronomy exists in dissertations that were never published, and even more exists in the realm of un-disseminated grey literature hosted in conference proceedings and society newsletters going back decades. With a few notable exceptions far less extensive than the current project, there has been no comprehensive repository for cataloging astronomy education research methods and results to date. In response, an international cadre of scholars coordinated by the CAPER Center for Astronomy & Physics Education Research are creating the underlying structure for an online database in order to conduct the international Study of Astronomy Reasoning (iSTAR) project. The online iSTAR database serves as an online host bringing together in one place digital copies of hard-to-locate journal articles, isolated dissertations and theses, and professional meeting contributions, extending scholars' ability worldwide to find and utilize a far broader collection of astronomy education research literature than has previously been available. Works are categorized by research method, nature of study participants, educational learning venue studied, country and language of the study, and other usefully descriptive categories. Scholars wishing to add their own literature resources are encouraged to contribute to the online database located at istardatabase.org.
Topography changes monitoring of small islands using camera drone
NASA Astrophysics Data System (ADS)
Bang, E.
2017-12-01
Drone aerial photogrammetry was conducted to monitor topography changes of small islands in the east sea of Korea. Severe weather and sea waves are eroding the islands and sometimes cause landslides and rockfalls. Because of rugged cliffs in all directions and poor accessibility, ground-based survey methods are less efficient for monitoring topography changes over the whole area. Camera drones can provide digital images and video of every corner of the islands, and drone aerial photogrammetry is powerful for obtaining a precise digital surface model (DSM) of a limited area. We have acquired a set of digital images to construct a textured 3D model of the project area every year since 2014. Flight height is less than 100 m above the tops of the islands to obtain a sufficient ground sampling distance (GSD). Most images were captured vertically during automatic flights, but we also flew drones around the islands with a camera angle of about 30°-45° to construct a better 3D model. Every digital image is geo-referenced, but we also set several ground control points (GCPs) on the islands and measured their coordinates with RTK surveying methods to increase the absolute accuracy of the project. We constructed a textured 3D model using a photogrammetry tool, which generates 3D spatial information from digital images. From the polygonal model, we derived a DSM with contour lines. Thematic maps such as a hillshade relief map, an aspect map, and a slope map were also produced. Those maps help us better understand the topographic conditions of the project area. The purpose of this project is to monitor topography changes of these small islands. An elevation-difference map between the DSMs of each year is constructed. There are two regions showing large negative difference values. By comparing the constructed textured models and the captured digital images around these regions, we confirmed that one region had experienced a real topographic change, caused by a large rockfall near the center of the east island. The size of the fallen rock can be measured on the digital model exactly: it is about 13 m * 6 m * 2 m (height * width * thickness). We believe that drone aerial photogrammetry can be an efficient method for detecting topography changes in a complicated terrain area.
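The elevation-difference map mentioned above can be obtained by subtracting co-registered DSM grids from two survey years; a minimal sketch follows. The grids are assumed to share the same extent and cell size, and the change threshold is illustrative.

```python
import numpy as np

def dsm_difference(dsm_new, dsm_old, threshold=0.5):
    """Elevation change grid and a mask of cells whose change exceeds a threshold (m).

    Both DSMs are assumed to be co-registered 2-D arrays of identical shape and cell size.
    """
    diff = dsm_new - dsm_old              # negative values indicate material loss (e.g. rockfall)
    changed = np.abs(diff) > threshold    # flag changes larger than the threshold
    return diff, changed

def changed_volume(diff, changed, cell_size=0.1):
    """Approximate displaced volume (m^3) over the flagged cells for a given cell size (m)."""
    return float(np.sum(np.abs(diff[changed])) * cell_size ** 2)
```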
Studying Venus using a GIS database
NASA Technical Reports Server (NTRS)
Price, Maribeth; Suppe, John
1993-01-01
A Geographic Information System (GIS) can significantly enhance geological studies on Venus because it facilitates concurrent analysis of many sources of data, as demonstrated by our work on topographic and deformation characteristics of tesserae. We are creating a database of structures referenced to real-world coordinates to encourage the archival of Venusian studies in digital format and to foster quantitative analysis of many combinations of data. Contributions to this database from all aspects of Venusian science are welcome.
Application GIS on university planning: building a spatial database aided spatial decision
NASA Astrophysics Data System (ADS)
Miao, Lei; Wu, Xiaofang; Wang, Kun; Nong, Yu
2007-06-01
As universities develop and grow in size, many kinds of resources urgently need effective management. A spatial database is the right tool to assist administrators' spatial decision-making, and it is ready for the digital campus when integrated with existing OMS. Campus planning is first examined in detail. Then, taking South China Agricultural University as an example, we demonstrate how to build a geographic database of campus buildings and housing to support university administrators' spatial decisions.
Toward a standard reference database for computer-aided mammography
NASA Astrophysics Data System (ADS)
Oliveira, Júlia E. E.; Gueld, Mark O.; de A. Araújo, Arnaldo; Ott, Bastian; Deserno, Thomas M.
2008-03-01
Because of the lack of mammography databases with a large amount of codified images and identified characteristics like pathology, type of breast tissue, and abnormality, there is a problem for the development of robust systems for computer-aided diagnosis. Integrated to the Image Retrieval in Medical Applications (IRMA) project, we present an available mammography database developed from the union of: The Mammographic Image Analysis Society Digital Mammogram Database (MIAS), The Digital Database for Screening Mammography (DDSM), the Lawrence Livermore National Laboratory (LLNL), and routine images from the Rheinisch-Westfälische Technische Hochschule (RWTH) Aachen. Using the IRMA code, standardized coding of tissue type, tumor staging, and lesion description was developed according to the American College of Radiology (ACR) tissue codes and the ACR breast imaging reporting and data system (BI-RADS). The import was done automatically using scripts for image download, file format conversion, file name, web page and information file browsing. Disregarding the resolution, this resulted in a total of 10,509 reference images, and 6,767 images are associated with an IRMA contour information feature file. In accordance to the respective license agreements, the database will be made freely available for research purposes, and may be used for image based evaluation campaigns such as the Cross Language Evaluation Forum (CLEF). We have also shown that it can be extended easily with further cases imported from a picture archiving and communication system (PACS).
Trainable Cataloging for Digital Image Libraries with Applications to Volcano Detection
NASA Technical Reports Server (NTRS)
Burl, M. C.; Fayyad, U. M.; Perona, P.; Smyth, P.
1995-01-01
Users of digital image libraries are often not interested in image data per se but in derived products such as catalogs of objects of interest. Converting an image database into a usable catalog is typically carried out manually at present. For many larger image databases the purely manual approach is completely impractical. In this paper we describe the development of a trainable cataloging system: the user indicates the location of the objects of interest for a number of training images and the system learns to detect and catalog these objects in the rest of the database. In particular we describe the application of this system to the cataloging of small volcanoes in radar images of Venus. The volcano problem is of interest because of the scale (30,000 images, order of 1 million detectable volcanoes), technical difficulty (the variability of the volcanoes in appearance) and the scientific importance of the problem. The problem of uncertain or subjective ground truth is of fundamental importance in cataloging problems of this nature and is discussed in some detail. Experimental results are presented which quantify and compare the detection performance of the system relative to human detection performance. The paper concludes by discussing the limitations of the proposed system and the lessons learned of general relevance to the development of digital image libraries.
Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009
Soller, David R.
2011-01-01
As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
Rossi, Ernest; Mortimer, Jane; Rossi, Kathryn
2013-04-01
Culturomics is a new scientific discipline of the digital humanities: the use of computer algorithms to search for meaning in large databases of text and media. This new digital discipline is used here to explore 200 years of the history of hypnosis and psychotherapy in over five million digitized books from more than 40 university libraries around the world. It graphically compares the frequencies of English words about hypnosis, hypnotherapy, psychoanalysis, psychotherapy, and their founders from 1800 to 2008. This new perspective explores issues such as: Who were the major innovators in the history of therapeutic hypnosis, psychoanalysis, and psychotherapy? How well does this new digital approach to the humanities correspond to traditional histories of hypnosis and psychotherapy?
Johnson, Bruce R.; Derkey, Pamela D.
1998-01-01
Geologic data from the geologic map of the Spokane 1:100,000-scale quadrangle compiled by Joseph (1990) were entered into a geographic information system (GIS) as part of a larger effort to create regional digital geology for the Pacific Northwest. The map area is located in eastern Washington and extends across the state border into western Idaho (Fig. 1). This open-file report describes the methods used to convert the geologic map data into a digital format, documents the file structures, and explains how to download the digital files from the U.S. Geological Survey public access World Wide Web site on the Internet.
Stoeser, Douglas B.; Green, Gregory N.; Morath, Laurie C.; Heran, William D.; Wilson, Anna B.; Moore, David W.; Van Gosen, Bradley S.
2005-01-01
The growth in the use of Geographic Information Systems (GIS) has highlighted the need for regional and national digital geologic maps attributed with age and lithology information. Such maps can be conveniently used to generate derivative maps for purposes including mineral-resource assessment, metallogenic studies, tectonic studies, and environmental research. This Open-File Report is a preliminary version of part of a series of integrated state geologic map databases that cover the entire United States. The only national-scale digital geologic maps that portray most or all of the United States for the conterminous U.S. are the digital version of the King and Beikman (1974a, b) map at a scale of 1:2,500,000, as digitized by Schruben and others (1994) and the digital version of the Geologic Map of North America (Reed and others, 2005a, b) compiled at a scale of 1:5,000,000 which is currently being prepared by the U.S. Geological Survey. The present series of maps is intended to provide the next step in increased detail. State geologic maps that range in scale from 1:100,000 to 1:1,000,000 are available for most of the country, and digital versions of these state maps are the basis of this product. In a few cases, new digital compilations were prepared (e.g. OH, SC, SD) or existing paper maps were digitized (e.g. KY, TX). For Alaska and Hawaii, new regional maps are being compiled and ultimately new state maps will be produced. The digital geologic maps are presented in standardized formats as ARC/INFO (.e00) export files and as ArcView shape (.shp) files. Accompanying these spatial databases are a set of five supplemental data tables that relate the map units to detailed lithologic and age information. The maps for the CONUS have been fitted to a common set of state boundaries based on the 1:100,000 topographic map series of the United States Geological Survey (USGS). When the individual state maps are merged, the combined attribute tables can be used directly with the merged maps to make derivative maps. No attempt has been made to reconcile differences in mapped geology across state lines. This is the first version of this product and it will be subsequently updated to include four additional states (North Dakota, South Dakota, Nebraska, and Iowa)
Consequences of "going digital" for pathology professionals - entering the cloud.
Laurinavicius, Arvydas; Raslavicus, Paul
2012-01-01
New opportunities and the adoption of digital technologies will transform the way pathology professionals and services work. Many areas of our daily lives, as well as other medical professions, have already experienced this change, which has resulted in a paradigm shift in many activities. Pathology is an image-based discipline; therefore, the arrival of digital imaging in this domain promises a major shift in our work and in the mentality it requires. Recognizing the physical and digital duality of the pathology workflow, we can prepare for the imminent increase of the digital component, exploit the synergy, and enjoy its benefits. Development of a new generation of laboratory information systems, along with seamless integration of digital imaging, decision support, and knowledge databases, will enable pathologists to work in a distributed environment. The paradigm of "cloud pathology" is proposed as an ultimate vision of digital pathology workstations plugged into integrated multidisciplinary patient care systems.
ERIC Educational Resources Information Center
Yang, Fang-Ying; Tsai, Meng-Jung; Chiou, Guo-Li; Lee, Silvia Wen-Yu; Chang, Cheng-Chieh; Chen, Li-Ling
2018-01-01
The main purpose of this study was to provide instructional suggestions for supporting science learning in digital environments based on a review of eye tracking studies in e-learning related areas. Thirty-three eye-tracking studies from 2005 to 2014 were selected from the Social Science Citation Index (SSCI) database for review. Through a…
ERIC Educational Resources Information Center
Henthorne, Eileen
1995-01-01
Describes a project at the Princeton University libraries that converted the pre-1981 public card catalog, using digital imaging and optical character recognition technology, to fully tagged and indexed records of text in MARC format that are available on an online database and will be added to the online catalog. (LRW)
Digital mining claim density map for federal lands in Nevada: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Nevada as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Utah: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Utah as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Wyoming: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Wyoming as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Colorado: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Colorado as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in California: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in California as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in New Mexico: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in New Mexico as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Washington: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Washington as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Arizona: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Arizona as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill, and tunnel sites must be recorded at the appropriate BLM State office. BLM maintains a cumulative computer listing of mining claims in the MCRS database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1.) with the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital Mapping Techniques '10-Workshop Proceedings, Sacramento, California, May 16-19, 2010
Soller, David R.; Soller, David R.
2012-01-01
The Digital Mapping Techniques '10 (DMT'10) workshop was attended by 110 technical experts from 40 agencies, universities, and private companies, including representatives from 19 State geological surveys (see Appendix A). This workshop, hosted by the California Geological Survey, May 16-19, 2010, in Sacramento, California, was similar in nature to the previous 13 meetings (see Appendix B). The meeting was coordinated by the U.S. Geological Survey's (USGS) National Geologic Map Database project. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure that I note that the objective was again successfully met, as attendees continued to share and exchange knowledge and information, and renew friendships and collegial work begun at past DMT workshops. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products ("publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
Horsch, Alexander; Hapfelmeier, Alexander; Elter, Matthias
2011-11-01
Breast cancer is globally a major threat for women's health. Screening and adequate follow-up can significantly reduce the mortality from breast cancer. Human second reading of screening mammograms can increase breast cancer detection rates, whereas this has not been proven for current computer-aided detection systems as "second reader". Critical factors include the detection accuracy of the systems and the screening experience and training of the radiologist with the system. When assessing the performance of systems and system components, the choice of evaluation methods is particularly critical. Core assets herein are reference image databases and statistical methods. We have analyzed characteristics and usage of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM) from the University of South Florida, in literature indexed in Medline, IEEE Xplore, SpringerLink, and SPIE, with respect to type of computer-aided diagnosis (CAD) (detection, CADe, or diagnostics, CADx), selection of database subsets, choice of evaluation method, and quality of descriptions. 59 publications presenting 106 evaluation studies met our selection criteria. In 54 studies (50.9%), the selection of test items (cases, images, regions of interest) extracted from the DDSM was not reproducible. Only 2 CADx studies, not any CADe studies, used the entire DDSM. The number of test items varies from 100 to 6000. Different statistical evaluation methods are chosen. Most common are train/test (34.9% of the studies), leave-one-out (23.6%), and N-fold cross-validation (18.9%). Database-related terminology tends to be imprecise or ambiguous, especially regarding the term "case". Overall, both the use of the DDSM as data source for evaluation of mammography CAD systems, and the application of statistical evaluation methods were found highly diverse. Results reported from different studies are therefore hardly comparable. Drawbacks of the DDSM (e.g. varying quality of lesion annotations) may contribute to the reasons. But larger bias seems to be caused by authors' own decisions upon study design. RECOMMENDATIONS/CONCLUSION: For future evaluation studies, we derive a set of 13 recommendations concerning the construction and usage of a test database, as well as the application of statistical evaluation methods.
Development of an Integrated Hydrologic Modeling System for Rainfall-Runoff Simulation
NASA Astrophysics Data System (ADS)
Lu, B.; Piasecki, M.
2008-12-01
This paper presents the development of an integrated hydrological model that provides digital watershed processing, online data retrieval, hydrologic simulation, and post-event analysis. The proposed system is intended to work as a back end to the CUAHSI HIS cyberinfrastructure developments. As a first step in developing this system, a physics-based distributed hydrologic model, PIHM (Penn State Integrated Hydrologic Model), is wrapped into the OpenMI (Open Modeling Interface and Environment) framework so as to interact seamlessly with OpenMI-compliant meteorological models. The graphical user interface is being developed from the open-source GIS application MapWindow, which permits functionality expansion through the addition of plug-ins. Modules set up through the GUI workboard include those for retrieving meteorological data from existing databases or meteorological prediction models, obtaining geospatial data from the output of digital watershed processing, and importing initial and boundary conditions. They are connected to the OpenMI-compliant PIHM to simulate rainfall-runoff processes, and a further module automatically displays output after the simulation. Online databases are accessed through the WaterOneFlow web services, and the retrieved data are either stored in an observation database (OD) following the schema of the Observations Data Model (ODM), in the case of time-series support, or in a grid-based store, which may be a format like netCDF or a grid-based database schema. Specific development steps include the creation of a bridge to overcome the interoperability issue between PIHM and the ODM, as well as the embedding of TauDEM (Terrain Analysis Using Digital Elevation Models) into the model; this module is responsible for deriving the watershed and stream network from digital elevation models. Visualizing and editing geospatial data is achieved through MapWinGIS, an ActiveX control developed by the MapWindow team. After application to a real watershed, the performance of the model can be tested with the post-event analysis module.
A Computer-Assisted Learning Model Based on the Digital Game Exponential Reward System
ERIC Educational Resources Information Center
Moon, Man-Ki; Jahng, Surng-Gahb; Kim, Tae-Yong
2011-01-01
The aim of this research was to construct a motivational model which would stimulate voluntary and proactive learning using digital game methods offering players more freedom and control. The theoretical framework of this research lays the foundation for a pedagogical learning model based on digital games. We analyzed the game reward system, which…
47 CFR 74.710 - Digital low power TV and TV translator station protection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 4 2010-10-01 2010-10-01 false Digital low power TV and TV translator station... SERVICES Low Power TV, TV Translator, and TV Booster Stations § 74.710 Digital low power TV and TV translator station protection. (a) An application to construct a new low power TV, TV translator, or TV...
Use of Knowledge Bases in Education of Database Management
ERIC Educational Resources Information Center
Radványi, Tibor; Kovács, Emod
2008-01-01
In this article we present a segment of the Sulinet Digital Knowledgebase curriculum system in which you can find the sections of subject matter that support teaching database management. You can follow the order of the course from the beginning, when some topics first appear and are raised in elementary school, through the topics covered in secondary…
Using Screencasting to Promote Database Trials and Library Resources
ERIC Educational Resources Information Center
Emanuel, Michelle
2013-01-01
At the University of Mississippi, screencasting was used to promote a database trial to the ARTStor Digital Library. Using Jing, a free product used for recording and posting screencasts, and a Snowball USB microphone, 11 videos averaging 3 minutes in length were posted to an online topic guide. Screencasting was used as a quick, creative, and…
The Cardiac Safety Research Consortium ECG database.
Kligfield, Paul; Green, Cynthia L
2012-01-01
The Cardiac Safety Research Consortium (CSRC) ECG database was initiated to foster research using anonymized, XML-formatted, digitized ECGs with corresponding descriptive variables from placebo- and positive-control arms of thorough QT studies submitted to the US Food and Drug Administration (FDA) by pharmaceutical sponsors. The database can be expanded to other data that are submitted directly to CSRC from other sources, and currently includes digitized ECGs from patients with genotyped varieties of congenital long-QT syndrome; this congenital long-QT database is also linked to ambulatory electrocardiograms stored in the Telemetric and Holter ECG Warehouse (THEW). Thorough QT data sets are available from CSRC for unblinded development of algorithms for analysis of repolarization and for blinded comparative testing of algorithms developed for the identification of moxifloxacin, as used as a positive control in thorough QT studies. Policies and procedures for access to these data sets are available from CSRC, which has developed tools for statistical analysis of blinded new algorithm performance. A recently approved CSRC project will create a data set for blinded analysis of automated ECG interval measurements, whose initial focus will include comparison of four of the major manufacturers of automated electrocardiographs in the United States. CSRC welcomes application for use of the ECG database for clinical investigation. Copyright © 2012 Elsevier Inc. All rights reserved.
Unified Database for Rejected Image Analysis Across Multiple Vendors in Radiography.
Little, Kevin J; Reiser, Ingrid; Liu, Lili; Kinsey, Tiffany; Sánchez, Adrian A; Haas, Kateland; Mallory, Florence; Froman, Carmen; Lu, Zheng Feng
2017-02-01
Reject rate analysis has been part of radiography departments' quality control since the days of screen-film radiography. In the era of digital radiography, one might expect that reject rate analysis is easily facilitated because of readily available information produced by the modality during the examination procedure. Unfortunately, this is not always the case. The lack of an industry standard and the wide variety of system log entries and formats have made it difficult to implement a robust multivendor reject analysis program, and logs do not always include all relevant information. The increased use of digital detectors exacerbates this problem because of higher reject rates associated with digital radiography compared with computed radiography. In this article, the authors report on the development of a unified database for vendor-neutral reject analysis across multiple sites within an academic institution and share their experience from a team-based approach to reduce reject rates. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Using a Semi-Realistic Database to Support a Database Course
ERIC Educational Resources Information Center
Yue, Kwok-Bun
2013-01-01
A common problem for university relational database courses is to construct effective databases for instructions and assignments. Highly simplified "toy" databases are easily available for teaching, learning, and practicing. However, they do not reflect the complexity and practical considerations that students encounter in real-world…
EST databases and web tools for EST projects.
Shen, Yao-Qing; O'Brien, Emmet; Koski, Liisa; Lang, B Franz; Burger, Gertraud
2009-01-01
This chapter outlines key considerations for constructing and implementing an EST database. Instead of showing the technological details step by step, emphasis is put on the design of an EST database suited to the specific needs of EST projects and how to choose the most suitable tools. Using TBestDB as an example, we illustrate the essential factors to be considered for database construction and the steps for data population and annotation. This process employs technologies such as PostgreSQL, Perl, and PHP to build the database and interface, and tools such as AutoFACT for data processing and annotation. We discuss these in comparison to other available technologies and tools, and explain the reasons for our choices.
A Novel Approach: Chemical Relational Databases, and the ...
Mutagenicity and carcinogenicity databases are crucial resources for toxicologists and regulators involved in chemicals risk assessment. Until recently, existing public toxicity databases have been constructed primarily as
Digital Archiving: Where the Past Lives Again
NASA Astrophysics Data System (ADS)
Paxson, K. B.
2012-06-01
The process of digital archiving for variable star data by manual entry with an Excel spreadsheet is described. Excel-based tools including a Step Magnitude Calculator and a Julian Date Calculator for variable star observations where magnitudes and Julian dates have not been reduced are presented. Variable star data in the literature and the AAVSO International Database prior to 1911 are presented and reviewed, with recent archiving work being highlighted. Digitization using optical character recognition software conversion is also demonstrated, with editing and formatting suggestions for the OCR-converted text.
A generalized approach to computer synthesis of digital holograms
NASA Technical Reports Server (NTRS)
Hopper, W. A.
1973-01-01
A hologram is constructed by taking a number of digitized sample points and blending them together to form a "continuous" picture. The new system selects a better set of sample points, resulting in an improved hologram from the same amount of information.
Informatics in radiology: Efficiency metrics for imaging device productivity.
Hu, Mengqi; Pavlicek, William; Liu, Patrick T; Zhang, Muhong; Langer, Steve G; Wang, Shanshan; Place, Vicki; Miranda, Rafael; Wu, Teresa Tong
2011-01-01
Acute awareness of the costs associated with medical imaging equipment is an ever-present aspect of the current healthcare debate. However, the monitoring of productivity associated with expensive imaging devices is likely to be labor intensive, relies on summary statistics, and lacks accepted and standardized benchmarks of efficiency. In the context of the general Six Sigma DMAIC (design, measure, analyze, improve, and control) process, a World Wide Web-based productivity tool called the Imaging Exam Time Monitor was developed to accurately and remotely monitor imaging efficiency with use of Digital Imaging and Communications in Medicine (DICOM) combined with a picture archiving and communication system. Five device efficiency metrics (examination duration, table utilization, interpatient time, appointment interval time, and interseries time) were derived from DICOM values. These metrics allow the standardized measurement of productivity, to facilitate the comparative evaluation of imaging equipment use and ongoing efforts to improve efficiency. A relational database was constructed to store patient imaging data, along with device- and examination-related data. The database provides full access to ad hoc queries and can automatically generate detailed reports for administrative and business use, thereby allowing staff to monitor data for trends and to better identify possible changes that could lead to improved productivity and reduced costs in association with imaging services. © RSNA, 2011.
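To make the metric derivation concrete, the following is a minimal sketch of how two of the listed metrics (examination duration and inter-series time) could be computed from DICOM header timestamps with pydicom. It is not the Imaging Exam Time Monitor itself; the file layout, the use of the SeriesTime tag, and the .dcm extension are assumptions, and real workflows would also handle series end times and missing tags.

```python
# Sketch: deriving examination duration and inter-series gaps from DICOM headers.
# Assumes pydicom is installed and `study_dir` contains the files of one study;
# using SeriesTime alone is a simplification of the metrics described above.
from datetime import datetime
from pathlib import Path
import pydicom

def parse_dicom_time(tm: str) -> datetime:
    """Convert a DICOM TM string (HHMMSS[.ffffff]) to a datetime on a dummy date."""
    tm = tm.split(".")[0].ljust(6, "0")
    return datetime.strptime(tm, "%H%M%S")

def efficiency_metrics(study_dir: str):
    series_times = {}
    for path in Path(study_dir).rglob("*.dcm"):          # assumed file extension
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        t = getattr(ds, "SeriesTime", None)
        if t:
            series_times.setdefault(ds.SeriesInstanceUID, []).append(parse_dicom_time(t))

    starts = sorted(min(times) for times in series_times.values())
    exam_duration_min = (starts[-1] - starts[0]).total_seconds() / 60.0
    interseries_sec = [(b - a).total_seconds() for a, b in zip(starts, starts[1:])]
    return exam_duration_min, interseries_sec
```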
Reviving a medical wearable computer for teaching purposes.
Frenger, Paul
2014-01-01
In 1978 the author constructed a medical wearable computer using an early CMOS microprocessor and support chips. This device was targeted for use by health-conscious consumers and other early adopters. Its expandable functions included weight management, blood pressure control, diabetes care, medication reminders, smoking cessation, pediatric growth and development, a simple medical database, digital communication with a doctor's office, and an emergency alert system. Various physiological sensors could be plugged into the calculator-sized chassis. The device was shown to investor groups but funding was not obtained; by 1992 the author ceased pursuing it. The Computing and Mathematics Chair at a local university, a NASA acquaintance, approached the author to mentor a CS capstone course for Summer 2012. With the author's guidance, five students proceeded to convert this medical wearable computer design to an iPhone-based implementation using the Apple Xcode Developer Kit and other utilities. The final student device contained a body mass index (BMI) calculator, an emergency alert for 911 or other first responders, a medication reminder, a doctor's appointment feature, a medical database, medical Internet links, and a pediatric growth and development guide. The students' final implementation was successfully demonstrated on an actual iPhone 4 at the CS capstone meeting in mid-Summer.
NASA Technical Reports Server (NTRS)
Simpson, James J.; Harkins, Daniel N.
1993-01-01
Historically, locating and browsing satellite data has been a cumbersome and expensive process. This has impeded the efficient and effective use of satellite data in the geosciences. SSABLE is a new interactive tool for the archiving, browsing, ordering, and distribution of satellite data based upon X Window, high-bandwidth networks, and digital image rendering techniques. SSABLE provides for automatically constructing relational database queries to archived image datasets based on time, date, geographical location, and other selection criteria. SSABLE also provides a visual representation of the selected archived data for viewing on the user's X terminal. SSABLE is a near real-time system; for example, data are added to SSABLE's database within 10 min after capture. SSABLE is network and machine independent; it will run identically on any machine which satisfies the following three requirements: 1) has a bitmapped display (monochrome or greater); 2) is running the X Window system; and 3) is on a network directly reachable by the SSABLE system. SSABLE has been evaluated at over 100 international sites. Network response time in the United States and Canada varies between 4 and 7 s for browse image updates; reported transmission times to Europe and Australia typically are 20-25 s.
NASA Astrophysics Data System (ADS)
Posner, A. J.
2017-12-01
The Middle Rio Grande River (MRG) traverses New Mexico from Cochiti to Elephant Butte reservoirs. Since the 1100s, cultivating and inhabiting the valley of this alluvial river has required various river training works. The mid-20th century saw a concerted effort to tame the river through channelization, jetty jacks, and dam construction. A challenge for river managers is to better understand the interactions between river training works, dam construction, and the geomorphic adjustments of a desert river driven by spring snowmelt and summer thunderstorms carrying water and large sediment inputs from upstream and ephemeral tributaries. Because of its importance to the region, a vast wealth of data exists on conditions along the MRG. The investigation presented herein builds upon previous efforts by combining hydraulic model results, digitized planforms, and stream gage records in various statistical and conceptual models in order to test our understanding of this complex system. Spatially continuous variables were clipped by a set of river cross-section data collected at decadal intervals since the early 1960s, creating a spatially homogeneous database upon which various statistical testing was implemented. Conceptual models relate forcing variables and response variables to estimate river planform changes. The developed database represents a unique opportunity to quantify and test geomorphic conceptual models under the unique characteristics of the MRG. The results of this investigation provide a spatially distributed characterization of planform variable changes, permitting managers to predict planform at a much higher resolution than previously available, and a better understanding of the relationship between flow regime and planform changes such as changes to longitudinal slope, sinuosity, and width. Lastly, data analysis and model interpretation led to the development of a new conceptual model for the impact of ephemeral tributaries in alluvial rivers.
2012-09-01
relative performance of several conventional SQL and NoSQL databases with a set of one billion file block hashes. Keywords: Digital Forensics, Sector Hashing, Full...
[Nursing workloads and working conditions: integrative review].
Schmoeller, Roseli; Trindade, Letícia de Lima; Neis, Márcia Binder; Gelbcke, Francine Lima; de Pires, Denise Elvira Pires
2011-06-01
This study reviews the theoretical production concerning workloads and working conditions for nurses. An integrative review was carried out using scientific articles, theses, and dissertations indexed over the last ten years in two Brazilian databases, the Virtual Health Care Library (Biblioteca Virtual de Saúde) and the Digital Database of Dissertations (Banco Digital de Teses). From 132 identified studies, 27 were selected. Results indicate that workloads are responsible for professional weariness, affecting the occurrence of work accidents and health problems. To achieve adequate workloads, the studies indicate strategies such as an adequate number of employees, continuing education, and better working conditions. The challenge is to continue research that reveals more precisely the relationships between workloads, working conditions, and the health of the nursing team.
Automatic detection of anomalies in screening mammograms
2013-01-01
Background Diagnostic performance in breast screening programs may be influenced by the prior probability of disease. Since breast cancer incidence is roughly half a percent in the general population there is a large probability that the screening exam will be normal. That factor may contribute to false negatives. Screening programs typically exhibit about 83% sensitivity and 91% specificity. This investigation was undertaken to determine if a system could be developed to pre-sort screening-images into normal and suspicious bins based on their likelihood to contain disease. Wavelets were investigated as a method to parse the image data, potentially removing confounding information. The development of a classification system based on features extracted from wavelet transformed mammograms is reported. Methods In the multi-step procedure images were processed using 2D discrete wavelet transforms to create a set of maps at different size scales. Next, statistical features were computed from each map, and a subset of these features was the input for a concerted-effort set of naïve Bayesian classifiers. The classifier network was constructed to calculate the probability that the parent mammography image contained an abnormality. The abnormalities were not identified, nor were they regionalized. The algorithm was tested on two publicly available databases: the Digital Database for Screening Mammography (DDSM) and the Mammographic Images Analysis Society’s database (MIAS). These databases contain radiologist-verified images and feature common abnormalities including: spiculations, masses, geometric deformations and fibroid tissues. Results The classifier-network designs tested achieved sensitivities and specificities sufficient to be potentially useful in a clinical setting. This first series of tests identified networks with 100% sensitivity and up to 79% specificity for abnormalities. This performance significantly exceeds the mean sensitivity reported in literature for the unaided human expert. Conclusions Classifiers based on wavelet-derived features proved to be highly sensitive to a range of pathologies, as a result Type II errors were nearly eliminated. Pre-sorting the images changed the prior probability in the sorted database from 37% to 74%. PMID:24330643
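The pipeline described in this abstract (wavelet decomposition, per-subband statistics, naïve Bayesian classification) can be sketched in a few lines. The following is illustrative only, not the authors' implementation: the choice of wavelet, decomposition depth, and feature set are assumptions, and it assumes PyWavelets, NumPy, and scikit-learn are available.

```python
# Sketch of the general pipeline: 2D discrete wavelet transform, per-subband
# statistical features, and a naive Bayes classifier that scores each image.
import numpy as np
import pywt
from sklearn.naive_bayes import GaussianNB

def wavelet_features(image: np.ndarray, wavelet: str = "db4", levels: int = 3) -> np.ndarray:
    """Simple statistics (mean, std, median, energy) from each detail subband."""
    feats = []
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    for detail in coeffs[1:]:                       # (cH, cV, cD) tuple per level
        for band in detail:
            band = np.abs(band)
            feats.extend([band.mean(), band.std(), np.median(band), (band ** 2).mean()])
    return np.asarray(feats)

def train_presorter(images, labels):
    """labels: 1 = suspicious, 0 = normal (per-image, not regionalized)."""
    X = np.vstack([wavelet_features(im) for im in images])
    clf = GaussianNB().fit(X, labels)
    return clf   # clf.predict_proba(X_new) gives per-image abnormality probabilities
```

A per-image probability from such a classifier is what would drive the pre-sorting of screening images into normal and suspicious bins.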
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anaya, O.; Moreno, G.E.L.; Madrigal, M.M.
1999-11-01
In recent years, several definitions of power have been proposed for more accurate measurement of electrical quantities in the presence of harmonic pollution on power lines. Nevertheless, only a few instruments have been constructed around these definitions. This paper describes a new microcontroller-based digital instrument that implements definitions based on the Hartley transform. The algorithms are processed entirely using the Fast Hartley Transform (FHT) on a 16-bit microcontroller platform. The constructed prototype was compared with a commercial harmonics analyzer.
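For readers unfamiliar with the Hartley transform, the discrete Hartley transform (DHT) of a real signal can be obtained from the FFT as H[k] = Re(X[k]) - Im(X[k]), and it satisfies a Parseval-type relation that recovers signal power. The sketch below illustrates that relation on a synthetic waveform; it is a hypothetical example (sampling rate, harmonic content) and not the microcontroller firmware described in the abstract.

```python
# Sketch: discrete Hartley transform via the FFT and power recovery through
# Parseval's relation for the DHT. Illustrative example, not the instrument code.
import numpy as np

def dht(x: np.ndarray) -> np.ndarray:
    """DHT via the identity H[k] = Re(X[k]) - Im(X[k])."""
    X = np.fft.fft(x)
    return X.real - X.imag

fs = 3200.0                                   # assumed sampling rate (Hz)
t = np.arange(256) / fs
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t) \
    + 23 * np.sin(2 * np.pi * 250 * t)        # fundamental plus a 5th harmonic

H = dht(v)
rms_time = np.sqrt(np.mean(v ** 2))
rms_hartley = np.sqrt(np.sum(H ** 2) / len(v) ** 2)   # Parseval: sum H^2 = N * sum x^2
print(round(rms_time, 3), round(rms_hartley, 3))       # the two values agree
```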
Techniques to measure complex-plane fields
NASA Astrophysics Data System (ADS)
Dudley, Angela; Majola, Nombuso; Chetty, Naven; Forbes, Andrew
2014-10-01
In this work we construct coherent superpositions of Gaussian and vortex modes, which can be described as occupying the complex plane. We demonstrate how these fields can be constructed experimentally in a digital, controllable manner with a spatial light modulator. Once these fields have been generated, we illustrate three separate techniques for extracting their constituent components: measuring the intensity of the field at two adjacent points, performing a modal decomposition, and a new digital Stokes measurement.
The trend of digital control system design for nuclear power plants in Korea
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, S. H.; Jung, H. Y.; Yang, C. Y.
2006-07-01
Currently there are 20 nuclear power plants (NPPs) in operation, and 6 more units are under construction in Korea. The control systems of those NPPs have been developed along with advances in technology. Control systems started with on-off control using relay logic, evolved into solid-state logic using TTL ICs, and have been microprocessor-based since Yonggwang NPP Units 3 and 4, whose construction started in 1989. Multiplexers are also installed in the local plant areas to collect field inputs and to send output signals while communicating with the controllers located in the system cabinets near the main control room, in order to reduce field wiring cables. The design of the digital control system technology for NPPs in Korea has been optimized to maximize operability as well as safety through design, construction, start-up, and operation experience. The Shin-Kori Units 1 and 2 and Shin-Wolsong Units 1 and 2 NPP projects are under construction at the same time. The Digital Plant Control Systems of these projects have adopted multi-loop controllers, a redundant loop configuration, and a soft control system for the radwaste system. Programmable Logic Controllers (PLC) and Distributed Control Systems (DCS) are applied with a soft control system in Shin-Kori Units 3 and 4. This paper describes the evolution of control systems at NPPs in Korea and the experience and design improvements gained from observing recent failures of digital control systems. In addition, the design concept and trends of the digital control systems being applied to NPPs in Korea are introduced. (authors)
[Construction of chemical information database based on optical structure recognition technique].
Lv, C Y; Li, M N; Zhang, L R; Liu, Z M
2018-04-18
To create a protocol for constructing a chemical information database from the scientific literature quickly and automatically. Scientific literature, patents, and technical reports from different chemical disciplines were collected and stored in PDF format as the fundamental dataset. Chemical structures were transformed from published documents and images into machine-readable data by using name conversion technology and the optical structure recognition tool CLiDE. In the process of molecular structure information extraction, Markush structures were enumerated into well-defined monomer molecules by means of the QueryTools in the molecule editor ChemDraw. The document management software EndNote X8 was used to acquire bibliographic references, including title, author, journal, and year of publication. The text mining toolkit ChemDataExtractor was adopted to retrieve information from figures, tables, and textual paragraphs that could be used to populate a structured chemical database. After this step, detailed manual revision and annotation were conducted to ensure the accuracy and completeness of the data. In addition to the literature data, the computational platform Pipeline Pilot 7.5 was used to calculate physical and chemical properties and predict molecular attributes. Furthermore, the open database ChEMBL was linked to fetch known bioactivities, such as indications and targets. After information extraction and data expansion, five separate metadata files were generated, covering molecular structures, molecular information, bibliographic references, predicted attributes, and known bioactivities. With the canonical simplified molecular-input line-entry specification (SMILES) as the primary key, the metadata files were associated through common key nodes, including molecule number and PDF number, to construct an integrated chemical information database. A reasonable construction protocol for chemical information databases was created successfully. A total of 174 research articles and 25 reviews published in Marine Drugs from January 2015 to June 2016 were collected as the essential data source, and an elementary marine natural product database named PKU-MNPD was built in accordance with this protocol, containing 3,262 molecules and 19,821 records. This data aggregation protocol is of great help for constructing chemical information databases from original documents with accuracy, comprehensiveness, and efficiency. The structured chemical information database can facilitate access to medical intelligence and accelerate the transformation of scientific research achievements.
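The key linking idea above, using a canonical SMILES string as the primary key that ties structure, bibliography, and property tables together, can be sketched with RDKit and SQLite. The table names and columns below are illustrative, not the actual PKU-MNPD schema, and the example molecule and citation are made up.

```python
# Sketch: canonical SMILES as the primary key linking metadata tables.
# Assumes RDKit is installed; schema and example values are hypothetical.
import sqlite3
from rdkit import Chem

def canonical_smiles(smiles: str) -> str:
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Unparsable SMILES: {smiles}")
    return Chem.MolToSmiles(mol, canonical=True)

conn = sqlite3.connect("chem_info.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS molecules (smiles TEXT PRIMARY KEY, name TEXT);
CREATE TABLE IF NOT EXISTS citations (smiles TEXT, title TEXT, journal TEXT, year INTEGER,
                                      FOREIGN KEY (smiles) REFERENCES molecules (smiles));
""")

key = canonical_smiles("C1=CC=CC=C1O")      # phenol written non-canonically
conn.execute("INSERT OR IGNORE INTO molecules VALUES (?, ?)", (key, "phenol"))
conn.execute("INSERT INTO citations VALUES (?, ?, ?, ?)",
             (key, "Example article title", "Marine Drugs", 2016))
conn.commit()
```

Canonicalizing every incoming structure before insertion is what lets records extracted from different documents land on the same key.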
Cathedrals, Casinos, Colleges and Classrooms: Questions for the Architects of Digital Campuses
ERIC Educational Resources Information Center
McCluskey, Frank; Winter, Melanie
2013-01-01
The bricks and mortar classroom has a long and storied history. The digital classroom is so new and different it may be wrong to even call it a "classroom". The authors argue that architecture influences behavior. So in constructing our new digital classrooms we must pay attention to the architecture and what job we want that…
ERIC Educational Resources Information Center
Hall, Taffey
2013-01-01
The purpose of this study was to explore the construction of a collaborative Baptist digital library and archive on the Internet. The study investigated how a central electronic location of digitized Baptist primary source materials could look and work on the Internet and how such a project could benefit Baptist history professors, the primary…
First results of MAO NASU SS bodies photographic archive digitizing
NASA Astrophysics Data System (ADS)
Pakuliak, L.; Andruk, V.; Shatokhina, S.; Golovnya, V.; Yizhakevych, O.; Kulyk, I.
2013-05-01
The MAO NASU glass archive encloses about 1800 photographic plates with planets and their satellites (including nearly 80 images of Uranus, Pluto, and Neptune), about 1700 plates with minor planets, and about 900 plates with comets. The plates were made during 1949-1999 using 11 telescopes of different focal lengths, mostly the Double Wide-angle Astrograph (F/D=2000/400) and the Double Long-focus Astrograph (F/D=5500/400) of MAO NASU. Observational sites are Kyiv, Lviv (Ukraine), Biurakan (Armenia), Abastumani (Georgia), Mt. Maidanak (Uzbekistan), and Quito (Ecuador). Tables contain data about the most significant numbers of plates, subdivided by years and objects. The database with metadata of the plates (DBGPA) is available on the computer cluster of MAO (http://gua.db.ukr-vo.org) via open access. The database accumulates the archives of four Ukrainian observatories involved in the UkrVO national project. Together with the archive managing system, the database serves as a test area for the JDA (Joint Digital Archive), the core of the UkrVO.
[Establishment of Oncomelania hupensis snail database based on smartphone and Google Earth].
Wang, Wei-chun; Zhan, Ti; Zhu, Ying-fu
2015-02-01
To establish an Oncomelania hupensis snail database based on a smartphone and Google Earth. The HEAD GPS software was first loaded onto the smartphone. The GPS data of the snails were collected with the smartphone. The original data were exported to the computer in KML/KMZ format and then converted into Excel file format using software. Finally, the laboratory-based results were entered, and the digital snail data were established. The data were converted into KML and then displayed visually in Google Earth. The snail data of a 5-hm² beach along the Yangtze River were collected, and the distribution of the snails based on Google Earth was obtained. The database of the snails was built. A query function was implemented for the total number of snails, the living snails, and the schistosome-infected snails in each survey frame. Digital management of the snail data is realized by using the smartphone and Google Earth.
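The KML-to-spreadsheet step described above can be sketched with the Python standard library. This is not the authors' workflow; the file names and the simple Placemark layout (one point per survey frame, frame label in the name element) are assumptions about the smartphone export.

```python
# Sketch: flatten snail survey points from a smartphone KML export into a
# spreadsheet-style CSV that laboratory results can be joined to later.
import csv
import xml.etree.ElementTree as ET

NS = {"kml": "http://www.opengis.net/kml/2.2"}

def kml_to_rows(kml_path: str):
    rows = []
    root = ET.parse(kml_path).getroot()
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        name = pm.findtext("kml:name", default="", namespaces=NS)
        coords = pm.findtext(".//kml:coordinates", default="", namespaces=NS).strip()
        if coords:
            lon, lat = coords.split(",")[:2]
            rows.append({"frame": name, "lon": float(lon), "lat": float(lat)})
    return rows

rows = kml_to_rows("snail_survey.kml")          # assumed export file name
with open("snail_survey.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["frame", "lon", "lat"])
    writer.writeheader()
    writer.writerows(rows)
```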
Silva-Lopes, Victor W; Monteiro-Leal, Luiz H
2003-07-01
The development of new technology and the possibility of fast information delivery by either Internet or Intranet connections are changing education. Microanatomy education depends basically on the correct interpretation of microscopy images by students. Modern microscopes coupled to computers enable the presentation of these images in a digital form by creating image databases. However, the access to this new technology is restricted entirely to those living in cities and towns with an Information Technology (IT) infrastructure. This study describes the creation of a free Internet histology database composed by high-quality images and also presents an inexpensive way to supply it to a greater number of students through Internet/Intranet connections. By using state-of-the-art scientific instruments, we developed a Web page (http://www2.uerj.br/~micron/atlas/atlasenglish/index.htm) that, in association with a multimedia microscopy laboratory, intends to help in the reduction of the IT educational gap between developed and underdeveloped regions. Copyright 2003 Wiley-Liss, Inc.
Our journey to digital curation of the Jeghers Medical Index.
Gawdyda, Lori; Carter, Kimbroe; Willson, Mark; Bedford, Denise
2017-07-01
Harold Jeghers, a well-known medical educator of the twentieth century, maintained a print collection of about one million medical articles from the late 1800s to the 1990s. This case study discusses how a print collection of these articles was transformed to a digital database. Staff in the Jeghers Medical Index, St. Elizabeth Youngstown Hospital, converted paper articles to Adobe portable document format (PDF)/A-1a files. Optical character recognition was used to obtain searchable text. The data were then incorporated into a specialized database. Lastly, articles were matched to PubMed bibliographic metadata through automation and human review. An online database of the collection was ultimately created. The collection was made part of a discovery search service, and semantic technologies have been explored as a method of creating access points. This case study shows how a small medical library made medical writings of the nineteenth and twentieth centuries available in electronic format for historic or semantic research, highlighting the efficiencies of contemporary information technology.
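One plausible way to automate the "matching articles to PubMed bibliographic metadata" step mentioned above is to query the NCBI E-utilities with an OCR-extracted title. The sketch below shows that approach only as an illustration; it is not the Jeghers project's actual tooling, and the example title string is a placeholder.

```python
# Sketch: look up a PMID for an OCR-derived article title via NCBI E-utilities.
# Illustrative only; real pipelines add fuzzy matching and human review.
from typing import Optional
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def find_pmid(title: str) -> Optional[str]:
    """Return the best-matching PMID for an article title, or None."""
    resp = requests.get(
        f"{EUTILS}/esearch.fcgi",
        params={"db": "pubmed", "term": f"{title}[Title]", "retmode": "json", "retmax": 1},
        timeout=30,
    )
    resp.raise_for_status()
    ids = resp.json()["esearchresult"]["idlist"]
    return ids[0] if ids else None

print(find_pmid("OCR-extracted article title goes here"))   # placeholder title
```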
ERIC Educational Resources Information Center
Haugh, Dana
2016-01-01
The shift from physical materials to digital holdings has slowly infiltrated libraries across the globe, and librarians are struggling to make sense of these intangible, and sometimes fleeting, resources. Materials budgets have shifted to accommodate large journal and database subscriptions, single-title article access, and, most recently, e-book…
ERIC Educational Resources Information Center
Nunn, Samuel
1993-01-01
Assessed the impact of the Mobile Digital Terminal technology (computers used to communicate with remote crime databases) on motor vehicle theft clearance (arresting a perpetrator) and recovery rates in Fort Worth (Texas), using a time series analysis. Impact has been ambiguous, with little evidence of improved clearance or recovery. (SLD)
Digital Mapping Techniques '05--Workshop Proceedings, Baton Rouge, Louisiana, April 24-27, 2005
Soller, David R.
2005-01-01
Introduction: The Digital Mapping Techniques '05 (DMT'05) workshop was attended by more than 100 technical experts from 47 agencies, universities, and private companies, including representatives from 25 state geological surveys (see Appendix A). This workshop was similar in nature to the previous eight meetings, held in Lawrence, Kansas (Soller, 1997), in Champaign, Illinois (Soller, 1998), in Madison, Wisconsin (Soller, 1999), in Lexington, Kentucky (Soller, 2000), in Tuscaloosa, Alabama (Soller, 2001), in Salt Lake City, Utah (Soller, 2002), in Millersville, Pennsylvania (Soller, 2003), and in Portland, Oregon (Soller, 2004). This year's meeting was hosted by the Louisiana Geological Survey, from April 24-27, 2005, on the Louisiana State University campus in Baton Rouge, Louisiana. As in the previous meetings, the objective was to foster informal discussion and exchange of technical information. It is with great pleasure I note that the objective was successfully met, as attendees continued to share and exchange knowledge and information, and to renew friendships and collegial work begun at past DMT workshops. Each DMT workshop has been coordinated by the Association of American State Geologists (AASG) and U.S. Geological Survey (USGS) Data Capture Working Group, which was formed in August 1996 to support the AASG and the USGS in their effort to build a National Geologic Map Database (see Soller and Berg, this volume, and http://ngmdb.usgs.gov/info/standards/datacapt/). The Working Group was formed because increased production efficiencies, standardization, and quality of digital map products were needed for the database, and for the State and Federal geological surveys, to provide more high-quality digital maps to the public. At the 2005 meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; 6) continued development of the National Geologic Map Database; and 7) progress toward building and implementing a standard geologic map data model and standard science language for the U.S. and for North America.
Horvath , E.A.; Fosnight, E.A.; Klingebiel, A.A.; Moore, D.G.; Stone, J.E.; Reybold, W.U.; Petersen, G.W.
1987-01-01
A methodology has been developed to create a spatial database by referencing digital elevation, Landsat multispectral scanner data, and digitized soil premap delineations of a number of adjacent 7.5-min quadrangle areas to a 30-m Universal Transverse Mercator projection. Slope and aspect transformations are calculated from elevation data and grouped according to field office specifications. An unsupervised classification is performed on a brightness and greenness transformation of the spectral data. The resulting spectral, slope, and aspect maps of each of the 7.5-min quadrangle areas are then plotted and submitted to the field office to be incorporated into the soil premapping stages of a soil survey. A tabular database is created from spatial data by generating descriptive statistics for each data layer within each soil premap delineation. The tabular data base is then entered into a data base management system to be accessed by the field office personnel during the soil survey and to be used for subsequent resource management decisions. Large amounts of data are collected and archived during resource inventories for public land management. Often these data are stored as stacks of maps or folders in a file system in someone's office, with the maps in a variety of formats, scales, and with various standards of accuracy depending on their purpose. This system of information storage and retrieval is cumbersome at best when several categories of information are needed simultaneously for analysis or as input to resource management models. Computers now provide the resource scientist with the opportunity to design increasingly complex models that require even more categories of resource-related information, thus compounding the problem. Recently there has been much emphasis on the use of geographic information systems (GIS) as an alternative method for map data archives and as a resource management tool. Considerable effort has been devoted to the generation of tabular databases, such as the U.S. Department of Agriculture's SCS/S015 (Soil Survey Staff, 1983), to archive the large amounts of information that are collected in conjunction with mapping of natural resources in an easily retrievable manner. During the past 4 years the U.S. Geological Survey's EROS Data Center, in a cooperative effort with the Bureau of Land Management (BLM) and the Soil Conservation Service (SCS), developed a procedure that uses spatial and tabular databases to generate elevation, slope, aspect, and spectral map products that can be used during soil premapping. The procedure results in tabular data, residing in a database management system, that are indexed to the final soil delineations and help quantify soil map unit composition. The procedure was developed and tested on soil surveys on over 600 000 ha in Wyoming, Nevada, and Idaho. A transfer of technology from the EROS Data Center to the BLM will enable the Denver BLM Service Center to use this procedure in soil survey operations on BLM lands.
Also underway is a cooperative effort between the EROS Data Center and SCS to define and evaluate maps that can be produced as derivatives of digital elevation data for 7.5-min quadrangle areas, such as those used during the premapping stage of the soil surveys mentioned above, the idea being to make such products routinely available. The procedure emphasizes the applications of digital elevation and spectral data to order-three soil surveys on rangelands, and will: (1) incorporate digital terrain and spectral data into a spatial database for soil surveys; (2) provide hardcopy products (that can be generated from digital elevation model and spectral data) that are useful during the soil pre-mapping process; (3) incorporate soil premaps into a spatial database that can be accessed during the soil survey process along with terrain and spectral data; and (4) summarize useful quantitative information for soil mapping and for making interpretations for resource management.
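The slope and aspect transformations mentioned in this entry can be illustrated with a short NumPy sketch. The 30-m cell size matches the abstract, but the class breaks and the aspect sign convention are assumptions for illustration, not the procedure's actual specifications.

```python
# Sketch: slope and aspect grids derived from a digital elevation model (DEM),
# the kind of terrain derivative supplied to the field office for soil premapping.
import numpy as np

def slope_aspect(dem: np.ndarray, cell_size: float = 30.0):
    """Return slope (degrees) and aspect (degrees, 0-360); orientation convention simplified."""
    dzdy, dzdx = np.gradient(dem, cell_size)          # rise over run along rows and columns
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    aspect = np.degrees(np.arctan2(dzdx, dzdy)) % 360.0   # sign handling depends on grid orientation
    return slope, aspect

def group_slope(slope: np.ndarray) -> np.ndarray:
    """Group slope into classes; the breaks below are illustrative, not field-office specs."""
    breaks = [3, 8, 15, 30]                            # degrees
    return np.digitize(slope, breaks)
```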
Visualization and interaction tools for aerial photograph mosaics
NASA Astrophysics Data System (ADS)
Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António
1997-05-01
This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophotos mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.
Monica Lipscomb Smith; Weiqi Zhou; Mary Cadenasso; J. Morgan Grove; Lawrence Band
2010-01-01
We compared the National Land Cover Database (NLCD) 2001 land cover, impervious, and canopy data products to land cover data derived from 0.6-m resolution three-band digital imagery and ancillary data. We conducted this comparison at the 1 km2, 9 km2, and gauged watershed scales within the Baltimore Ecosystem Study to...
ERIC Educational Resources Information Center
Blau, Ina; Hameiri, Mira
2017-01-01
Digital educational data management has become an integral part of school practices. Accessing school database by teachers, students, and parents from mobile devices promotes data-driven educational interactions based on real-time information. This paper analyses mobile access of educational database in a large sample of 429 schools during an…
Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia
NASA Astrophysics Data System (ADS)
Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.
2016-09-01
A cadastral map is a parcel-based representation specifically designed to define the limits of boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatial technology, especially Geographical Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. The cadastral modernization will result in a new cadastral database that is no longer based on single, static parcel paper maps, but on a global digital map. Despite the strict process of cadastral modernization, this reform has raised unexpected queries that remain essential to address. The main focus of this study is to review the issues generated by this transition. The transformed cadastral database should be further treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. The results of this review will serve as a foundation for investigating systematic and effective methods for Positional Accuracy Improvement (PAI) in cadastral database modernization.
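One standard building block in positional accuracy improvement, though not necessarily the method DSMM uses, is a least-squares 2D Helmert (four-parameter similarity) transformation estimated from control points common to the legacy dataset and the new satellite-based frame. The sketch below is a minimal illustration of that idea under those assumptions.

```python
# Sketch: estimate and apply a 2-D Helmert (similarity) transformation from
# matched control points. Illustrative only, not DSMM's adjustment procedure.
import numpy as np

def fit_helmert_2d(src: np.ndarray, dst: np.ndarray):
    """src, dst: (n, 2) matched coordinates. Model: dst = [[a,-b],[b,a]] @ src + [tx, ty]."""
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    A[0::2, 0], A[0::2, 1], A[0::2, 2] = src[:, 0], -src[:, 1], 1.0   # x' rows
    A[1::2, 0], A[1::2, 1], A[1::2, 3] = src[:, 1], src[:, 0], 1.0    # y' rows
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params                                                     # a, b, tx, ty

def apply_helmert_2d(params, pts: np.ndarray) -> np.ndarray:
    a, b, tx, ty = params
    R = np.array([[a, -b], [b, a]])
    return pts @ R.T + np.array([tx, ty])
```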
Data Mining on Distributed Medical Databases: Recent Trends and Future Directions
NASA Astrophysics Data System (ADS)
Atilgan, Yasemin; Dogan, Firat
As computerization in healthcare services increases, the amount of available digital data is growing at an unprecedented rate, and as a result healthcare organizations are far better able to store data than to extract knowledge from it. Today the major challenge is to transform these data into useful information and knowledge. It is important for healthcare organizations to use stored data to improve quality while reducing cost. This paper first investigates data mining applications on centralized medical databases and how they are used for diagnostics and population health, then introduces distributed databases. The integration needs and issues of distributed medical databases are described. Finally, the paper focuses on data mining studies on distributed medical databases.
Database of Novel and Emerging Adsorbent Materials
National Institute of Standards and Technology Data Gateway
SRD 205 NIST/ARPA-E Database of Novel and Emerging Adsorbent Materials (Web, free access) The NIST/ARPA-E Database of Novel and Emerging Adsorbent Materials is a free, web-based catalog of adsorbent materials and measured adsorption properties of numerous materials obtained from article entries from the scientific literature. Search fields for the database include adsorbent material, adsorbate gas, experimental conditions (pressure, temperature), and bibliographic information (author, title, journal), and results from queries are provided as a list of articles matching the search parameters. The database also contains adsorption isotherms digitized from the cataloged articles, which can be compared visually online in the web application or exported for offline analysis.
Individuals on alert: digital epidemiology and the individualization of surveillance.
Samerski, Silja
2018-06-14
This article examines how digital epidemiology and eHealth coalesce into a powerful health surveillance system that fundamentally changes present notions of body and health. In the age of Big Data and Quantified Self, the conceptual and practical distinctions between individual and population body, personal and public health, surveillance and health care are diminishing. Expanding on Armstrong's concept of "surveillance medicine" to "quantified self medicine" and drawing on my own research on the symbolic power of statistical constructs in medical encounters, this article explores the impact of digital health surveillance on people's perceptions, actions and subjectivities. It discusses the epistemic confusions and paradoxes produced by a health care system that increasingly treats patients as risk profiles and prompts them to do the same, namely to perceive and manage themselves as a bundle of health and security risks. Since these risks are necessarily constructed in reference to epidemiological data that postulate a statistical gaze, they also construct or make-up disembodied "individuals on alert".
A new scale for the assessment of conjunctival bulbar redness.
Macchi, Ilaria; Bunya, Vatinee Y; Massaro-Giordano, Mina; Stone, Richard A; Maguire, Maureen G; Zheng, Yuanjie; Chen, Min; Gee, James; Smith, Eli; Daniel, Ebenezer
2018-06-05
Current scales for assessment of bulbar conjunctival redness have limitations for evaluating digital images. We developed a scale suited for evaluating digital images and compared it to the Validated Bulbar Redness (VBR) scale. From a digital image database of 4889 color-corrected bulbar conjunctival images, we identified 20 images with varied degrees of redness. These images, ten each of nasal and temporal views, constitute the Digital Bulbar Redness (DBR) scale. The chromaticity of these images was assessed with an established image processing algorithm. Using 100 unique, randomly selected images from the database, three trained, non-physician graders applied the DBR scale and printed VBR scale. Agreement was assessed with weighted kappa statistics (Kw). The DBR scale scores provide linear increments of 10 from 10-100 when redness is measured objectively with an established image processing algorithm. Exact agreement of all graders was 38%, and agreement with no more than a difference of ten units between graders was 91%. Kw for agreement between any two graders ranged from 0.57 to 0.73 for the DBR scale and from 0.38 to 0.66 for the VBR scale. The DBR scale allowed direct comparison of digital to digital images, could be used in dim lighting, had both temporal and nasal conjunctival reference images, and permitted viewing reference and test images at the same magnification. The novel DBR scale, with its objective linear chromatic steps, demonstrated improved reproducibility, fewer visualization artifacts, and improved ease of use over the VBR scale for assessing conjunctival redness. Copyright © 2018. Published by Elsevier Inc.
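The weighted kappa statistic used above is readily computed with scikit-learn. The grader scores in this sketch are made up for illustration on the 10-100 DBR steps; it is not the study's analysis code.

```python
# Sketch: weighted kappa agreement between two graders' DBR scores (10-100 in steps of 10).
from sklearn.metrics import cohen_kappa_score

grader_a = [20, 40, 40, 60, 80, 100, 10, 30, 50, 70]   # hypothetical scores
grader_b = [20, 30, 40, 60, 90, 100, 10, 40, 50, 70]

kw = cohen_kappa_score(grader_a, grader_b, weights="linear")
print(f"Weighted kappa (linear): {kw:.2f}")
```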
A pilot GIS database of active faults of Mt. Etna (Sicily): A tool for integrated hazard evaluation
NASA Astrophysics Data System (ADS)
Barreca, Giovanni; Bonforte, Alessandro; Neri, Marco
2013-02-01
A pilot GIS-based system has been implemented for the assessment and analysis of hazard related to active faults affecting the eastern and southern flanks of Mt. Etna. The system structure was developed in the ArcGis® environment and consists of different thematic datasets that include spatially referenced arc features and an associated database. Arc-type features, georeferenced in the WGS84 Ellipsoid UTM zone 33 projection, represent the five main fault systems that develop in the analysed region. The backbone of the GIS-based system is the large amount of information that was collected from the literature and then stored and properly geocoded in a digital database. This consists of thirty-five alphanumeric fields which include all fault parameters available from the literature, such as location, kinematics, landform, slip rate, etc. Although the system has been implemented according to the most common procedures used by GIS developers, the architecture and content of the database represent a pilot backbone for the digital storage of fault parameters, providing a powerful tool for modelling hazard related to the active tectonics of Mt. Etna. The database collects, organises, and shares all currently available scientific information about the active faults of the volcano. Furthermore, thanks to the strong effort spent on defining the fields of the database, the structure proposed in this paper is open to the collection of further data coming from future improvements in the knowledge of the fault systems. By layering additional user-specific geographic information and managing the proposed database (topological querying), a great diversity of hazard and vulnerability maps can be produced by the user. This is a proposal of a backbone for a comprehensive geographical database of fault systems, universally applicable to other sites.
Electronic atlas of the Russian Arctic coastal zone: natural conditions and technogenic risk
NASA Astrophysics Data System (ADS)
Drozdov, D. S.; Rivkin, F. M.; Rachold, V.
2004-12-01
The Arctic coast is characterized by a diversity of geological-geomorphological structures and geocryological conditions, which are expected to respond differently to changes in the natural environment and in anthropogenic impacts. At present, oil fields are prospected and developed and permanent and temporary ports are constructed in the Arctic regions of Russia. Thus, profound understanding of the processes involved and measures of nature conservation for the coastal zone of the Arctic Seas are required. One of the main fields of Arctic coastal investigation, and of building databases of coastal conditions, is the mapping of the coasts. This poster presents a set of digital maps including geology, quaternary sediments, landscapes, engineering geology, vegetation, geocryology and a series of regional sources, which have been selected to characterize the Russian Arctic coast. The area covered in this work includes the 200-km-wide band along the entire Russian Arctic coast from the Norwegian boundary in the west to the Bering Strait in the east. Methods included the collection of the majority of available hard copies of cartographic material and their digital formats and the transformation of these sources into a uniform digital graphic format. The atlas consists of environmental maps and maps of engineering-geological zoning. The set of environmental maps includes geology, quaternary sediments, landscapes and vegetation of the Russian Arctic coast at a scale of 1:4000000. The set of engineering-geocryological maps includes a map of engineering-geocryological zoning of the Russian Arctic coast, a map of the intensity of destructive coastal processes and a map of industrial impact risk assessment (1:8000000 scale). Detailed mapping has been performed for key sites (at a scale of 1:100000) in order to enable more precise estimates of the intensity of destructive coastal processes and industrial impact. The engineering-geocryological map of the Russian Arctic coast was compiled based on the analysis of geotechnical and geocryological conditions in the areas adjacent to the coastal band. Industrial impact has been assessed differently for each engineering-geocryological region distinguished on the coast, considering the technological features of construction and engineering facilities: aerial construction, highways and airdromes, underground pipelines (with positive and negative pipe temperatures), surface pipelines, and quarries. The atlas is being used as a base for the circum-Arctic segmentation of the coastline and the analyses of coastal dynamics within the Arctic Coastal Dynamics (ACD) Project. The work has been supported by INTAS (project number 01-2332).
The virtual case: a new method to completely digitize cytological and histological slides.
Demichelis, F; Barbareschi, M; Dalla Palma, P; Forti, S
2002-08-01
The purpose of this study was to present a new method for handling histological/cytological cases. Thanks to the introduction of information technology in pathology, including the amenities afforded by robotic microscopes and digital imaging, tissue slides can be represented and evaluated using digital techniques in order to construct virtual cases through completely automated procedures. A virtual case (VC) is composed of a collection of digital images representing a histological/cytological slide at all magnification levels together with all relevant clinical data. In the present study, we describe an automated system to manage robotic microscope and image acquisition for the proper construction of VCs. These can then be viewed on a computer by means of an interface ("user-friendly") that allows one to select the more appropriate fields and to examine them at different magnifications, rapidly going from panoramic views to high resolution and vice versa. In comparison with glass slides, VCs have several advantages arising from their digital nature and can be considered a common platform for a wide range of applications such as teleconsultation, education, research, and quality control and proficiency tests.
Data mining and visualization of average images in a digital hand atlas
NASA Astrophysics Data System (ADS)
Zhang, Aifeng; Gertych, Arkadiusz; Liu, Brent J.; Huang, H. K.
2005-04-01
We have collected a digital hand atlas containing digitized left-hand radiographs of normally developed children grouped by age, sex, and race. A set of features reflecting the patient's stage of skeletal development has been calculated by automatic image processing procedures and stored in a database. This paper addresses a new concept, the "average" image in the digital hand atlas. The "average" reference image in the digital atlas is selected for each group of normally developed children as the image with the most representative skeletal maturity based on bony features. A data mining procedure was designed and applied to find the average image through average feature vector matching. It also provides a temporary solution for the missing feature problem through polynomial regression. As more cases are added to the digital hand atlas, it can grow to provide clinicians accurate reference images to aid the bone age assessment process.
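The two ideas in this abstract, selecting the "average" image by feature-vector matching and filling a missing feature by polynomial regression, can be sketched as follows. The feature layout, the Euclidean distance criterion, and the regression degree are assumptions for illustration, not the atlas code.

```python
# Sketch: (1) pick the most "average" case of an age/sex/race group as the one
# whose feature vector is closest to the group mean; (2) estimate a missing
# bony feature at a given age by polynomial regression over cases that have it.
import numpy as np

def average_image_index(features: np.ndarray) -> int:
    """features: (n_cases, n_features). Return index of the most 'average' case."""
    mean_vec = features.mean(axis=0)
    dists = np.linalg.norm(features - mean_vec, axis=1)
    return int(np.argmin(dists))

def fill_missing_feature(ages: np.ndarray, values: np.ndarray, query_age: float,
                         degree: int = 2) -> float:
    """values may contain NaN where the feature is missing; fit on the rest."""
    mask = ~np.isnan(values)
    coeffs = np.polyfit(ages[mask], values[mask], degree)
    return float(np.polyval(coeffs, query_age))
```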
Digital Inclusion & Health Communication: A Rapid Review of Literature.
Borg, Kim; Boulet, Mark; Smith, Liam; Bragge, Peter
2018-06-11
Information and communication technologies can be a valuable tool for enhancing health communication. However, not everyone is utilising the wide suite of digital opportunities. This disparity has the potential to exacerbate existing social and health inequalities, particularly among vulnerable groups such as those who are in poor health and the elderly. This review aimed to systematically identify the common barriers to, and facilitators of, digital inclusion. A comprehensive database search yielded 969 citations. Following screening, seven systematic reviews and three non-systematic reviews were identified. Collectively, the reviews found that physical access continues to be a barrier to digital inclusion. However, provision of access alone is insufficient, as digital ability and attitude were also potential barriers. Social support, direct user experience and collaborative learning/design were identified as key strategies to improve inclusion. These review findings provide guidance for health communication practitioners in designing and implementing effective programmes in the digital environment.
Occupational exposure to silica in construction workers: a literature-based exposure database.
Beaudry, Charles; Lavoué, Jérôme; Sauvé, Jean-François; Bégin, Denis; Senhaji Rhazi, Mounia; Perrault, Guy; Dion, Chantal; Gérin, Michel
2013-01-01
We created an exposure database of respirable crystalline silica levels in the construction industry from the literature. We extracted silica and dust exposure levels in publications reporting silica exposure levels or quantitative evaluations of control effectiveness published in or after 1990. The database contains 6118 records (2858 of respirable crystalline silica) extracted from 115 sources, summarizing 11,845 measurements. Four hundred and eighty-eight records represent summarized exposure levels instead of individual values. For these records, the reported summary parameters were standardized into a geometric mean and a geometric standard deviation. Each record is associated with 80 characteristics, including information on trade, task, materials, tools, sampling strategy, analytical methods, and control measures. Although the database was constructed in French, 38 essential variables were standardized and translated into English. The data span the period 1974-2009, with 92% of the records corresponding to personal measurements. Thirteen standardized trades and 25 different standardized tasks are associated with at least five individual silica measurements. Trade-specific respirable crystalline silica geometric means vary from 0.01 (plumber) to 0.30 mg/m³ (tunnel construction skilled labor), while tasks vary from 0.01 (six categories, including sanding and electrical maintenance) to 1.59 mg/m³ (abrasive blasting). Despite limitations associated with the use of literature data, this database can be analyzed using meta-analytical and multivariate techniques and currently represents the most important source of exposure information about silica exposure in the construction industry. It is available on request to the research community.
NASA Astrophysics Data System (ADS)
Gulen, L.; EMME WP2 Team*
2011-12-01
The Earthquake Model of the Middle East (EMME) Project is a regional project of the GEM (Global Earthquake Model) project (http://www.emme-gem.org/). The EMME project covers Turkey, Georgia, Armenia, Azerbaijan, Syria, Lebanon, Jordan, Iran, Pakistan, and Afghanistan. Both EMME and SHARE projects overlap and Turkey becomes a bridge connecting the two projects. The Middle East region is tectonically and seismically very active part of the Alpine-Himalayan orogenic belt. Many major earthquakes have occurred in this region over the years causing casualties in the millions. The EMME project consists of three main modules: hazard, risk, and socio-economic modules. The EMME project uses PSHA approach for earthquake hazard and the existing source models have been revised or modified by the incorporation of newly acquired data. The most distinguishing aspect of the EMME project from the previous ones is its dynamic character. This very important characteristic is accomplished by the design of a flexible and scalable database that permits continuous update, refinement, and analysis. An up-to-date earthquake catalog of the Middle East region has been prepared and declustered by the WP1 team. EMME WP2 team has prepared a digital active fault map of the Middle East region in ArcGIS format. We have constructed a database of fault parameters for active faults that are capable of generating earthquakes above a threshold magnitude of Mw≥5.5. The EMME project database includes information on the geometry and rates of movement of faults in a "Fault Section Database", which contains 36 entries for each fault section. The "Fault Section" concept has a physical significance, in that if one or more fault parameters change, a new fault section is defined along a fault zone. So far 6,991 Fault Sections have been defined and 83,402 km of faults are fully parameterized in the Middle East region. A separate "Paleo-Sites Database" includes information on the timing and amounts of fault displacement for major fault zones. A digital reference library, that includes the pdf files of relevant papers, reports and maps, is also prepared. A logic tree approach is utilized to encompass different interpretations for the areas where there is no consensus. Finally seismic source zones in the Middle East region have been delineated using all available data. *EMME Project WP2 Team: Levent Gülen, Murat Utkucu, M. Dinçer Köksal, Hilal Yalçin, Yigit Ince, Mine Demircioglu, Shota Adamia, Nino Sadradze, Aleksandre Gvencadze, Arkadi Karakhanyan, Mher Avanesyan, Tahir Mammadli, Gurban Yetirmishli, Arif Axundov, Khaled Hessami, M. Asif Khan, M. Sayab.
Learning and Digital Portfolios
ERIC Educational Resources Information Center
Moran, Wendy; Vozzo, Les; Reid, Jo-Anne; Pietsch, Marilyn; Hatton, Caroline
2013-01-01
Utilising appropriate Information Communication Technologies (ICT) as instructional tools in teacher education can be a challenging yet worthwhile endeavour. This paper reports the difficulties and benefits of a recent inter-university project requiring preservice primary teachers to construct professional digital portfolios using the support of…
Sweetkind, Donald S.
2017-09-08
As part of a U.S. Geological Survey study in cooperation with the Bureau of Reclamation, a digital three-dimensional hydrogeologic framework model was constructed for the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico. This model was constructed to define the aquifer system geometry and subsurface lithologic characteristics and distribution for use in a regional numerical hydrologic model. The model includes five hydrostratigraphic units: river channel alluvium, three informal subdivisions of Santa Fe Group basin fill, and an undivided pre-Santa Fe Group bedrock unit. Model input data were compiled from published cross sections, well data, structure contour maps, selected geophysical data, and contiguous compilations of surficial geology and structural features in the study area. These data were used to construct faulted surfaces that represent the upper and lower subsurface hydrostratigraphic unit boundaries. The digital three-dimensional hydrogeologic framework model is constructed through combining faults, the elevation of the tops of each hydrostratigraphic unit, and boundary lines depicting the subsurface extent of each hydrostratigraphic unit. The framework also compiles a digital representation of the distribution of sedimentary facies within each hydrostratigraphic unit. The digital three-dimensional hydrogeologic model reproduces with reasonable accuracy the previously published subsurface hydrogeologic conceptualization of the aquifer system and represents the large-scale geometry of the subsurface aquifers. The model is at a scale and resolution appropriate for use as the foundation for a numerical hydrologic model of the study area.
Digital Versus Conventional Impressions in Fixed Prosthodontics: A Review.
Ahlholm, Pekka; Sipilä, Kirsi; Vallittu, Pekka; Jakonen, Minna; Kotiranta, Ulla
2018-01-01
To conduct a systematic review to evaluate the evidence of possible benefits and accuracy of digital impression techniques vs. conventional impression techniques. Reports of digital impression techniques versus conventional impression techniques were systematically searched for in the following databases: Cochrane Central Register of Controlled Trials, PubMed, and Web of Science. A combination of controlled vocabulary, free-text words, and well-defined inclusion and exclusion criteria guided the search. Digital impression accuracy is at the same level as conventional impression methods in fabrication of crowns and short fixed dental prostheses (FDPs). For fabrication of implant-supported crowns and FDPs, digital impression accuracy is clinically acceptable. In full-arch impressions, conventional impression methods resulted in better accuracy compared to digital impressions. Digital impression techniques are a clinically acceptable alternative to conventional impression methods in fabrication of crowns and short FDPs. For fabrication of implant-supported crowns and FDPs, digital impression systems also result in clinically acceptable fit. Digital impression techniques are faster and can shorten the operation time. Based on this study, the conventional impression technique is still recommended for full-arch impressions. © 2016 by the American College of Prosthodontists.
Converting analog interpretive data to digital formats for use in database and GIS applications
Flocks, James G.
2004-01-01
There is a growing need by researchers and managers for comprehensive and unified nationwide datasets of scientific data. These datasets must be in a digital format that is easily accessible using database and GIS applications, providing the user with access to a wide variety of current and historical information. Although most data currently being collected by scientists are already in a digital format, there is still a large repository of information in the literature and paper archives. Converting this information into a format accessible by computer applications is typically very difficult and can result in loss of data. However, since scientific data are commonly collected in a repetitious, concise manner (e.g., forms, tables, graphs, etc.), these data can be recovered digitally by using a conversion process that relates the position of an attribute in two-dimensional space to the information that the attribute signifies. For example, if a table contains a certain piece of information in a specific row and column, then the space that the row and column occupies becomes an index of that information. An index key is used to identify the relation between the physical location of the attribute and the information the attribute contains. The conversion process can be achieved rapidly, easily, and inexpensively using widely available digitizing and spreadsheet software and simple programming code. In the geological sciences, sedimentary character is commonly interpreted from geophysical profiles and descriptions of sediment cores. In the field and laboratory, these interpretations were typically transcribed to paper. The information from these paper archives is still relevant and increasingly important to scientists, engineers, and managers seeking to understand geologic processes affecting our environment. Direct scanning of this information produces a raster facsimile of the data, which allows it to be linked to the electronic world, but true integration of the content with database and GIS software as point, vector, or text information is commonly lost. Sediment core descriptions and interpretations of geophysical profiles are usually portrayed as lines, curves, symbols, and text information. They have vertical and horizontal dimensions associated with depth, category, time, or geographic position. These dimensions are displayed in consistent positions, which can be digitized and converted to a digital format, such as a spreadsheet. Once these data are in a digital, tabulated form, they can easily be made available to a wide variety of imaging and data manipulation software for compilation and world-wide dissemination.
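A minimal sketch of the index-key idea described above, assuming digitized points come back as (x, y) coordinates from a digitizing tablet, that horizontal position encodes which descriptive column a point belongs to, and that vertical position encodes core depth (the column boundaries and depth scale are invented for illustration):

    # Convert digitized (x, y) tablet coordinates into tabulated core-description records.
    # Column boundaries and the depth scale are illustrative assumptions, not values from the paper.

    COLUMN_KEY = [          # index key: horizontal band -> attribute the band signifies
        (0.0, 2.0, "lithology"),
        (2.0, 4.0, "grain_size"),
        (4.0, 6.0, "color"),
    ]

    DEPTH_PER_UNIT_Y = 0.5  # metres of core depth represented by one tablet unit in y

    def classify(x, y, value):
        """Relate the position of a digitized point to the information it signifies."""
        depth_m = y * DEPTH_PER_UNIT_Y
        for x_min, x_max, attribute in COLUMN_KEY:
            if x_min <= x < x_max:
                return {"depth_m": depth_m, "attribute": attribute, "value": value}
        return {"depth_m": depth_m, "attribute": "unknown", "value": value}

    # Example: a point digitized at x=2.7, y=10.0 labelled "coarse sand"
    record = classify(2.7, 10.0, "coarse sand")
    print(record)   # {'depth_m': 5.0, 'attribute': 'grain_size', 'value': 'coarse sand'}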
Construction of a Linux based chemical and biological information system.
Molnár, László; Vágó, István; Fehér, András
2003-01-01
A chemical and biological information system with a Web-based, easy-to-use interface and corresponding databases has been developed. The constructed system incorporates all chemical, numerical, and textual data related to the chemical compounds, including numerical biological screen results. Users can search the database by traditional textual/numerical and/or substructure or similarity queries through the web interface. To build our chemical database management system, we utilized existing IT components such as ORACLE or Tripos SYBYL for database management and the Zope application server for the web interface. We chose Linux as the main platform; however, almost every component can be used under various operating systems.
East-China Geochemistry Database (ECGD):A New Networking Database for North China Craton
NASA Astrophysics Data System (ADS)
Wang, X.; Ma, W.
2010-12-01
The North China Craton is one of the best natural laboratories for investigating fundamental questions of Earth dynamics[1]. Scientists have made much progress in research on this area and have produced a vast amount of geochemical data, which are essential for answering many fundamental questions about the age, composition, structure, and evolution of the East China area. However, these geochemical data have long been accessible only through the scientific literature and theses, where they are widely dispersed, making it difficult for the broad geosciences community to find, access, and efficiently use the full range of available data[2]. How can the existing geochemical data for the North China Craton area be effectively stored, managed, shared, and reused? The East-China Geochemistry Database (ECGD) is a networked geochemical database system designed on the basis of WebGIS and a relational database for the structured storage and retrieval of geochemical data and geological map information. It integrates the functions of data retrieval, spatial visualization, and online analysis. ECGD focuses on three areas: 1. Storage and retrieval of geochemical data and geological map information. Based on a study of the character of geochemical data, including how the data are composed and interconnected, we designed a relational database, built on a geochemical relational data model, to store a variety of geological sample information such as sampling locality, age, sample characteristics, reference, major elements, rare earth elements, trace elements, and isotope systems. A user-friendly web interface is provided for constructing queries. 2. Data viewing. ECGD supports online data visualization in several ways, in particular dynamic viewing of data on digital maps. Because ECGD integrates WebGIS technology, query results can be mapped on a digital map that can be zoomed, panned, and queried point by point. Besides viewing and exporting query results in html, txt, or xls formats, researchers can also generate thematic classification maps from query results according to different parameters. 3. Online data analysis. A number of geochemical online analysis tools have been designed, including geochemical diagrams and CIPW norm computation, which allow researchers to analyze query results without downloading them. These analysis tools are easy to operate, requiring only one or two mouse clicks. In summary, ECGD provides a geochemical platform that helps researchers know where various data are, view the data in a synthetic and dynamic way, and analyze data of interest online. REFERENCES [1] S. Gao, R.L. Rudnick, and W.L. Xu, "Recycling deep cratonic lithosphere and generation of intraplate magmatism in the North China Craton," Earth and Planetary Science Letters, 270, 41-53, 2008. [2] K.A. Lehnert, U. Harms, and E. Ito, "Promises, Achievements, and Challenges of Networking Global Geoinformatics Resources - Experiences of GeosciNET and EarthChem," Geophysical Research Abstracts, Vol. 10, EGU2008-A-05242, 2008.
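A minimal sketch of the kind of relational storage and query underlying such a system, using Python's built-in sqlite3 as a stand-in for the production database (the table and column names are illustrative, not ECGD's actual schema):

    import sqlite3

    conn = sqlite3.connect(":memory:")   # stand-in for the production relational database
    cur = conn.cursor()

    # Illustrative schema: one table of samples, one table of element concentrations.
    cur.execute("""CREATE TABLE samples (
        sample_id TEXT PRIMARY KEY, locality TEXT, longitude REAL, latitude REAL,
        age_ma REAL, rock_type TEXT, reference TEXT)""")
    cur.execute("""CREATE TABLE concentrations (
        sample_id TEXT, element TEXT, value_ppm REAL,
        FOREIGN KEY (sample_id) REFERENCES samples (sample_id))""")

    cur.execute("INSERT INTO samples VALUES ('NCC-001', 'Taihang Mts.', 113.5, 37.8, 125.0, 'basalt', 'hypothetical ref.')")
    cur.execute("INSERT INTO concentrations VALUES ('NCC-001', 'La', 25.3)")

    # The kind of query a web form might issue: all La values for Cretaceous basalts.
    cur.execute("""SELECT s.sample_id, c.value_ppm FROM samples s
                   JOIN concentrations c ON s.sample_id = c.sample_id
                   WHERE s.rock_type = 'basalt' AND c.element = 'La' AND s.age_ma BETWEEN 66 AND 145""")
    print(cur.fetchall())
    conn.close()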
Naessens, James M; Visscher, Sue L; Peterson, Stephanie M; Swanson, Kristi M; Johnson, Matthew G; Rahman, Parvez A; Schindler, Joe; Sonneborn, Mark; Fry, Donald E; Pine, Michael
2015-08-01
Assess algorithms for linking patients across de-identified databases without compromising confidentiality. Hospital discharges from 11 Mayo Clinic hospitals during January 2008-September 2012 (assessment and validation data). Minnesota death certificates and hospital discharges from 2009 to 2012 for entire state (application data). Cross-sectional assessment of sensitivity and positive predictive value (PPV) for four linking algorithms tested by identifying readmissions and posthospital mortality on the assessment data with application to statewide data. De-identified claims included patient gender, birthdate, and zip code. Assessment records were matched with institutional sources containing unique identifiers and the last four digits of Social Security number (SSNL4). Gender, birthdate, and five-digit zip code identified readmissions with a sensitivity of 98.0 percent and a PPV of 97.7 percent and identified postdischarge mortality with 84.4 percent sensitivity and 98.9 percent PPV. Inclusion of SSNL4 produced nearly perfect identification of readmissions and deaths. When applied statewide, regions bordering states with unavailable hospital discharge data had lower rates. Addition of SSNL4 to administrative data, accompanied by appropriate data use and data release policies, can enable trusted repositories to link data with nearly perfect accuracy without compromising patient confidentiality. States maintaining centralized de-identified databases should add SSNL4 to data specifications. © Health Research and Educational Trust.
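A minimal sketch of the deterministic linkage the study evaluates, matching de-identified records on gender, birthdate, and five-digit ZIP code, with the last four digits of the SSN (SSNL4) as an optional additional key (the field names and sample records are hypothetical):

    # Deterministic linkage of de-identified records on quasi-identifiers.
    # Field names and example records are hypothetical illustrations of the algorithms assessed.

    def link_key(rec, use_ssnl4=False):
        key = (rec["gender"], rec["birthdate"], rec["zip5"])
        return key + (rec["ssnl4"],) if use_ssnl4 else key

    discharges = [{"gender": "F", "birthdate": "1948-03-02", "zip5": "55901", "ssnl4": "1234", "id": "A"}]
    deaths     = [{"gender": "F", "birthdate": "1948-03-02", "zip5": "55901", "ssnl4": "1234", "id": "B"}]

    index = {link_key(d, use_ssnl4=True): d for d in deaths}
    matches = [(d["id"], index[link_key(d, use_ssnl4=True)]["id"])
               for d in discharges if link_key(d, use_ssnl4=True) in index]
    print(matches)   # [('A', 'B')]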
Preserving privacy of online digital physiological signals using blind and reversible steganography.
Shiu, Hung-Jr; Lin, Bor-Sing; Huang, Chien-Hung; Chiang, Pei-Ying; Lei, Chin-Laung
2017-11-01
Physiological signals such as electrocardiograms (ECG) and electromyograms (EMG) are widely used to diagnose diseases. Presently, the Internet offers numerous cloud storage services which enable digital physiological signals to be uploaded for convenient access and use. Numerous online databases of medical signals have been built. The data in them must be processed in a manner that preserves patients' confidentiality. A reversible error-correcting-coding strategy is adopted to transform digital physiological signals into a new bit-stream, using a matrix built on the Hamming code to pass secret messages or private information. The shared keys are the matrix and the version of the Hamming code. An online open database, the MIT-BIH arrhythmia database, was used to test the proposed algorithms. The time complexity, capacity, and robustness are evaluated, and comparisons with related work across several evaluation criteria are also presented. This work proposes a reversible, low-payload steganographic scheme for preserving the privacy of physiological signals. An (n, m)-Hamming code is used to insert (n - m) secret bits into n bits of a cover signal. The number of embedded bits per modification is higher than in comparable methods, the computation is efficient, and the scheme is secure. Unlike other Hamming-code based schemes, the proposed scheme is both reversible and blind. Copyright © 2017 Elsevier B.V. All rights reserved.
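A minimal sketch of matrix embedding with the standard (7, 4) Hamming code, which hides n - m = 3 secret bits in n = 7 cover bits while flipping at most one bit; this illustrates only the embed/extract step of the class of scheme described, not the reversibility mechanism the paper adds on top of it:

    import numpy as np

    # Columns of H are the binary representations of 1..7 (standard Hamming(7,4) parity-check matrix).
    H = np.array([[(i >> b) & 1 for i in range(1, 8)] for b in (2, 1, 0)])

    def embed(cover7, msg3):
        """Hide 3 message bits in 7 cover bits, changing at most one cover bit (matrix embedding)."""
        x = cover7.copy()
        syndrome = H.dot(x) % 2
        d = syndrome ^ msg3                      # syndrome change still needed
        idx = int("".join(map(str, d)), 2)       # column value 0..7; 0 means no change needed
        if idx:
            x[idx - 1] ^= 1                      # flip the single bit whose H-column equals d
        return x

    def extract(stego7):
        return H.dot(stego7) % 2                 # blind extraction: the message is the syndrome

    cover = np.array([1, 0, 1, 1, 0, 0, 1])
    msg = np.array([1, 0, 1])
    stego = embed(cover, msg)
    assert (extract(stego) == msg).all() and (stego != cover).sum() <= 1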
ERIC Educational Resources Information Center
Clyde, Jerremie; Wilkinson, Glenn R.
2012-01-01
The gamic mode is an innovative way of authoring scholarly history that goes beyond the printed text or digital simulations by using digital game technologies to allow the reader to interact with a scholarly argument through meaningful choice and trial and error. The gamic mode makes the way in which the past is constructed as history explicit by…
Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases
NASA Astrophysics Data System (ADS)
Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.
2013-12-01
NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a six-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the onboard Digital Relief Maps (DRMs) will be used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control, and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks have been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS Onboard Receiver Algorithms and databases development, respectively.
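A minimal sketch of the histogram test described above: photon event heights inside the DEM-bounded search window are binned, and bins whose counts are improbably high for a Poisson background are flagged as the surface signal (the bin width, window, and significance threshold are illustrative, not the flight algorithm's values):

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated photon event heights (m) inside the DEM-bounded search window: uniform noise
    # plus a cluster of correlated surface returns near 250 m.  Values are illustrative only.
    noise = rng.uniform(0.0, 500.0, size=2000)
    surface = rng.normal(250.0, 0.5, size=300)
    heights = np.concatenate([noise, surface])

    bin_width = 2.0                                   # metres; an assumed bin size
    edges = np.arange(0.0, 500.0 + bin_width, bin_width)
    counts, _ = np.histogram(heights, bins=edges)

    # Poisson background model: flag bins more than 5 sigma above the typical background count.
    background = np.median(counts)                    # robust estimate of the noise level per bin
    threshold = background + 5.0 * np.sqrt(background)
    signal_bins = np.flatnonzero(counts > threshold)

    signal_height = edges[signal_bins].mean() + bin_width / 2
    print(f"signal located near {signal_height:.1f} m")   # ~250 m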
78 FR 70020 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
...; citizenship; physical characteristics; employment and military service history; credit references and credit... digital images, and in electronic databases. Background investigation forms are maintained in the...
Image correlation method for DNA sequence alignment.
Curilem Saldías, Millaray; Villarroel Sassarini, Felipe; Muñoz Poblete, Carlos; Vargas Vásquez, Asticio; Maureira Butler, Iván
2012-01-01
The complexity of searches and the volume of genomic data make sequence alignment one of bioinformatics' most active research areas. New alignment approaches have incorporated digital signal processing techniques. Among these, correlation methods are highly sensitive. This paper proposes a novel sequence alignment method based on 2-dimensional images, where each nucleic acid base is represented as a fixed gray-intensity pixel. Query and known database sequences are coded to their pixel representation, and sequence alignment is handled as an object-recognition-in-a-scene problem: the query and the database become the object and the scene, respectively. An image correlation process is carried out in order to search for the best match between them. Given that this procedure can be implemented in an optical correlator, the correlation could eventually be accomplished at light speed. This paper shows an initial research stage where results were "digitally" obtained by simulating an optical correlation of DNA sequences represented as images. A total of 303 queries (variable lengths from 50 to 4500 base pairs) and 100 scenes represented by 100 x 100 images each (in total, a one-million-base-pair database) were considered for the image correlation analysis. The results showed that correlations reached very high sensitivity (99.01%) and specificity (98.99%) and outperformed BLAST when mutation numbers increased. However, digital correlation processes were a hundred times slower than BLAST. We are currently starting an initiative to evaluate the correlation speed of a real experimental optical correlator. By doing this, we expect to fully exploit the light-speed properties of optical correlation. As the optical correlator works jointly with the computer, digital algorithms should also be optimized. The results presented in this paper are encouraging and support the study of image correlation methods for sequence alignment.
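A minimal sketch of the pixel-coding and correlation idea, using one assumed gray level per base and a mean-subtracted 2-D cross-correlation in place of the optical correlator (the gray values, image width, and scoring are not the paper's exact choices):

    import numpy as np
    from scipy.signal import correlate2d

    GRAY = {"A": 0.25, "C": 0.50, "G": 0.75, "T": 1.00}   # assumed gray levels, one per base

    def to_image(seq, width):
        """Code a sequence into a 2-D gray-level image, row by row (zero-padded)."""
        vals = [GRAY[b] for b in seq]
        rows = -(-len(vals) // width)
        img = np.zeros((rows, width))
        img.flat[:len(vals)] = vals
        return img

    database = "ACGTTGCAACGTACGTTGCA" * 5          # the "scene" (known sequence)
    query = "ACGTACGT"                             # the "object" to locate

    width = 20
    scene = to_image(database, width)
    obj = to_image(query, width=len(query))        # query fits on one row here

    corr = correlate2d(scene - scene.mean(), obj - obj.mean(), mode="valid")
    row, col = np.unravel_index(np.argmax(corr), corr.shape)
    print("best match starts at base", row * width + col)   # 8 for this example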
Automated extraction of chemical structure information from digital raster images
Park, Jungkap; Rosania, Gus R; Shedden, Kerby A; Nguyen, Mandee; Lyu, Naesung; Saitou, Kazuhiro
2009-01-01
Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links to scientific research articles. PMID:19196483
Petit, Pascal; Bicout, Dominique J; Persoons, Renaud; Bonneterre, Vincent; Barbeau, Damien; Maître, Anne
2017-05-01
Similar exposure groups (SEGs) are needed to reliably assess occupational exposures and health risks. However, the construction of SEGs can turn out to be rather challenging because of the multifactorial variability of exposures. The objective of this study is to put forward a semi-empirical approach developed to construct and implement an SEG database for exposure assessments. An occupational database of airborne levels of polycyclic aromatic hydrocarbons (PAHs) was used as an illustrative and working example. The approach that was developed consisted of four steps. The first three steps addressed the construction and implementation of the occupational Exporisq-HAP database (E-HAP). E-HAP was structured into three hierarchical levels of exposure groups, each of which was based on exposure determinants, along 16 dimensions that represented the sampled PAHs. A fourth step was implemented to identify and generate SEGs using the geometric standard deviation (GSD) of PAH concentrations. E-HAP was restructured into 16 (one per sampled PAH) 3 × 3 matrices: three hierarchical levels of description versus three degrees of dispersion, which included low (the SEG database: GSD ≤ 3), medium (3 < GSD ≤ 6), and high (GSD > 6). Benzo[a]pyrene (BaP) was the least dispersed particulate PAH, with 41.5% of groups that could be considered SEGs, 48.5% of groups of medium dispersion, and only 8% with high dispersion. These results were comparable for BaP, BaP toxic equivalent, or the sum of all carcinogenic PAHs, but were different when individual gaseous PAHs or ∑PAHG were chosen. Within the framework of risk assessment, such an approach, based on groundwork studies, allows for both the construction of an SEG database and the identification of exposure groups that require improvements in either the description level or the degree of homogeneity to qualify as SEGs. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
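A minimal sketch of the dispersion test applied in the fourth step: compute each candidate group's geometric standard deviation and assign the group to the low (SEG), medium, or high dispersion class using the thresholds quoted above (the example concentrations are invented):

    import math

    def gsd(values):
        """Geometric standard deviation of a list of positive concentrations."""
        logs = [math.log(v) for v in values]
        mean = sum(logs) / len(logs)
        var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)
        return math.exp(math.sqrt(var))

    def dispersion_class(values):
        g = gsd(values)
        if g <= 3:
            return "low (SEG)"
        elif g <= 6:
            return "medium"
        return "high"

    # Invented BaP air concentrations (ng/m3) for one candidate exposure group.
    group = [12.0, 18.5, 9.7, 25.0, 14.2, 30.1]
    print(round(gsd(group), 2), dispersion_class(group))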
A Global Digital Database and Atlas of Quaternary Dune Fields and Sand Seas
NASA Astrophysics Data System (ADS)
Lancaster, N.; Halfen, A. F.
2012-12-01
Sand seas and dune fields are globally significant sedimentary deposits, which archive the effects of climate and sea level change on a variety of temporal and spatial scales. Dune systems provide a valuable source of information on past climate conditions, including evidence for periods of aridity and unique data on past wind regimes. Researchers have compiled vast quantities of geomorphic and chronological data from these dune systems for nearly half a century; however, these data remain disconnected, making comparisons of dune systems challenging at global and regional scales. The primary goal of this project is to develop a global digital database of chronologic information for periods of desert sand dune accumulation and stabilization, as well as pertinent stratigraphic and geomorphic information. This database can then be used by scientists to 1) document the history of aeolian processes in arid regions, with emphasis on dune systems in low- and mid-latitude deserts, 2) correlate periods of sand accumulation and stability with other terrestrial and marine paleoclimatic proxies and records, and 3) develop an improved understanding of the response of dune systems to climate change. The database currently resides in Microsoft Access format, which allows searching and filtering of data. The database includes 4 linked tables containing information on the site, chronological control (radiocarbon or luminescence), and the pertinent literature citations. Thus far the database contains information for 838 sites worldwide, comprising 2,598 luminescence and radiocarbon ages, though these numbers increase regularly as new data are added. The database is available only on request at this time; however, an online GIS database is being developed and will be available in the near future. Data outputs from the online database will include PDF reports and Google Earth formatted datasets for quick viewing of data. Additionally, data will be available in a gridded format for wider use in data-model comparisons.
Minor, S.A.; Vick, G.S.; Carr, M.D.; Wahl, R.R.
1996-01-01
This map database, identified as Faults, lineaments, and earthquake epicenters digital map of the Pahute Mesa 30' x 60' quadrangle, Nevada, has been approved for release and publication by the Director of the USGS. Although this database has been subjected to rigorous review and is substantially complete, the USGS reserves the right to revise the data pursuant to further analysis and review. Furthermore, it is released on condition that neither the USGS nor the United States Government may be held liable for any damages resulting from its authorized or unauthorized use. This digital map compilation incorporates fault, air photo lineament, and earthquake epicenter data from within the Pahute Mesa 30' by 60' quadrangle, southern Nye County, Nevada (fig. 1). The compilation contributes to the U.S. Department of Energy's Yucca Mountain Project, established to determine whether or not the Yucca Mountain site is suitable for the disposal of high-level nuclear waste. Studies of local and regional faulting and earthquake activity, including the features depicted in this compilation, are carried out to help characterize seismic hazards and tectonic processes that may be relevant to the future stability of Yucca Mountain. The Yucca Mountain site is located in the central part of the Beatty 30' by 60' quadrangle approximately 15 km south of the south edge of the Pahute Mesa quadrangle (fig. 1). The U.S. Geological Survey participates in studies of the Yucca Mountain site under Interagency Agreement DE-AI08-78ET44802. The map compilation is only available online as a digital database in ARC/INFO ASCII (Generate) and export formats. The database can be downloaded via 'anonymous ftp' from a USGS system named greenwood.cr.usgs.gov (136.177.48.5). The files are located in a directory named /pub/open-file-reports/ofr-96-0262. This directory contains a text document named 'README.1ST' that contains database technical and explanatory documentation, including instructions for uncompressing the bundled (tar) file. In displaying the compilation it is important to note that the map data set is considered accurate when depicted at a scale of about 1:100,000; displaying the compilation at scales significantly larger than this may result in distortions and (or) mislocations of the data.
Choosing an Optimal Database for Protein Identification from Tandem Mass Spectrometry Data.
Kumar, Dhirendra; Yadav, Amit Kumar; Dash, Debasis
2017-01-01
Database searching is the preferred method for protein identification from digital spectra of mass to charge ratios (m/z) detected for protein samples through mass spectrometers. The search database is one of the major influencing factors in discovering proteins present in the sample and thus in deriving biological conclusions. In most cases the choice of search database is arbitrary. Here we describe common search databases used in proteomic studies and their impact on final list of identified proteins. We also elaborate upon factors like composition and size of the search database that can influence the protein identification process. In conclusion, we suggest that choice of the database depends on the type of inferences to be derived from proteomics data. However, making additional efforts to build a compact and concise database for a targeted question should generally be rewarding in achieving confident protein identifications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, Jennifer; Sandberg, Tami
The Wind-Wildlife Impacts Literature Database (WILD), formerly known as the Avian Literature Database, was created in 1997. The goal of the database was to begin tracking the research that detailed the potential impact of wind energy development on birds. The Avian Literature Database was originally housed on a proprietary platform called Livelink ECM from OpenText and maintained by in-house technical staff. The initial set of records was added by library staff. A vital part of the newly launched Drupal-based WILD database is the Bibliography module. Many of the resources included in the database have digital object identifiers (DOI). The bibliographic information for any item that has a DOI can be imported into the database using this module. This greatly reduces the amount of manual data entry required to add records to the database. The content available in WILD is international in scope, which can be easily discerned by looking at the tags available in the browse menu.
Database Design and Management in Engineering Optimization.
1988-02-01
... method in the mid-1950s along with modern digital computers, have made ... scientific and engineering applications. The paper highlights the difference ... is continuously redefined in an application program, DDL must have ... application software can call standard subroutines from the DBMS library to define ... operations ... type data usually encountered in engineering applications. GFDGT: computes the number of digits needed to display ... A user ...
Comparative Study Of Image Enhancement Algorithms For Digital And Film Mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgado-Gonzalez, A.; Sanmiguel, R. E.
2008-08-11
Here we discuss the application of edge enhancement algorithms to images obtained with a digital mammography system equipped with a selenium detector and, on the other hand, to images obtained from digitized film mammography. Comparative analysis of such images includes the study of technical aspects of image acquisition, storage, compression, and display. A protocol for a local database has been created as a result of this study.
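One widely used edge enhancement technique in this family is unsharp masking; the sketch below is a generic illustration (the blur width and gain are arbitrary, and this is not necessarily the specific algorithm the authors evaluated):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image, sigma=2.0, amount=1.5):
        """Sharpen edges by adding back the difference between the image and a blurred copy."""
        blurred = gaussian_filter(image.astype(float), sigma=sigma)
        sharpened = image + amount * (image - blurred)
        return np.clip(sharpened, 0, image.max())

    # Synthetic stand-in for a mammogram: a faint bright disc on a noisy background.
    rng = np.random.default_rng(1)
    img = rng.normal(100, 5, size=(256, 256))
    yy, xx = np.ogrid[:256, :256]
    img[(yy - 128) ** 2 + (xx - 128) ** 2 < 30 ** 2] += 20

    enhanced = unsharp_mask(img)
    print(enhanced.shape, float(enhanced.max()))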
2009-09-01
...uses an LTM-based global ocean climatology database called the Generalized Digital Environment Model (GDEM) in tactical decision aid (TDA) software, such...environment for USW planning. GDEM climatology is derived using temperature and salinity profiles from the Modular Ocean Data Assimilation System
Digital mining claim density map for federal lands in Idaho: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Idaho as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Digital mining claim density map for federal lands in Oregon: 1996
Hyndman, Paul C.; Campbell, Harry W.
1999-01-01
This report describes a digital map generated by the U.S. Geological Survey (USGS) to provide digital spatial mining claim density information for federal lands in Oregon as of March 1997. Mining claim data is earth science information deemed to be relevant to the assessment of historic, current, and future ecological, economic, and social systems. There is no paper map included in this Open-File report. In accordance with the Federal Land Policy and Management Act of 1976 (FLPMA), all unpatented mining claims, mill and tunnel sites must be recorded at the appropriate Bureau of Land Management (BLM) State office. BLM maintains a cumulative computer listing of mining claims in the Mining Claim Recordation System (MCRS) database with locations given by meridian, township, range, and section. A mining claim is considered closed when the claim is relinquished or a formal BLM decision declaring the mining claim null and void has been issued and the appeal period has expired. All other mining claims filed with BLM are considered to be open and actively held. The digital map (figure 1) and the mining claim density database available in this report are suitable for geographic information system (GIS)-based regional assessments at a scale of 1:100,000 or smaller.
Wu, Chueh-Yu; Lu, Jau-Ching; Liu, Man-Chi; Tung, Yi-Chung
2012-10-21
Microfluidic technology plays an essential role in various lab on a chip devices due to its desired advantages. An automated microfluidic system integrated with actuators and sensors can further achieve better controllability. A number of microfluidic actuation schemes have been well developed. In contrast, most of the existing sensing methods still heavily rely on optical observations and external transducers, which have drawbacks including: costly instrumentation, professional operation, tedious interfacing, and difficulties of scaling up and further signal processing. This paper reports the concept of electrofluidic circuits - electrical circuits which are constructed using ionic liquid (IL)-filled fluidic channels. The developed electrofluidic circuits can be fabricated using a well-developed multi-layer soft lithography (MSL) process with polydimethylsiloxane (PDMS) microfluidic channels. Electrofluidic circuits allow seamless integration of pressure sensors with analog and digital operation functions into microfluidic systems and provide electrical readouts for further signal processing. In the experiments, the analog operation device is constructed based on electrofluidic Wheatstone bridge circuits with electrical outputs of the addition and subtraction results of the applied pressures. The digital operation (AND, OR, and XOR) devices are constructed using the electrofluidic pressure controlled switches, and output electrical signals of digital operations of the applied pressures. The experimental results demonstrate the designed functions for analog and digital operations of applied pressures are successfully achieved using the developed electrofluidic circuits, making them promising to develop integrated microfluidic systems with capabilities of precise pressure monitoring and further feedback control for advanced lab on a chip applications.
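For context, the textbook output relation of a Wheatstone bridge with excitation voltage V_in and arm resistances R_1 through R_4 (one common labeling) is shown below; in an electrofluidic bridge the IL-filled channel resistances vary with the applied pressures, so the bridge output tracks pressure sums and differences. This is the generic relation, not the paper's specific circuit analysis:

    V_{\mathrm{out}} = V_{\mathrm{in}} \left( \frac{R_3}{R_3 + R_4} - \frac{R_2}{R_1 + R_2} \right)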
Construct validation of an interactive digital algorithm for ostomy care.
Beitz, Janice M; Gerlach, Mary A; Schafer, Vickie
2014-01-01
The purpose of this study was to evaluate construct validity for a previously face and content validated Ostomy Algorithm using digital real-life clinical scenarios. A cross-sectional, mixed-methods Web-based survey design study was conducted. Two hundred ninety-seven English-speaking RNs completed the study; participants practiced in both acute care and postacute settings, with 1 expert ostomy nurse (WOC nurse) and 2 nonexpert nurses. Following written consent, respondents answered demographic questions and completed a brief algorithm tutorial. Participants were then presented with 7 ostomy-related digital scenarios consisting of real-life photos and pertinent clinical information. Respondents used the 11 assessment components of the digital algorithm to choose management options. Participant written comments about the scenarios and the research process were collected. The mean overall percentage of correct responses was 84.23%. Mean percentage of correct responses for respondents with a self-reported basic ostomy knowledge was 87.7%; for those with a self-reported intermediate ostomy knowledge was 85.88% and those who were self-reported experts in ostomy care achieved 82.77% correct response rate. Five respondents reported having no prior ostomy care knowledge at screening and achieved an overall 45.71% correct response rate. No negative comments regarding the algorithm were recorded by participants. The new standardized Ostomy Algorithm remains the only face, content, and construct validated digital clinical decision instrument currently available. Further research on application at the bedside while tracking patient outcomes is warranted.
Digital pathology in nephrology clinical trials, research, and pathology practice.
Barisoni, Laura; Hodgin, Jeffrey B
2017-11-01
In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.
Miller, David M.; Bedford, David R.
2000-01-01
This geologic map database for the El Mirage Lake area describes geologic materials for the dry lake, parts of the adjacent Shadow Mountains and Adobe Mountain, and much of the piedmont extending south from the lake upward toward the San Gabriel Mountains. This area lies within the western Mojave Desert of San Bernardino and Los Angeles Counties, southeastern California. The area is traversed by a few paved highways that service the community of El Mirage, and by numerous dirt roads that lead to outlying properties. An off-highway vehicle area established by the Bureau of Land Management encompasses the dry lake and much of the land north and east of the lake. The physiography of the area consists of the dry lake, flanking mud and sand flats and alluvial piedmonts, and a few sharp craggy mountains. This digital geologic map database, intended for use at 1:24,000 scale, describes and portrays the rock units and surficial deposits of the El Mirage Lake area. The map database was prepared to aid in a water-resource assessment of the area by providing surface geologic information with which deeper groundwater-bearing units may be understood. The area mapped covers the Shadow Mountains SE and parts of the Shadow Mountains, Adobe Mountain, and El Mirage 7.5-minute quadrangles. The map includes detailed geology of surface and bedrock deposits, which represent a significant update from previous bedrock geologic maps by Dibblee (1960) and Troxel and Gunderson (1970), and the surficial geologic map of Ponti and Burke (1980); it incorporates a fringe of the detailed bedrock mapping in the Shadow Mountains by Martin (1992). The map data were assembled as a digital database using ARC/INFO to enable wider applications than traditional paper-product geologic maps and to provide for efficient meshing with other digital databases prepared by the U.S. Geological Survey's Southern California Areal Mapping Project.
Mutagenicity and carcinogenicity databases are crucial resources for toxicologists and regulators involved in chemicals risk assessment. Until recently, existing public toxicity databases have been constructed primarily as "look-up-tables" of existing data, and most often did no...
Hydrologic Derivatives for Modeling and Analysis—A new global high-resolution database
Verdin, Kristine L.
2017-07-17
The U.S. Geological Survey has developed a new global high-resolution hydrologic derivative database. Loosely modeled on the HYDRO1k database, this new database, entitled Hydrologic Derivatives for Modeling and Analysis, provides comprehensive and consistent global coverage of topographically derived raster layers (digital elevation model data, flow direction, flow accumulation, slope, and compound topographic index) and vector layers (streams and catchment boundaries). The coverage of the data is global, and the underlying digital elevation model is a hybrid of three datasets: HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales), GMTED2010 (Global Multi-resolution Terrain Elevation Data 2010), and the SRTM (Shuttle Radar Topography Mission). For most of the globe south of 60°N., the raster resolution of the data is 3 arc-seconds, corresponding to the resolution of the SRTM. For the areas north of 60°N., the resolution is 7.5 arc-seconds (the highest resolution of the GMTED2010 dataset) except for Greenland, where the resolution is 30 arc-seconds. The streams and catchments are attributed with Pfafstetter codes, based on a hierarchical numbering system, that carry important topological information. This database is appropriate for use in continental-scale modeling efforts. The work described in this report was conducted by the U.S. Geological Survey in cooperation with the National Aeronautics and Space Administration Goddard Space Flight Center.
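As an example of one derived layer, the compound topographic index (topographic wetness index) is conventionally computed per cell from flow accumulation and slope; a minimal sketch under the usual definition (the cell size and slope floor are illustrative choices, not necessarily those used to build this database):

    import numpy as np

    def compound_topographic_index(flow_accum_cells, slope_deg, cell_size_m=90.0):
        """CTI = ln(specific catchment area / tan(slope)), with a slope floor to avoid division by zero."""
        specific_catchment_area = (flow_accum_cells + 1) * cell_size_m   # upslope area per unit contour length
        slope_rad = np.radians(np.maximum(slope_deg, 0.1))               # floor of 0.1 degree (assumed)
        return np.log(specific_catchment_area / np.tan(slope_rad))

    flow_accum = np.array([[0, 5], [120, 4000]], dtype=float)   # number of cells draining into each cell
    slope = np.array([[8.0, 3.5], [1.2, 0.4]])                  # degrees
    print(compound_topographic_index(flow_accum, slope))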
Turning Access into a web-enabled secure information system for clinical trials.
Dongquan Chen; Chen, Wei-Bang; Soong, Mayhue; Soong, Seng-Jaw; Orthner, Helmuth F
2009-08-01
Organizations that have limited resources need to conduct clinical studies in a cost-effective but secure way. Clinical data residing in various individual databases need to be easily accessed and secured. Although widely available, digital certificates, encryption, and secure web servers have not been implemented as widely, partly due to a lack of understanding of needs and to concerns over issues such as cost and difficulty of implementation. The objective of this study was to test the possibility of centralizing various databases and to demonstrate ways of offering an alternative to a large-scale, comprehensive, and costly commercial product, especially for simple phase I and II trials, with reasonable convenience and security. We report a working procedure to transform and develop a standalone Access database into a secure Web-based information system. For data collection and reporting purposes, we centralized several individual databases and developed and tested a web-based secure server using self-issued digital certificates. The system lacks audit trails. The cost of development and maintenance may hinder its wide application. The clinical trial databases scattered in various departments of an institution could be centralized into a web-enabled secure information system. Limitations such as the lack of a calendar and audit trail can be partially addressed with additional programming. The centralized Web system may provide an alternative to a comprehensive clinical trial management system.
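A minimal sketch of serving a web front end over TLS with a self-issued certificate, similar in spirit to the setup described (the certificate and key file paths are placeholders; creating the self-signed certificate itself happens outside this script):

    import http.server
    import ssl

    # Placeholder paths to a self-issued certificate and private key (assumed to exist already).
    CERT_FILE = "server.crt"
    KEY_FILE = "server.key"

    handler = http.server.SimpleHTTPRequestHandler
    httpd = http.server.HTTPServer(("0.0.0.0", 8443), handler)

    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile=CERT_FILE, keyfile=KEY_FILE)
    httpd.socket = context.wrap_socket(httpd.socket, server_side=True)

    print("Serving HTTPS on port 8443 ...")
    httpd.serve_forever()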
Digital Technologies and Pedagogies.
ERIC Educational Resources Information Center
Weis, Tracey M.; Benmayor, Rina; O'Leary, Cecilia; Eynon, Bret
2002-01-01
Shares four college professors' experiences using new media to change approaches to teaching and learning. In their classes, students conduct archival research on African American history in Web-based sites, then construct collaborative interpretations in PowerPoint; incorporate digital storytelling (within a Latina Life Stories class); construct…
Student Engagement through Digital Data
ERIC Educational Resources Information Center
Gross, Liz; Meriwether, Jason L.
2016-01-01
This chapter suggests strategies and tools for student affairs professionals to leverage digital data to measure student engagement and learning outcomes, and refine programs that enhance institutional reputation and improve student persistence. The construct of student engagement is traced from its theoretical origins to recent research…
Photocopy of photograph (digital image located in LBNL Photo Lab ...
Photocopy of photograph (digital image located in LBNL Photo Lab Collection, XBD200503-00117-046). March 2005. ROOF SHIELDING BLOCK AND I-BEAM SUPPORT CONSTRUCTION, CENTER OF BEVATRON - University of California Radiation Laboratory, Bevatron, 1 Cyclotron Road, Berkeley, Alameda County, CA
NASA Astrophysics Data System (ADS)
Pastukhov, A. V.; Kaverin, D. A.; Shchanov, V. M.
2016-09-01
A digital map of soil carbon pools was created for the forest-tundra ecotone in the Usa River basin with the use of ERDAS Imagine 2014 and ArcGIS 10.2 software. Supervised classification and thematic interpretation of satellite images and digital terrain models with the use of a georeferenced database on soil profiles were applied. Expert assessment of the natural diversity and representativeness of random samples for different soil groups was performed, and the minimal necessary size of the statistical sample was determined.
NASA Technical Reports Server (NTRS)
Dunst, Ben
2011-01-01
The height at which smoke from a wildfire is injected into the atmosphere is an important parameter for climatology, because it determines how far the smoke can be transported. Using the MINX program to analyze MISR (Multi-angle Imaging Spectro-Radiometer) data, I digitized wildfire smoke plumes to add to an existing database of these heights for use by scientists studying smoke transport and plume dynamics. In addition to using MINX to do production digitizing of heights, I assisted in gathering lidar data for an ongoing validation of MINX and helped evaluate those data.
NASA Astrophysics Data System (ADS)
Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.
2018-03-01
The strategy of quality assurance for electronics is regarded as most important. To ensure quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is distinguished by simple database support for design-for-manufacturing, intended for future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. MatLab modeling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the process sequence, from individual processes up to the whole life cycle.
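A minimal sketch of modeling a process sequence as a Markov chain, in which each stage either passes the unit forward, returns it for rework, or scraps it; the states and transition probabilities below are invented for illustration and are not taken from the paper:

    import numpy as np

    # States: 0 = design, 1 = fabrication, 2 = test, 3 = shipped (absorbing), 4 = scrap (absorbing).
    # Transition probabilities are invented for illustration only.
    P = np.array([
        [0.00, 0.95, 0.00, 0.00, 0.05],   # design -> fabrication or scrap
        [0.05, 0.00, 0.90, 0.00, 0.05],   # fabrication -> test, back to design (rework), or scrap
        [0.00, 0.10, 0.00, 0.85, 0.05],   # test -> shipped, back to fabrication, or scrap
        [0.00, 0.00, 0.00, 1.00, 0.00],   # shipped (absorbing)
        [0.00, 0.00, 0.00, 0.00, 1.00],   # scrap (absorbing)
    ])

    # Absorption probabilities: B = (I - Q)^-1 R, where Q is transient-to-transient and R transient-to-absorbing.
    Q, R = P[:3, :3], P[:3, 3:]
    B = np.linalg.solve(np.eye(3) - Q, R)
    print(f"probability a unit starting at design is eventually shipped: {B[0, 0]:.3f}")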
Geologic map and map database of the Palo Alto 30' x 60' quadrangle, California
Brabb, E.E.; Jones, D.L.; Graymer, R.W.
2000-01-01
This digital map database, compiled from previously published and unpublished data, and new mapping by the authors, represents the general distribution of bedrock and surficial deposits in the mapped area. Together with the accompanying text file (pamf.ps, pamf.pdf, pamf.txt), it provides current information on the geologic structure and stratigraphy of the area covered. The database delineates map units that are identified by general age and lithology following the stratigraphic nomenclature of the U.S. Geological Survey. The scale of the source maps limits the spatial resolution (scale) of the database to 1:62,500 or smaller.