Modernization and multiscale databases at the U.S. Geological Survey
Morrison, J.L.
1992-01-01
The U.S. Geological Survey (USGS) has begun a digital cartographic modernization program. Keys to that program are the creation of a multiscale database, a feature-based file structure that is derived from a spatial data model, and a series of "templates" or rules that specify the relationships between instances of entities in reality and features in the database. The database will initially hold data collected from the USGS standard map products at scales of 1:24,000, 1:100,000, and 1:2,000,000. The spatial data model is called the digital line graph-enhanced model, and the comprehensive rule set consists of collection rules, product generation rules, and conflict resolution rules. This modernization program will affect the USGS mapmaking process because both digital and graphic products will be created from the database. In addition, non-USGS map users will have more flexibility in uses of the databases. These remarks are those of the session discussant made in response to the six papers and the keynote address given in the session. © 1992.
Modernization and new technologies: Coping with the information explosion
NASA Technical Reports Server (NTRS)
Blados, Walter R.; Cotter, Gladys A.
1993-01-01
Information has become a valuable and strategic resource in all societies and economies. Scientific and technical information is especially important in developing and maintaining a strong national science and technology base. The expanding use of information technology, the growth of interdisciplinary research, and an increase in international collaboration are changing characteristics of information. In response, the NASA Scientific and Technical Information (STI) Program has initiated a modernization program. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls'. Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected because of resource limitations; international cooperation would allow resources to be used efficiently to cover expanded sources of information. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative, one that can optimize the cost/benefit equation for all participants.
Modernization and unification: Strategic goals for NASA STI program
NASA Technical Reports Server (NTRS)
Blados, W.; Cotter, Gladys A.
1993-01-01
Information is increasingly becoming a strategic resource in all societies and economies. The NASA Scientific and Technical Information (STI) Program has initiated a modernization program to address the strategic importance and changing characteristics of information. This modernization effort applies new technology to current processes to provide near-term benefits to the user. At the same time, we are developing a long-term modernization strategy designed to transition the program to a multimedia, global 'library without walls.' Notwithstanding this modernization program, it is recognized that no one information center can hope to collect all the relevant data. We see information and information systems changing and becoming more international in scope. We are finding that many nations are expending resources on national systems which duplicate each other. At the same time that this duplication exists, many useful sources of aerospace information are not being collected because of resource limitations. If nations cooperate to develop an international aerospace information system, resources can be used efficiently to cover expanded sources of information. We must consider forming a coalition to collect and provide access to disparate, multidisciplinary sources of information, and to develop standardized tools for documenting and manipulating this data and information. In view of recent technological developments in information science and technology, as well as the reality of scarce resources in all nations, it is time to explore the mutually beneficial possibilities offered by cooperation and international resource sharing. International resources need to be mobilized in a coordinated manner to move us towards this goal. This paper reviews the NASA modernization program and raises for consideration new possibilities for unification of the various aerospace database efforts toward a cooperative international aerospace database initiative that can optimize the cost/benefit equation for all participants.
Development of expert systems for analyzing electronic documents
NASA Astrophysics Data System (ADS)
Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.
2018-05-01
The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment; it consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University, Faculty of Innovative Technology.
NASA Astrophysics Data System (ADS)
Zyelyk, Ya. I.; Semeniv, O. V.
2015-12-01
The state of the problem of post-launch calibration of satellite electro-optical remote sensors, and its solutions in Ukraine, is analyzed. The database is improved, and dynamic services for user interaction with the database from within the open geographic information system Quantum GIS are created to provide information support for calibration activities. A dynamic application under QGIS is developed that implements these services for entering, editing and extracting data from the database, using object-oriented programming and modern program design patterns. The functional and algorithmic support of this dynamic software and its interface are developed.
NASA Astrophysics Data System (ADS)
Hayrapetyan, David B.; Hovhannisyan, Levon; Mantashyan, Paytsar A.
2013-04-01
The analysis of complex spectra is a pressing problem for modern science. The work is devoted to the creation of a software package that analyzes spectra in different formats, possesses a dynamic knowledge database and a self-learning mechanism, and performs automated analysis of spectral composition based on the knowledge database by applying certain algorithms. Hyper-spherical random search algorithms, gradient algorithms and genetic search algorithms were used as the search systems in the software package. Raman and IR spectra of diamond-like carbon (DLC) samples were analyzed with the developed program. After processing the data, the program immediately displays all the calculated parameters of the DLC.
NASA Astrophysics Data System (ADS)
Dornback, M.; Hourigan, T.; Etnoyer, P.; McGuinn, R.; Cross, S. L.
2014-12-01
Research on deep-sea corals has expanded rapidly over the last two decades, as scientists began to realize their value as long-lived structural components of high biodiversity habitats and archives of environmental information. The NOAA Deep Sea Coral Research and Technology Program's National Database for Deep-Sea Corals and Sponges is a comprehensive resource for georeferenced data on these organisms in U.S. waters. The National Database currently includes more than 220,000 deep-sea coral records representing approximately 880 unique species. Database records from museum archives, commercial and scientific bycatch, and from journal publications provide baseline information with relatively coarse spatial resolution dating back as far as 1842. These data are complemented by modern, in-situ submersible observations with high spatial resolution, from surveys conducted by NOAA and NOAA partners. Management of high volumes of modern high-resolution observational data can be challenging. NOAA is working with our data partners to incorporate this occurrence data into the National Database, along with images and associated information related to geoposition, time, biology, taxonomy, environment, provenance, and accuracy. NOAA is also working to link associated datasets collected by our program's research, to properly archive them at the NOAA National Data Centers, to build a robust metadata record, and to establish a standard protocol to simplify the process. Access to the National Database is provided through an online mapping portal. The map displays point-based records from the database. Records can be refined by taxon, region, time, and depth. The queries and extent used to view the map can also be used to download subsets of the database. The database, map, and website are already in use by NOAA, regional fishery management councils, and regional ocean planning bodies, but we envision it as a model that can expand to accommodate data on a global scale.
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
36 CFR 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
36 CFR § 1225.24 - When can an agency apply previously approved schedules to electronic records?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION RECORDS MANAGEMENT SCHEDULING RECORDS § 1225.24 When... must notify the National Archives and Records Administration, Modern Records Programs (NWM), 8601... authority reference; and (v) Format of the records (e.g., database, scanned images, digital photographs, etc...
García, Manuel Pérez
2011-01-01
In this article, the author explains how the support of new technologies has helped historians to develop their research over the last few decades. The author therefore summarizes the application of both database and genealogical programs to southern European family studies as a methodological tool. First, the author establishes the importance of creating databases with the FileMaker program, and then explains the value of using genealogical programs such as GenoPro and Heredis. The main aim of this article is to detail the use of these new technologies as applied to a particular study of southern Europe, specifically the Crown of Castile, during the late modern period. The use of these computer programs has helped to develop the social sciences and family history, in particular social history, during the last decade.
Modernization of the NASA scientific and technical information program
NASA Technical Reports Server (NTRS)
Cotter, Gladys A.; Hunter, Judy F.; Ostergaard, K.
1993-01-01
The NASA Scientific and Technical Information Program utilizes a technology infrastructure assembled in the mid 1960s to late 1970s to process and disseminate its information products. When this infrastructure was developed it placed NASA as a leader in processing STI. The retrieval engine for the STI database was the first of its kind and was used as the basis for developing commercial, other U.S., and foreign government agency retrieval systems. Due to the combination of changes in user requirements and the tremendous increase in technological capabilities readily available in the marketplace, this infrastructure is no longer the most cost-effective or efficient methodology available. Consequently, the NASA STI Program is pursuing a modernization effort that applies new technology to current processes to provide near-term benefits to the user. In conjunction with this activity, we are developing a long-term modernization strategy designed to transition the Program to a multimedia, global 'library without walls.' Critical pieces of the long-term strategy include streamlining access to sources of STI by using advances in computer networking and graphical user interfaces; creating and disseminating technical information in various electronic media including optical disks, video, and full text; and establishing a Technology Focus Group to maintain a current awareness of emerging technology and to plan for the future.
Fiacco, P. A.; Rice, W. H.
1991-01-01
Computerized medical record systems require structured database architectures for information processing. However, the data must be able to be transferred across heterogeneous platforms and software systems. Client-server architecture allows for distributed processing of information among networked computers and provides the flexibility needed to link diverse systems together effectively. We have incorporated this client-server model with a graphical user interface into an outpatient medical record system, known as SuperChart, for the Department of Family Medicine at SUNY Health Science Center at Syracuse. SuperChart was developed using SuperCard and Oracle. SuperCard uses modern object-oriented programming to support a hypermedia environment. Oracle is a powerful relational database management system that incorporates a client-server architecture. This provides both a distributed database and distributed processing, which improves performance. PMID:1807732
Reasoning on Weighted Delegatable Authorizations
NASA Astrophysics Data System (ADS)
Ruan, Chun; Varadharajan, Vijay
This paper studies logic based methods for representing and evaluating complex access control policies needed by modern database applications. In our framework, authorization and delegation rules are specified in a Weighted Delegatable Authorization Program (WDAP) which is an extended logic program. We show how extended logic programs can be used to specify complex security policies which support weighted administrative privilege delegation, weighted positive and negative authorizations, and weighted authorization propagations. We also propose a conflict resolution method that enables flexible delegation control by considering priorities of authorization grantors and weights of authorizations. A number of rules are provided to achieve delegation depth control, conflict resolution, and authorization and delegation propagations.
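For illustration only, the following Python sketch mimics the kind of conflict resolution the paper describes, resolving competing grants and denials by grantor priority and then by authorization weight. The class fields, the tie-breaking order, and the "denial wins" fallback are assumptions of this sketch, not the paper's logic-program (WDAP) semantics.

```python
from dataclasses import dataclass

@dataclass
class Authorization:
    subject: str
    obj: str
    right: str
    sign: int               # +1 grant, -1 denial
    weight: float           # strength of the authorization
    grantor_priority: int   # priority of the grantor in the delegation chain

def resolve(auths):
    """Resolve conflicting authorizations for one (subject, object, right) triple.

    Illustrative policy only: prefer the grantor with the highest priority;
    among those, prefer the largest weight; if a grant and a denial still tie,
    fall back to denial ("denial takes precedence").
    """
    if not auths:
        return None
    top_priority = max(a.grantor_priority for a in auths)
    candidates = [a for a in auths if a.grantor_priority == top_priority]
    top_weight = max(a.weight for a in candidates)
    candidates = [a for a in candidates if a.weight == top_weight]
    # Denial wins a residual tie between a grant and a denial of equal strength.
    return min(candidates, key=lambda a: a.sign)

if __name__ == "__main__":
    auths = [
        Authorization("alice", "report.pdf", "read", +1, weight=0.8, grantor_priority=1),
        Authorization("alice", "report.pdf", "read", -1, weight=0.6, grantor_priority=2),
    ]
    print(resolve(auths))  # the denial wins: its grantor has higher priority
```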
Database Design and Management in Engineering Optimization.
1988-02-01
... scientific and engineering applications. The paper highlights the difference ... method in the mid-1950s, along with modern digital computers, have made ... is continuously redefined in an application program, DDL must have ... [applica]tion software can call standard subroutines from the DBMS library to define ... operations ... type data usually encountered in engineering applications. GFDGT: Computes the number of digits needed to display ... A user
TOXCAST, A TOOL FOR CATEGORIZATION AND ...
Across several EPA Program Offices (e.g., OPPTS, OW, OAR), there is a clear need to develop strategies and methods to screen large numbers of chemicals for potential toxicity, and to use the resulting information to prioritize the use of testing resources towards those entities and endpoints that present the greatest likelihood of risk to human health and the environment. This need could be addressed using the experience of the pharmaceutical industry in the use of advanced modern molecular biology and computational chemistry tools for the development of new drugs, with appropriate adjustment to the needs and desires of environmental toxicology. A conceptual approach named ToxCast has been developed to address the needs of EPA Program Offices in the area of prioritization and screening. Modern computational chemistry and molecular biology tools bring enabling technologies forward that can provide information about the physical and biological properties of large numbers of chemicals. The essence of the proposal is to conduct a demonstration project based upon a rich toxicological database (e.g., registered pesticides, or the chemicals tested in the NTP bioassay program), select a fairly large number (50-100 or more chemicals) representative of a number of differing structural classes and phenotypic outcomes (e.g., carcinogens, reproductive toxicants, neurotoxicants), and evaluate them across a broad spectrum of information domains that modern technology has pro
New DMSP database of precipitating auroral electrons and ions
NASA Astrophysics Data System (ADS)
Redmon, Robert J.; Denig, William F.; Kilcommons, Liam M.; Knipp, Delores J.
2017-08-01
Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so have the measurement capabilities such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) to F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information and the Coordinated Data Analysis Web. We describe how the new database is being applied to high-latitude studies of the colocation of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), while being the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate complex challenges posed by transparent access to different computing environments, scalability of algorithms, and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has led to a huge production of data, whose warehouse management, together with the need to optimize analysis and mining procedures, has changed the way modern science is conceived. Classical data exploration, based on the user's own local data storage and limited computing infrastructure, is no longer efficient for MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e., of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools based on Web 2.0 technologies and machine learning methodology: tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering to a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.
SS-Wrapper: a package of wrapper applications for similarity searches on Linux clusters.
Wang, Chunlin; Lefkowitz, Elliot J
2004-10-28
Large-scale sequence comparison is a powerful tool for biological inference in modern molecular biology. Comparing new sequences to those in annotated databases is a useful source of functional and structural information about these sequences. Using software such as the basic local alignment search tool (BLAST) or HMMPFAM to identify statistically significant matches between newly sequenced segments of genetic material and those in databases is an important task for most molecular biologists. Searching algorithms are intrinsically slow and data-intensive, especially in light of the rapid growth of biological sequence databases due to the emergence of high throughput DNA sequencing techniques. Thus, traditional bioinformatics tools are impractical on PCs and even on dedicated UNIX servers. To take advantage of larger databases and more reliable methods, high performance computation becomes necessary. We describe the implementation of SS-Wrapper (Similarity Search Wrapper), a package of wrapper applications that can parallelize similarity search applications on a Linux cluster. Our wrapper utilizes a query segmentation-search (QS-search) approach to parallelize sequence database search applications. It takes into consideration load balancing between each node on the cluster to maximize resource usage. QS-search is designed to wrap many different search tools, such as BLAST and HMMPFAM using the same interface. This implementation does not alter the original program, so newly obtained programs and program updates should be accommodated easily. Benchmark experiments using QS-search to optimize BLAST and HMMPFAM showed that QS-search accelerated the performance of these programs almost linearly in proportion to the number of CPUs used. We have also implemented a wrapper that utilizes a database segmentation approach (DS-BLAST) that provides a complementary solution for BLAST searches when the database is too large to fit into the memory of a single node. Used together, QS-search and DS-BLAST provide a flexible solution to adapt sequential similarity searching applications in high performance computing environments. Their ease of use and their ability to wrap a variety of database search programs provide an analytical architecture to assist both the seasoned bioinformaticist and the wet-bench biologist.
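As a rough illustration of the query segmentation-search (QS-search) idea, the Python sketch below splits a multi-FASTA query file into per-node chunks balanced by total residue count; each chunk would then be searched independently (e.g., with BLAST or HMMPFAM) and the outputs merged. The greedy balancing heuristic and file layout are assumptions of this sketch, not the SS-Wrapper implementation.

```python
from pathlib import Path

def read_fasta(path):
    """Yield (header, sequence) records from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.rstrip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line, []
            elif line:
                seq.append(line)
    if header is not None:
        yield header, "".join(seq)

def segment_queries(fasta, n_chunks, outdir):
    """Split queries into n_chunks FASTA files, balancing total residues per chunk."""
    outdir = Path(outdir)
    outdir.mkdir(parents=True, exist_ok=True)
    loads = [0] * n_chunks
    chunks = [[] for _ in range(n_chunks)]
    # Greedy balancing: assign each query (longest first) to the least-loaded chunk.
    for header, seq in sorted(read_fasta(fasta), key=lambda rec: -len(rec[1])):
        i = loads.index(min(loads))
        chunks[i].append((header, seq))
        loads[i] += len(seq)
    paths = []
    for i, records in enumerate(chunks):
        path = outdir / f"chunk_{i}.fasta"
        path.write_text("".join(f"{h}\n{s}\n" for h, s in records))
        paths.append(path)
    return paths

# Each chunk file would then be submitted to one cluster node via the local
# scheduler, and the per-chunk search outputs concatenated afterwards.
```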
SS-Wrapper: a package of wrapper applications for similarity searches on Linux clusters
Wang, Chunlin; Lefkowitz, Elliot J
2004-01-01
Background Large-scale sequence comparison is a powerful tool for biological inference in modern molecular biology. Comparing new sequences to those in annotated databases is a useful source of functional and structural information about these sequences. Using software such as the basic local alignment search tool (BLAST) or HMMPFAM to identify statistically significant matches between newly sequenced segments of genetic material and those in databases is an important task for most molecular biologists. Searching algorithms are intrinsically slow and data-intensive, especially in light of the rapid growth of biological sequence databases due to the emergence of high throughput DNA sequencing techniques. Thus, traditional bioinformatics tools are impractical on PCs and even on dedicated UNIX servers. To take advantage of larger databases and more reliable methods, high performance computation becomes necessary. Results We describe the implementation of SS-Wrapper (Similarity Search Wrapper), a package of wrapper applications that can parallelize similarity search applications on a Linux cluster. Our wrapper utilizes a query segmentation-search (QS-search) approach to parallelize sequence database search applications. It takes into consideration load balancing between each node on the cluster to maximize resource usage. QS-search is designed to wrap many different search tools, such as BLAST and HMMPFAM using the same interface. This implementation does not alter the original program, so newly obtained programs and program updates should be accommodated easily. Benchmark experiments using QS-search to optimize BLAST and HMMPFAM showed that QS-search accelerated the performance of these programs almost linearly in proportion to the number of CPUs used. We have also implemented a wrapper that utilizes a database segmentation approach (DS-BLAST) that provides a complementary solution for BLAST searches when the database is too large to fit into the memory of a single node. Conclusions Used together, QS-search and DS-BLAST provide a flexible solution to adapt sequential similarity searching applications in high performance computing environments. Their ease of use and their ability to wrap a variety of database search programs provide an analytical architecture to assist both the seasoned bioinformaticist and the wet-bench biologist. PMID:15511296
NOAA Data Rescue of Key Solar Databases and Digitization of Historical Solar Images
NASA Astrophysics Data System (ADS)
Coffey, H. E.
2006-08-01
Over a number of years, the staff at NOAA National Geophysical Data Center (NGDC) has worked to rescue key solar databases by converting them to digital format and making them available via the World Wide Web. NOAA has had several data rescue programs where staff compete for funds to rescue important and critical historical data that are languishing in archives and at risk of being lost due to deteriorating condition, loss of any metadata or descriptive text that describe the databases, lack of interest or funding in maintaining databases, etc. The Solar-Terrestrial Physics Division at NGDC was able to obtain funds to key in some critical historical tabular databases. Recently the NOAA Climate Database Modernization Program (CDMP) funded a project to digitize historical solar images, producing a large online database of historical daily full disk solar images. The images include the wavelengths Calcium K, Hydrogen Alpha, and white light photos, as well as sunspot drawings and the comprehensive drawings of a multitude of solar phenomena on one daily map (Fraunhofer maps and Wendelstein drawings). Included in the digitization are high resolution solar H-alpha images taken at the Boulder Solar Observatory 1967-1984. The scanned daily images document many phases of solar activity, from decadal variation to rotational variation to daily changes. Smaller versions are available online. Larger versions are available by request. See http://www.ngdc.noaa.gov/stp/SOLAR/ftpsolarimages.html. The tabular listings and solar imagery will be discussed.
Energy Supply Options for Modernizing Army Heating Systems
1999-01-01
Army Regulation (AR) 420-49, Heating, Energy Selection and Fuel Storage, Distribution, and Dispensing Systems and Technical Manual (TM) 5-650...analysis. HEATMAP uses the AutoLISP program in AutoCAD to take the graphical input to populate a Microsoft® Access database in...of 1992, Subtitle F, Federal Agency Energy Management. Technical Manual (TM) 5-650, Repairs and Utilities: Central Boiler Plants (HQDA, 13 October
SoyFN: a knowledge database of soybean functional networks.
Xu, Yungang; Guo, Maozu; Liu, Xiaoyan; Wang, Chunyu; Liu, Yang
2014-01-01
Many databases for soybean genomic analysis have been built and made publicly available, but few of them contain knowledge specifically targeting the omics-level gene-gene, gene-microRNA (miRNA) and miRNA-miRNA interactions. Here, we present SoyFN, a knowledge database of soybean functional gene networks and miRNA functional networks. SoyFN provides user-friendly interfaces to retrieve, visualize, analyze and download the functional networks of soybean genes and miRNAs. In addition, it incorporates much information about KEGG pathways, gene ontology annotations and 3'-UTR sequences as well as many useful tools including SoySearch, ID mapping, Genome Browser, eFP Browser and promoter motif scan. SoyFN is a schema-free database that can be accessed as a Web service from any modern programming language using a simple Hypertext Transfer Protocol call. The Web site is implemented in Java, JavaScript, PHP, HTML and Apache, with all major browsers supported. We anticipate that this database will be useful for members of research communities both in soybean experimental science and bioinformatics. Database URL: http://nclab.hit.edu.cn/SoyFN.
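Because the abstract describes SoyFN as reachable through a simple HTTP call from any modern programming language, a minimal Python sketch of such a call is shown below. The endpoint path and parameter name are placeholders, not documented SoyFN routes; the base URL is the one given in the abstract.

```python
import requests

BASE_URL = "http://nclab.hit.edu.cn/SoyFN"  # database URL from the abstract

def fetch_gene_network(gene_id):
    """Request a gene's functional-network record over plain HTTP.

    The '/api/network' route and 'gene' parameter are illustrative placeholders;
    consult the SoyFN web site for the actual service endpoints.
    """
    response = requests.get(f"{BASE_URL}/api/network",
                            params={"gene": gene_id}, timeout=30)
    response.raise_for_status()
    return response.json()
```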
CRAVE: a database, middleware and visualization system for phenotype ontologies.
Gkoutos, Georgios V; Green, Eain C J; Greenaway, Simon; Blake, Andrew; Mallon, Ann-Marie; Hancock, John M
2005-04-01
A major challenge in modern biology is to link genome sequence information to organismal function. In many organisms this is being done by characterizing phenotypes resulting from mutations. Efficiently expressing phenotypic information requires combinatorial use of ontologies. However tools are not currently available to visualize combinations of ontologies. Here we describe CRAVE (Concept Relation Assay Value Explorer), a package allowing storage, active updating and visualization of multiple ontologies. CRAVE is a web-accessible JAVA application that accesses an underlying MySQL database of ontologies via a JAVA persistent middleware layer (Chameleon). This maps the database tables into discrete JAVA classes and creates memory resident, interlinked objects corresponding to the ontology data. These JAVA objects are accessed via calls through the middleware's application programming interface. CRAVE allows simultaneous display and linking of multiple ontologies and searching using Boolean and advanced searches.
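CRAVE itself is a Java application whose Chameleon middleware maps MySQL tables into memory-resident, interlinked Java objects. The Python/SQLite sketch below only illustrates that general pattern under invented table and column names; it is not the CRAVE schema or API.

```python
import sqlite3

class Term:
    """In-memory node corresponding to one row of an ontology 'term' table."""
    def __init__(self, term_id, name):
        self.term_id, self.name = term_id, name
        self.children = []  # links filled in from the relation table

def load_ontology(conn):
    """Map relational rows into interlinked objects (the middleware pattern)."""
    terms = {tid: Term(tid, name)
             for tid, name in conn.execute("SELECT id, name FROM term")}
    for parent, child in conn.execute("SELECT parent_id, child_id FROM is_a"):
        terms[parent].children.append(terms[child])
    return terms

# Demo with an in-memory SQLite stand-in for the MySQL backend.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE term (id INTEGER, name TEXT);
    CREATE TABLE is_a (parent_id INTEGER, child_id INTEGER);
    INSERT INTO term VALUES (1, 'phenotype'), (2, 'behavioural phenotype');
    INSERT INTO is_a VALUES (1, 2);
""")
ontology = load_ontology(conn)
print([c.name for c in ontology[1].children])  # ['behavioural phenotype']
```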
LARCRIM user's guide, version 1.0
NASA Technical Reports Server (NTRS)
Davis, John S.; Heaphy, William J.
1993-01-01
LARCRIM is a relational database management system (RDBMS) which performs the conventional duties of an RDBMS with the added feature that it can store attributes which consist of arrays or matrices. This makes it particularly valuable for scientific data management. It is accessible as a stand-alone system and through an application program interface. The stand-alone system may be executed in two modes: menu or command. The menu mode prompts the user for the input required to create, update, and/or query the database. The command mode requires the direct input of LARCRIM commands. Although LARCRIM is an update of an old database family, its performance on modern computers is quite satisfactory. LARCRIM is written in FORTRAN 77 and runs under the UNIX operating system. Versions have been released for the following computers: SUN (3 & 4), Convex, IRIS, Hewlett-Packard, CRAY 2 & Y-MP.
New DMSP Database of Precipitating Auroral Electrons and Ions.
Redmon, Robert J; Denig, William F; Kilcommons, Liam M; Knipp, Delores J
2017-08-01
Since the mid-1970s, the Defense Meteorological Satellite Program (DMSP) spacecraft have operated instruments for monitoring the space environment from low Earth orbit. As the program evolved, so too have the measurement capabilities, such that modern DMSP spacecraft include a comprehensive suite of instruments providing estimates of precipitating electron and ion fluxes, cold/bulk plasma composition and moments, the geomagnetic field, and optical emissions in the far and extreme ultraviolet. We describe the creation of a new public database of precipitating electrons and ions from the Special Sensor J (SSJ) instrument, complete with original counts, calibrated differential fluxes adjusted for penetrating radiation, estimates of the total kinetic energy flux and characteristic energy, uncertainty estimates, and accurate ephemerides. These are provided in a common and self-describing format that covers 30+ years of DMSP spacecraft from F06 (launched in 1982) through F18 (launched in 2009). This new database is accessible at the National Centers for Environmental Information (NCEI) and the Coordinated Data Analysis Web (CDAWeb). We describe how the new database is being applied to high-latitude studies of the co-location of kinetic and electromagnetic energy inputs, ionospheric conductivity variability, field-aligned currents, and auroral boundary identification. We anticipate that this new database will support a broad range of space science endeavors from single observatory studies to coordinated system science investigations.
The New Zealand Tsunami Database: historical and modern records
NASA Astrophysics Data System (ADS)
Barberopoulou, A.; Downes, G. L.; Cochran, U. A.; Clark, K.; Scheele, F.
2016-12-01
A database of historical (pre-instrumental) and modern (instrumentally recorded) tsunamis that have impacted or been observed in New Zealand has been compiled and published online. New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary on the Pacific Rim, means that it is vulnerable to local, regional and circum-Pacific tsunamis. Despite New Zealand's comparatively short written historical record of c. 200 years there is a wealth of information about the impact of past tsunamis. The New Zealand Tsunami Database currently has 800+ entries that describe >50 high-validity tsunamis. Sources of historical information include witness reports recorded in diaries, notes, newspapers, books, and photographs. Information on recent events comes from tide gauges and other instrumental recordings such as DART® buoys, and media of greater variety, for example, video and online surveys. The New Zealand Tsunami Database is an ongoing project with information added as further historical records come to light. Modern tsunamis are also added to the database once the relevant data for an event has been collated and edited. This paper briefly overviews the procedures and tools used in the recording and analysis of New Zealand's historical tsunamis, with emphasis on database content.
Carazo, J M; Stelzer, E H
1999-01-01
The BioImage Database Project collects and structures multidimensional data sets recorded by various microscopic techniques relevant to modern life sciences. It provides, as precisely as possible, the circumstances in which the sample was prepared and the data were recorded. It grants access to the actual data and maintains links between related data sets. In order to promote the interdisciplinary approach of modern science, it offers a large set of key words, which covers essentially all aspects of microscopy. Nonspecialists can, therefore, access and retrieve significant information recorded and submitted by specialists in other areas. A key issue of the undertaking is to exploit the available technology and to provide a well-defined yet flexible structure for dealing with data. Its pivotal element is, therefore, a modern object relational database that structures the metadata and ameliorates the provision of a complete service. The BioImage database can be accessed through the Internet. Copyright 1999 Academic Press.
[Design and establishment of modern literature database about acupuncture Deqi].
Guo, Zheng-rong; Qian, Gui-feng; Pan, Qiu-yin; Wang, Yang; Xin, Si-yuan; Li, Jing; Hao, Jie; Hu, Ni-juan; Zhu, Jiang; Ma, Liang-xiao
2015-02-01
A search on acupuncture Deqi was conducted using four Chinese-language biomedical databases (CNKI, Wanfang, VIP and CBM) and the PubMed database, using keywords such as "Deqi", "needle sensation", "needling feeling", "needle feel" and "obtaining qi". Then, a "Modern Literature Database for Acupuncture Deqi" was established in Microsoft SQL Server 2005 Express Edition, specifying the contents, data types, information structure and logic constraints of the system table fields. From this database, detailed inquiries about general information on clinical trials, acupuncturists' experience, ancient medical works, comprehensive literature, etc. can be obtained. The present databank lays a foundation for subsequent evaluation of literature quality about Deqi and for data mining of as-yet-undetected Deqi knowledge.
Cadastral Positioning Accuracy Improvement: a Case Study in Malaysia
NASA Astrophysics Data System (ADS)
Hashim, N. M.; Omar, A. H.; Omar, K. M.; Abdullah, N. M.; Yatim, M. H. M.
2016-09-01
A cadastral map is parcel-based information specifically designed to define the limits of boundaries. In Malaysia, the cadastral map is under the authority of the Department of Surveying and Mapping Malaysia (DSMM). With the growth of spatially based technology, especially Geographic Information Systems (GIS), DSMM decided to modernize and reform its cadastral legacy datasets by generating an accurate digital representation of cadastral parcels. These legacy databases are usually derived from paper parcel maps known as certified plans. The cadastral modernization will result in the new cadastral database no longer being based on single, static paper parcel maps, but on a global digital map. Despite the strict process of cadastral modernization, this reform has raised unexpected questions that remain essential to address. The main focus of this study is to review the issues that have been generated by this transition. The transformed cadastral database should be additionally treated to minimize inherent errors and to fit it to the new satellite-based coordinate system with high positional accuracy. The results of this review will serve as a foundation for investigating systematic and effective methods for Positional Accuracy Improvement (PAI) in cadastral database modernization.
Scoping of Flood Hazard Mapping Needs for Androscoggin County, Maine
Schalk, Charles W.; Dudley, Robert W.
2007-01-01
Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed and as funds allow. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Androscoggin County. Scoping activities included assembling existing data and map needs information for communities in Androscoggin County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Androscoggin County, Maine, is at least 17 years. Most studies were published in the early 1990s, and some towns have partial maps that are more recent than their study date. Since the studies were done, development has occurred in many of the watersheds and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.
Scoping of Flood Hazard Mapping Needs for Lincoln County, Maine
Schalk, Charles W.; Dudley, Robert W.
2007-01-01
Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine Floodplain Management Program (MFMP) State Planning Office, began scoping work in 2006 for Lincoln County. Scoping activities included assembling existing data and map needs information for communities in Lincoln County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps in Lincoln County, Maine is at least 17 years. Many of these studies were published in the mid- to late-1980s, and some towns have partial maps that are more recent than their study. However, in the ensuing 15-20 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights or flood mapping.
Discovering H-bonding rules in crystals with inductive logic programming.
Ando, Howard Y; Dehaspe, Luc; Luyten, Walter; Van Craenenbroeck, Elke; Vandecasteele, Henk; Van Meervelt, Luc
2006-01-01
In the domain of crystal engineering, various schemes have been proposed for the classification of hydrogen bonding (H-bonding) patterns observed in 3D crystal structures. In this study, the aim is to complement these schemes with rules that predict H-bonding in crystals from 2D structural information only. Modern computational power and the advances in inductive logic programming (ILP) can now provide computational chemistry with the opportunity for extracting structure-specific rules from large databases that can be incorporated into expert systems. ILP technology is here applied to H-bonding in crystals to develop a self-extracting expert system utilizing data in the Cambridge Structural Database of small molecule crystal structures. A clear increase in performance was observed when the ILP system DMax was allowed to refer to the local structural environment of the possible H-bond donor/acceptor pairs. This ability distinguishes ILP from more traditional approaches that build rules on the basis of global molecular properties.
Development of a North American paleoclimate pollen-based reconstruction database application
NASA Astrophysics Data System (ADS)
Ladd, Matthew; Mosher, Steven; Viau, Andre
2013-04-01
Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. The methodology and programs behind a reconstruction often make it challenging for other researchers to reproduce its results; there is therefore a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology, implemented in the open-source R language and built on North American pollen databases (e.g., NAPD, NEOTOMA), in which the application can easily be used to perform new reconstructions and to quickly analyze, output and plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results, and it allows users to spend more time analyzing and interpreting results rather than on data management and processing. Some of the unique features of this R program are its two menu-driven modules, which make the user feel at ease with the program; the ability to use different pollen sums; the choice of one of 70 available climate variables; the substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for thorough exploratory data analysis. The application will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
Cai, Liyan; Wu, Jie; Ma, Tingting; Yang, Lijie
2015-10-01
The acupoint selection was retrieved from the ancient and modern literature on the treatment of the sub-healthy condition with acupuncture. The law of acupoint application was analyzed so as to provide a reference for the determination of acupoint prescriptions in clinical acupuncture. The ancient literature was retrieved from the Chinese basic ancient literature database. The modern literature was retrieved from the Cochrane Library, Medline, PubMed, Ovid evidence-based medicine database, Chinese biomedical literature database, China journal full-text database, VIP journal full-text database and Wanfang database. Database mining software was adopted to explore the law of acupoint application in the treatment of sub-healthy conditions with ancient and modern acupuncture. The acupoint use frequency, compatibility association rules, the law of meridian use and the use regularity of specific points were analyzed. In the ancient treatment of the sub-healthy condition, the top five commonly used acupoints are Shenmen (HT 7), Zhaohai (KI 6), Taibai (SP 3), Daling (PC 7) and Taixi (KI 3). The most commonly combined points are Zhangmen (LR 13), Taibai (SP 3) and Zhaohai (KI 6). The most commonly used meridians are the bladder meridian of foot-taiyang, kidney meridian of foot-shaoyin and liver meridian of foot-jueyin. The most commonly used specific points are the five-shu points. The most commonly used acupoints are located in the lower limbs. In the modern treatment, the top five commonly used acupoints are Zusanli (ST 36), Sanyinjiao (SP 6), Baihui (GV 20), Shenshu (BL 23) and Guanyuan (CV 4). The most commonly supplemented points are Hegu (LI 4) and Taichong (LR 3). The most commonly used meridians are the bladder meridian of foot-taiyang, the conception vessel and the governor vessel. The most commonly used specific points are the back-shu points. The most commonly used acupoints are located in the lower limbs. After the systematic comprehension of the relevant ancient and modern literature, the most commonly used acupoints are selected along the bladder meridian of foot-taiyang, and the most commonly used specific points are the back-shu points, the five-shu points and the front-mu points. The acupoints are mostly located in the lower limbs.
ProteinWorldDB: querying radical pairwise alignments among protein sets from complete genomes.
Otto, Thomas Dan; Catanho, Marcos; Tristão, Cristian; Bezerra, Márcia; Fernandes, Renan Mathias; Elias, Guilherme Steinberger; Scaglia, Alexandre Capeletto; Bovermann, Bill; Berstis, Viktors; Lifschitz, Sergio; de Miranda, Antonio Basílio; Degrave, Wim
2010-03-01
Many analyses in modern biological research are based on comparisons between biological sequences, resulting in functional, evolutionary and structural inferences. When large numbers of sequences are compared, heuristics are often used resulting in a certain lack of accuracy. In order to improve and validate results of such comparisons, we have performed radical all-against-all comparisons of 4 million protein sequences belonging to the RefSeq database, using an implementation of the Smith-Waterman algorithm. This extremely intensive computational approach was made possible with the help of World Community Grid, through the Genome Comparison Project. The resulting database, ProteinWorldDB, which contains coordinates of pairwise protein alignments and their respective scores, is now made available. Users can download, compare and analyze the results, filtered by genomes, protein functions or clusters. ProteinWorldDB is integrated with annotations derived from Swiss-Prot, Pfam, KEGG, NCBI Taxonomy database and gene ontology. The database is a unique and valuable asset, representing a major effort to create a reliable and consistent dataset of cross-comparisons of the whole protein content encoded in hundreds of completely sequenced genomes using a rigorous dynamic programming approach. The database can be accessed through http://proteinworlddb.org
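The all-against-all comparisons behind ProteinWorldDB use the Smith-Waterman algorithm; a minimal dynamic-programming sketch of that algorithm is given below. It uses a toy match/mismatch scheme and a linear gap penalty for brevity, whereas production protein comparisons would use a substitution matrix (e.g., BLOSUM62) and affine gap costs.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local-alignment score between sequences a and b.

    Minimal dynamic-programming formulation with a linear gap penalty;
    large-scale runs such as the Genome Comparison Project use substitution
    matrices and affine gaps instead of this toy scoring.
    """
    rows, cols = len(a) + 1, len(b) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0,
                          prev[j - 1] + s,    # diagonal: align a[i-1] with b[j-1]
                          prev[j] + gap,      # gap in b
                          curr[j - 1] + gap)  # gap in a
            best = max(best, curr[j])
        prev = curr
    return best

print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))
```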
CamMedNP: building the Cameroonian 3D structural natural products database for virtual screening.
Ntie-Kang, Fidele; Mbah, James A; Mbaze, Luc Meva'a; Lifongo, Lydia L; Scharfe, Michael; Hanna, Joelle Ngo; Cho-Ngwa, Fidelis; Onguéné, Pascal Amoa; Owono Owono, Luc C; Megnassan, Eugene; Sippl, Wolfgang; Efange, Simon M N
2013-04-16
Computer-aided drug design (CADD) often involves virtual screening (VS) of large compound datasets and the availability of such is vital for drug discovery protocols. We present CamMedNP - a new database beginning with more than 2,500 compounds of natural origin, along with some of their derivatives which were obtained through hemisynthesis. These are pure compounds which have been previously isolated and characterized using modern spectroscopic methods and published by several research teams spread across Cameroon. In the present study, 224 distinct medicinal plant species belonging to 55 plant families from the Cameroonian flora have been considered. About 80 % of these have been previously published and/or referenced in internationally recognized journals. For each compound, the optimized 3D structure, drug-like properties, plant source, collection site and currently known biological activities are given, as well as literature references. We have evaluated the "drug-likeness" of this database using Lipinski's "Rule of Five". A diversity analysis has been carried out in comparison with the ChemBridge diverse database. CamMedNP could be highly useful for database screening and natural product lead generation programs.
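The abstract evaluates drug-likeness with Lipinski's Rule of Five. A minimal sketch of such a filter is shown below, assuming the RDKit toolkit is available; the paper does not state which software it used, so this only illustrates the rule itself (MW ≤ 500, logP ≤ 5, ≤ 5 H-bond donors, ≤ 10 H-bond acceptors).

```python
from rdkit import Chem
from rdkit.Chem import Crippen, Descriptors, Lipinski

def rule_of_five_violations(smiles):
    """Count Lipinski Rule-of-Five violations for a molecule given as SMILES."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse SMILES: {smiles}")
    checks = [
        Descriptors.MolWt(mol) > 500,      # molecular weight
        Crippen.MolLogP(mol) > 5,          # calculated logP
        Lipinski.NumHDonors(mol) > 5,      # H-bond donors
        Lipinski.NumHAcceptors(mol) > 10,  # H-bond acceptors
    ]
    return sum(checks)

# Quercetin, a flavonoid commonly isolated from medicinal plants.
print(rule_of_five_violations("O=C1C(O)=C(c2ccc(O)c(O)c2)Oc3cc(O)cc(O)c31"))
```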
Reuseable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop
NASA Technical Reports Server (NTRS)
Cottrell, William L.
1994-01-01
The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object oriented software engineering including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.
Ultra-Structure database design methodology for managing systems biology data and analyses
Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C
2009-01-01
Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogeneous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
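As a toy illustration of the Ultra-Structure idea, that behaviour lives in database rows interpreted by small, general procedures rather than in hard-coded logic, the Python/SQLite sketch below derives new facts from a user-editable rule table. The table layout and rule semantics are invented for the example and are not the published Ultra-Structure ruleforms.

```python
import sqlite3

# Stand-in "ruleform" table: each row is a rule, stored as data rather than code.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE mapping_rule (source_type TEXT, predicate TEXT, target_type TEXT);
    INSERT INTO mapping_rule VALUES
        ('spectrum', 'identifies', 'peptide'),
        ('peptide',  'maps_to',    'genomic_locus');
""")

def derive(facts):
    """General procedure: apply whatever rules the table currently holds.

    Changing the system's behaviour means editing mapping_rule rows,
    not modifying this function.
    """
    rules = conn.execute(
        "SELECT source_type, predicate, target_type FROM mapping_rule").fetchall()
    derived = set(facts)
    for source, predicate, target in rules:
        for kind, value in list(derived):
            if kind == source:
                derived.add((target, f"{predicate}({value})"))
    return derived

print(derive({("spectrum", "scan_0042")}))
```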
Based on a True Story? The Portrayal of ECT in International Movies and Television Programs.
Sienaert, Pascal
Movies and television (TV) programs are an important source of public information about ECT. The aim was to narratively review the portrayal of ECT in international movies and TV programs from 1948 to the present. Several Internet movie databases and a database of phrases appearing in movies and TV programs were searched, supplemented with a Medline search. No language restrictions were applied. ECT was portrayed in 52 movies (57 scenes), 21 TV programs (23 scenes), and 2 animated sitcoms (2 scenes). In movies, the main indication for ECT is behavioral control or torture (17/57, 29.8%), whereas in TV programs, the most frequent indication is erasing memories (7/25, 28%). In most scenes (47/82; 57.3%) ECT is given without consent, and without anesthesia (59/82; 72%). Unmodified ECT is depicted more frequently in American scenes (48/64, 75%), as opposed to scenes from other countries (11/18; 64.7%). Bilateral electrode placement is used in almost all (89%, 73/82) scenes. The vast majority of movies (46/57, 80.7%) and TV programs (18/25, 72%) show a negative and inaccurate image of the treatment. In the majority of scenes, ECT is used as a metaphor for repression, mind and behavior control, and is shown as a memory-erasing, painful and damaging treatment, adding to the stigma already associated with ECT. Only a few exceptions paint a truthful picture of this indispensable treatment in modern psychiatry. Copyright © 2016 Elsevier Inc. All rights reserved.
muBLASTP: database-indexed protein sequence search on multicore CPUs.
Zhang, Jing; Misra, Sanchit; Wang, Hao; Feng, Wu-Chun
2016-11-04
The Basic Local Alignment Search Tool (BLAST) is a fundamental program in the life sciences that searches databases for sequences that are most similar to a query sequence. Currently, the BLAST algorithm utilizes a query-indexed approach. Although many approaches suggest that sequence search with a database index can achieve much higher throughput (e.g., BLAT, SSAHA, and CAFE), they cannot deliver the same level of sensitivity as the query-indexed BLAST, i.e., NCBI BLAST, or they can only support nucleotide sequence search, e.g., MegaBLAST. Because of the differing challenges and characteristics of query indexing and database indexing, the existing techniques for query-indexed search cannot be applied directly to database-indexed search. muBLASTP, a novel database-indexed BLAST for protein sequence search, returns hits identical to those of NCBI BLAST. On Intel Haswell multicore CPUs, for a single query, the single-threaded muBLASTP achieves up to a 4.41-fold speedup for alignment stages, and up to a 1.75-fold end-to-end speedup over single-threaded NCBI BLAST. For a batch of queries, the multithreaded muBLASTP achieves up to a 5.7-fold speedup for alignment stages, and up to a 4.56-fold end-to-end speedup over multithreaded NCBI BLAST. With a newly designed index structure for protein databases and associated optimizations to the BLASTP algorithm, we re-factored BLASTP for modern multicore processors, achieving much higher throughput with an acceptable memory footprint for the database index.
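The database-indexing idea can be sketched as follows: pre-index fixed-length words of every database sequence once, then let each query probe the index instead of scanning all subjects. This toy Python example only reports exact-match seeds and is not muBLASTP's actual index structure or seeding logic; word length and sequences are illustrative.

    # Minimal sketch of database indexing for protein search (not muBLASTP itself).
    from collections import defaultdict

    W = 3  # word length typically used for protein BLAST seeding

    def build_db_index(db_seqs, w=W):
        """Map each w-mer to (sequence id, offset); built once, reused for all queries."""
        index = defaultdict(list)
        for sid, seq in db_seqs.items():
            for i in range(len(seq) - w + 1):
                index[seq[i:i + w]].append((sid, i))
        return index

    def seed_hits(query, index, w=W):
        """Exact-match seeds; a real search would extend these into gapped alignments."""
        hits = []
        for qpos in range(len(query) - w + 1):
            for sid, spos in index.get(query[qpos:qpos + w], []):
                hits.append((sid, qpos, spos))
        return hits

    db = {"P1": "MKVLAAGFT", "P2": "GFTKVLMMA"}
    idx = build_db_index(db)
    print(seed_hits("AAGFTKV", idx))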
ProteinWorldDB: querying radical pairwise alignments among protein sets from complete genomes
Otto, Thomas Dan; Catanho, Marcos; Tristão, Cristian; Bezerra, Márcia; Fernandes, Renan Mathias; Elias, Guilherme Steinberger; Scaglia, Alexandre Capeletto; Bovermann, Bill; Berstis, Viktors; Lifschitz, Sergio; de Miranda, Antonio Basílio; Degrave, Wim
2010-01-01
Motivation: Many analyses in modern biological research are based on comparisons between biological sequences, resulting in functional, evolutionary and structural inferences. When large numbers of sequences are compared, heuristics are often used resulting in a certain lack of accuracy. In order to improve and validate results of such comparisons, we have performed radical all-against-all comparisons of 4 million protein sequences belonging to the RefSeq database, using an implementation of the Smith–Waterman algorithm. This extremely intensive computational approach was made possible with the help of World Community Grid™, through the Genome Comparison Project. The resulting database, ProteinWorldDB, which contains coordinates of pairwise protein alignments and their respective scores, is now made available. Users can download, compare and analyze the results, filtered by genomes, protein functions or clusters. ProteinWorldDB is integrated with annotations derived from Swiss-Prot, Pfam, KEGG, NCBI Taxonomy database and gene ontology. The database is a unique and valuable asset, representing a major effort to create a reliable and consistent dataset of cross-comparisons of the whole protein content encoded in hundreds of completely sequenced genomes using a rigorous dynamic programming approach. Availability: The database can be accessed through http://proteinworlddb.org Contact: otto@fiocruz.br PMID:20089515
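For reference, a compact, score-only Smith–Waterman routine of the kind of rigorous local alignment run all-against-all for ProteinWorldDB is sketched below; the linear gap penalty and scoring values are illustrative and are not the project's actual parameters (which would use a substitution matrix such as BLOSUM).

    # Score-only Smith-Waterman local alignment with linear gap penalty (illustrative).
    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        cols = len(b) + 1
        prev = [0] * cols          # previous dynamic-programming row
        best = 0
        for i in range(1, len(a) + 1):
            curr = [0] * cols
            for j in range(1, cols):
                diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                curr[j] = max(0, diag, prev[j] + gap, curr[j - 1] + gap)
                best = max(best, curr[j])
            prev = curr
        return best

    print(smith_waterman_score("HEAGAWGHEE", "PAWHEAE"))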
Scoping of Flood Hazard Mapping Needs for Penobscot County, Maine
Schalk, Charles W.; Dudley, Robert W.
2007-01-01
Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Penobscot County. Scoping activities included assembling existing data and map needs information for communities in Penobscot County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database with information gathered during the scoping process. As of 2007, the average age of the FEMA floodplain maps in Penobscot County, Maine, is 22 years, based on the most recent revisions to the maps. Because the revisions did not affect all the map panels in each town, however, the true average age is probably more than 22 years. Many of the studies were published in the mid-1980s. Since the studies were completed, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.
Scoping of Flood Hazard Mapping Needs for Hancock County, Maine
Schalk, Charles W.; Dudley, Robert W.
2007-01-01
Background The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program (MFMP), began scoping work in 2006 for Hancock County. Scoping activities included assembling existing data and map needs information for communities in Hancock County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) database with information gathered during the scoping process. The average age of the FEMA floodplain maps (all types) in Hancock County, Maine, is at least 19 years. Most of these studies were published in the late 1980s and early 1990s, and no study is more recent than 1992. Some towns have partial maps that are more recent than their study, indicating that the true average age of the data is probably more than 19 years. Since the studies were done, development has occurred in some of the watersheds and the characteristics of the watersheds have changed. Therefore, many of the older studies may not depict current conditions or accurately estimate risk in terms of flood heights or flood mapping.
Distributed computing for macromolecular crystallography
Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Ballard, Charles
2018-01-01
Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community. PMID:29533240
Distributed computing for macromolecular crystallography.
Krissinel, Evgeny; Uski, Ville; Lebedev, Andrey; Winn, Martyn; Ballard, Charles
2018-02-01
Modern crystallographic computing is characterized by the growing role of automated structure-solution pipelines, which represent complex expert systems utilizing a number of program components, decision makers and databases. They also require considerable computational resources and regular database maintenance, which is increasingly more difficult to provide at the level of individual desktop-based CCP4 setups. On the other hand, there is a significant growth in data processed in the field, which brings up the issue of centralized facilities for keeping both the data collected and structure-solution projects. The paradigm of distributed computing and data management offers a convenient approach to tackling these problems, which has become more attractive in recent years owing to the popularity of mobile devices such as tablets and ultra-portable laptops. In this article, an overview is given of developments by CCP4 aimed at bringing distributed crystallographic computations to a wide crystallographic community.
Nuclear weapons modernization: Plans, programs, and issues for Congress
NASA Astrophysics Data System (ADS)
Woolf, Amy F.
2017-11-01
The United States is currently recapitalizing each delivery system in its "nuclear triad" and refurbishing many of the warheads carried by those systems. The plans for these modernization programs have raised a number of questions, both within Congress and among analysts in the nuclear weapons and arms control communities, about the costs associated with the programs and the need to recapitalize each leg of the triad at the same time. This paper covers four distinct issues. It begins with a brief review of the planned modernization programs, then addresses questions about why the United States is pursuing all of these modernization programs at this time. It then reviews the debate about how much these modernization programs are likely to cost in the next decade and considers possible changes that might reduce the cost. It concludes with some comments about congressional views on the modernization programs and prospects for continuing congressional support in the coming years.
Tempest: GPU-CPU computing for high-throughput database spectral matching.
Milloy, Jeffrey A; Faherty, Brendan K; Gerber, Scott A
2012-07-06
Modern mass spectrometers are now capable of producing hundreds of thousands of tandem (MS/MS) spectra per experiment, making the translation of these fragmentation spectra into peptide matches a common bottleneck in proteomics research. When coupled with experimental designs that enrich for post-translational modifications such as phosphorylation and/or include isotopically labeled amino acids for quantification, additional burdens are placed on this computational infrastructure by shotgun sequencing. To address this issue, we have developed a new database searching program that utilizes the massively parallel compute capabilities of a graphical processing unit (GPU) to produce peptide spectral matches in a very high throughput fashion. Our program, named Tempest, combines efficient database digestion and MS/MS spectral indexing on a CPU with fast similarity scoring on a GPU. In our implementation, the entire similarity score, including the generation of full theoretical peptide candidate fragmentation spectra and its comparison to experimental spectra, is conducted on the GPU. Although Tempest uses the classical SEQUEST XCorr score as a primary metric for evaluating similarity for spectra collected at unit resolution, we have developed a new "Accelerated Score" for MS/MS spectra collected at high resolution that is based on a computationally inexpensive dot product but exhibits scoring accuracy similar to that of the classical XCorr. In our experience, Tempest provides compute-cluster level performance in an affordable desktop computer.
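A minimal sketch of dot-product scoring between binned spectra follows; Tempest's Accelerated Score is likewise dot-product based, but the bin width, square-root intensity scaling, and peak values here are assumptions made for illustration, not its actual implementation (and the real comparison runs on the GPU).

    # Toy dot-product similarity between a binned experimental MS/MS spectrum and a
    # theoretical peptide spectrum (illustrative only; not Tempest's scoring code).
    import math

    def bin_spectrum(peaks, bin_width=0.02):
        """peaks: list of (m/z, intensity); returns sparse dict of binned, sqrt-scaled intensities."""
        binned = {}
        for mz, inten in peaks:
            b = int(mz / bin_width)
            binned[b] = binned.get(b, 0.0) + math.sqrt(inten)
        return binned

    def dot_product(spec_a, spec_b):
        small, large = (spec_a, spec_b) if len(spec_a) < len(spec_b) else (spec_b, spec_a)
        return sum(v * large.get(k, 0.0) for k, v in small.items())

    experimental = bin_spectrum([(175.119, 300.0), (276.166, 120.0), (389.250, 80.0)])
    theoretical  = bin_spectrum([(175.119, 1.0), (276.167, 1.0), (500.300, 1.0)])
    print(dot_product(experimental, theoretical))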
ERIC Educational Resources Information Center
Koo, Charles M.
In 1978, China launched its "Four Modernizations" program, which included modernization in agriculture, industry, national defense, and science and technology. To promote this program and to mobilize the Chinese masses to take a more positive and active attitude toward modernization, the government called upon the forces of the mass…
Ada software productivity prototypes: A case study
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan
1988-01-01
A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study was collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivities between subtasks. Factors considered are percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use Ada is just a large complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.
Scoping of flood hazard mapping needs for Kennebec County, Maine
Dudley, Robert W.; Schalk, Charles W.
2006-01-01
This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Kennebec County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Kennebec County. Scoping activities included assembling existing data and map needs information for communities in Kennebec County (efforts were made to not duplicate those of pre-scoping completed in March 2005), documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Kennebec County, Maine is 16 years. Most of these studies were in the late 1970's to the mid 1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Kennebec County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Kennebec County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and how e
Scoping of flood hazard mapping needs for Somerset County, Maine
Dudley, Robert W.; Schalk, Charles W.
2006-01-01
This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Somerset County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Somerset County. Scoping activities included assembling existing data and map needs information for communities in Somerset County (efforts were made to not duplicate those of pre-scoping completed in March 2005), documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Somerset County, Maine is 18.1 years. Most of these studies were in the late 1970's to the mid 1980s. However, in the ensuing 20-30 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Somerset County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Somerset County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and ho
Scoping of flood hazard mapping needs for Cumberland County, Maine
Dudley, Robert W.; Schalk, Charles W.
2006-01-01
This report was prepared by the U.S. Geological Survey (USGS) Maine Water Science Center as the deliverable for scoping of flood hazard mapping needs for Cumberland County, Maine, under Federal Emergency Management Agency (FEMA) Inter-Agency Agreement Number HSFE01-05-X-0018. This section of the report explains the objective of the task and the purpose of the report. The Federal Emergency Management Agency (FEMA) developed a plan in 1997 to modernize the FEMA flood mapping program. FEMA flood maps delineate flood hazard areas in support of the National Flood Insurance Program (NFIP). FEMA's plan outlined the steps necessary to update FEMA's flood maps for the nation to a seamless digital format and streamline FEMA's operations in raising public awareness of the importance of the maps and responding to requests to revise them. The modernization of flood maps involves conversion of existing information to digital format and integration of improved flood hazard data as needed. To determine flood mapping modernization needs, FEMA has established specific scoping activities to be done on a county-by-county basis for identifying and prioritizing requisite flood-mapping activities for map modernization. The U.S. Geological Survey (USGS), in cooperation with FEMA and the Maine State Planning Office Floodplain Management Program, began scoping work in 2005 for Cumberland County. Scoping activities included assembling existing data and map needs information for communities in Cumberland County, documentation of data, contacts, community meetings, and prioritized mapping needs in a final scoping report (this document), and updating the Mapping Needs Update Support System (MNUSS) Database or its successor with information gathered during the scoping process. The average age of the FEMA floodplain maps in Cumberland County, Maine is 21 years. Most of these studies were in the early to mid 1980s. However, in the ensuing 20-25 years, development has occurred in many of the watersheds, and the characteristics of the watersheds have changed with time. Therefore, many of the older studies may not depict current conditions nor accurately estimate risk in terms of flood heights. The following is the scope of work as defined in the FEMA/USGS Statement of Work: Task 1: Collect data from a variety of sources including community surveys, other Federal and State Agencies, National Flood Insurance Program (NFIP) State Coordinators, Community Assistance Visits (CAVs) and FEMA archives. Lists of mapping needs will be obtained from the MNUSS database, community surveys, and CAVs, if available. FEMA archives will be inventoried for effective FIRM panels, FIS reports, and other flood-hazard data or existing study data. Best available base map information, topographic data, flood-hazard data, and hydrologic and hydraulic data will be identified. Data from the Maine Floodplain Management Program database also will be utilized. Task 2: Contact communities in Cumberland County to notify them that FEMA and the State have selected them for a map update, and that a project scope will be developed with their input. 
Topics to be reviewed with the communities include (1) Purpose of the Flood Map Project (for example, the update needs that have prompted the map update); (2) The community's mapping needs; (3) The community's available mapping, hydrologic, hydraulic, and flooding information; (4) target schedule for completing the project; and (5) The community's engineering, planning, and geographic information system (GIS) capabilities. On the basis of the collected information from Task 1 and community contacts/meetings in Task 2, the USGS will develop a Draft Project Scope for the identified mapping needs of the communities in Cumberland County. The following items will be addressed in the Draft Project Scope: review of available information, determine if and how effective FIS data can be used in new project, and identify other data needed to
Technology and the Modern Library.
ERIC Educational Resources Information Center
Boss, Richard W.
1984-01-01
Overview of the impact of information technology on libraries highlights turnkey vendors, bibliographic utilities, commercial suppliers of records, state and regional networks, computer-to-computer linkages, remote database searching, terminals and microcomputers, building local databases, delivery of information, digital telefacsimile,…
[Current status and trends in the health of the Moscow population].
Tishuk, E A; Plavunov, N F; Soboleva, N P
1997-01-01
Based on a vast, comprehensive medical statistical database, the authors analyze the health status of the population and the efficacy of the public health service in Moscow. Pre-crisis trends and the current status of public health under modern socioeconomic conditions are noted.
[The future of the European nephrology belongs to the young: the Young Nephrologists' Platform].
Bolignano, Davide
2014-01-01
Young people are the future of research, especially in nephrology. The proportion of young nephrologists within the main European scientific societies varies from 12% to 34%, and in 2013, 20% of ERA-EDTA members were less than 40 years old. Recently, the ERA-EDTA has launched a new platform, the Young Nephrologists' Platform (YNP), whose aim is to become the first modern network of young nephrologists from Europe and beyond. YNP aims to promote the aggregation of young people through modern communication channels such as social networks and blogs, and through the construction of a database collecting information on the attitudes and personal experiences of each young nephrologist. A mentorship program, focused and young-oriented clinical courses on hot topics, and the direct involvement of young nephrologists in working groups and scientific studies are some of the other initiatives driven by YNP. The future of nephrology belongs to the young, and YNP could represent a good springboard for the professional growth of young nephrologists.
Optimal Decisions for Organ Exchanges in a Kidney Paired Donation Program.
Li, Yijiang; Song, Peter X-K; Zhou, Yan; Leichtman, Alan B; Rees, Michael A; Kalbfleisch, John D
2014-05-01
The traditional concept of barter exchange in economics has been extended in the modern era to the area of living-donor kidney transplantation, where one incompatible donor-candidate pair is matched to another pair with a complementary incompatibility, such that the donor from one pair gives an organ to a compatible candidate in the other pair and vice versa. Kidney paired donation (KPD) programs provide a unique and important platform for living incompatible donor-candidate pairs to exchange organs in order to achieve mutual benefit. In this paper, we propose novel organ allocation strategies to arrange kidney exchanges under uncertainty, with advantages including (i) allowance for a general utility-based evaluation of potential kidney transplants and an explicit consideration of stochastic features inherent in a KPD program; and (ii) exploitation of possible alternative exchanges when the originally planned allocation cannot be fully executed. This allocation strategy is implemented using an integer programming (IP) formulation, and its implication is assessed via a data-based simulation system by tracking an evolving KPD program over a series of match runs. Extensive simulation studies are provided to illustrate our proposed approach.
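In the simplest deterministic special case, where only two-way exchanges are considered and each potential swap has a fixed utility, selecting exchanges so that every incompatible pair participates at most once reduces to a maximum-weight matching, as the sketch below shows. The paper's actual method is a richer stochastic integer program that also handles fallback exchanges; the pairs and utilities here are invented for illustration.

    # Simplified stand-in for the deterministic two-way-exchange case (not the paper's IP).
    import networkx as nx

    # Nodes are incompatible donor-candidate pairs; an edge means a feasible swap,
    # weighted by the (hypothetical) total utility of the two resulting transplants.
    g = nx.Graph()
    g.add_edge("pair1", "pair2", weight=1.8)
    g.add_edge("pair1", "pair3", weight=1.1)
    g.add_edge("pair2", "pair4", weight=0.9)
    g.add_edge("pair3", "pair4", weight=1.5)

    chosen = nx.max_weight_matching(g)   # set of matched (pair_i, pair_j) tuples
    total = sum(g[u][v]["weight"] for u, v in chosen)
    print(chosen, total)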
Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai
2009-01-01
Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, computer networks, etc., to facilitate algorithmic researchers in testing their ideas, demonstrating new findings, and teaching algorithm design in the classroom. Within the broad applications of algorithm visualization, there still remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effect in 3D environments. Using modern technologies of Java programming, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, engagement of collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications.
Ice Accretion Test Results for Three Large-Scale Swept-Wing Models in the NASA Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Broeren, Andy; Potapczuk, Mark; Lee, Sam; Malone, Adam; Paul, Ben; Woodard, Brian
2016-01-01
The design and certification of modern transport airplanes for flight in icing conditions increasingly relies on three-dimensional numerical simulation tools for ice accretion prediction. There is currently no publicly available, high-quality ice accretion database upon which to evaluate the performance of icing simulation tools for large-scale swept wings that are representative of modern commercial transport airplanes. The purpose of this presentation is to present the results of a series of icing wind tunnel test campaigns whose aim was to provide an ice accretion database for large-scale, swept wings.
Inequality in fertility rate and modern contraceptive use among Ghanaian women from 1988–2008
2013-01-01
Background In most resource poor countries, particularly sub-Saharan Africa, modern contraceptive use and prevalence is unusually low and fertility is very high resulting in rapid population growth and high maternal mortality and morbidity. Current evidence shows slow progress in expanding the use of contraceptives by women of low socioeconomic status and insufficient financial commitment to family planning programs. We examined gaps and trends in modern contraceptive use and fertility within different socio-demographic subgroups in Ghana between 1988 and 2008. Methods We constructed a database using the Women’s Questionnaire from the Ghana Demographic and Health Survey (GDHS) 1988, 1993, 1998, 2003 and 2008. We applied regression-based Total Attributable Fraction (TAF); we also calculated the Relative and Slope Indices of Inequality (RII and SII) to complement the TAF in our investigation. Results Equality in use of modern contraceptives increased from 1988 to 2008. In contrast, inequality in fertility rate increased from 1988 to 2008. It was also found that rural–urban residence gap in the use of modern contraceptive methods had almost disappeared in 2008, while education and income related inequalities remained. Conclusions One obvious observation is that the discrepancy between equality in use of contraceptives and equality in fertility must be addressed in a future revision of policies related to family planning. Otherwise this could be a major obstacle for attaining further progress in achieving the Millennium Development Goal (MDG) 5. More research into the causes of the unfortunate discrepancy is urgently needed. There still exist significant education and income related inequalities in both parameters that need appropriate action. PMID:23718745
Inequality in fertility rate and modern contraceptive use among Ghanaian women from 1988-2008.
Asamoah, Benedict O; Agardh, Anette; Ostergren, Per-Östergren
2013-05-29
In most resource poor countries, particularly sub-Saharan Africa, modern contraceptive use and prevalence is unusually low and fertility is very high resulting in rapid population growth and high maternal mortality and morbidity. Current evidence shows slow progress in expanding the use of contraceptives by women of low socioeconomic status and insufficient financial commitment to family planning programs. We examined gaps and trends in modern contraceptive use and fertility within different socio-demographic subgroups in Ghana between 1988 and 2008. We constructed a database using the Women's Questionnaire from the Ghana Demographic and Health Survey (GDHS) 1988, 1993, 1998, 2003 and 2008. We applied regression-based Total Attributable Fraction (TAF); we also calculated the Relative and Slope Indices of Inequality (RII and SII) to complement the TAF in our investigation. Equality in use of modern contraceptives increased from 1988 to 2008. In contrast, inequality in fertility rate increased from 1988 to 2008. It was also found that rural-urban residence gap in the use of modern contraceptive methods had almost disappeared in 2008, while education and income related inequalities remained. One obvious observation is that the discrepancy between equality in use of contraceptives and equality in fertility must be addressed in a future revision of policies related to family planning. Otherwise this could be a major obstacle for attaining further progress in achieving the Millennium Development Goal (MDG) 5. More research into the causes of the unfortunate discrepancy is urgently needed. There still exist significant education and income related inequalities in both parameters that need appropriate action.
24 CFR 901.15 - Indicator #2, modernization.
Code of Federal Regulations, 2010 CFR
2010-04-01
Housing and Urban Development, Public Housing Management Assessment Program (24 CFR part 901), § 901.15 Indicator #2, modernization: This indicator is automatically excluded if a PHA does not have a modernization program. This indicator examines the…
The Moon in the Russian scientific-educational project: Kazan-GeoNa-2010
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.; Petrova, N.
Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have served as the scientific-organizational and cultural-educational center of the Volga region of Russia. For the further development of educational and scientific activity in the Russian Federation and the Republic of Tatarstan, Kazan proposes a national project: the International Center of Science and Internet Technologies GeoNa (Geometry of Nature: wisdom, enthusiasm, pride, grandeur). GeoNa would include a modern complex of conference halls seating up to 4,000, an Internet technologies center, a 3D planetarium devoted to exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, a "Spheres of Knowledge" training complex, and botanical and landscape oases. The GeoNa center would host conferences, congresses, fundamental scientific research on the Moon, scientific-educational events, presentations of international lunar research programs, modern lunar databases, exhibitions of high-tech equipment, and extensive cultural-educational, tourist, and cognitive programs. The GeoNa center would enable scientists and teachers at Russian universities to engage with advanced achievements in science and information technologies and to establish scientific contacts with foreign colleagues in high-technology and educational projects with the world's space centers.
NASA Astrophysics Data System (ADS)
Huang, Pei; Wu, Sangyun; Feng, Aiping; Guo, Yacheng
2008-10-01
As littoral areas with concentrated populations, abundant resources, developed industry, and active economies, coastal areas are bound to become the forward positions and key regions for marine exploitation. In the 21st century, coastal zones face pressures that include population growth and urbanization, sea-level rise and coastal erosion, shortage and deterioration of freshwater resources, and degradation of fishery resources. The resources of coastal zones should therefore be planned and used rationally to support sustainable economic and environmental development. This paper presents a design study for a coastal zone planning and management information system based on GIS and database technologies. With this system, coastal zone planning results can be conveniently queried and displayed through the system interface. It is concluded that the integrated application of GIS and database technologies provides a new, modern method for the management of coastal zone resources and makes it possible to ensure the rational development and utilization of those resources, along with the sustainable development of the economy and environment.
Clinical research in a hospital--from the lone rider to teamwork.
Hannisdal, E
1996-01-01
Clinical research of high international standard is very demanding and requires clinical data of high quality, software, hardware and competence in research design and statistical treatment of data. Most busy clinicians have little time allocated for clinical research and this increases the need for a potent infrastructure. This paper describes how the Norwegian Radium Hospital, a specialized cancer hospital, has reorganized the clinical research process. This includes a new department, the Clinical Research Office, which provides the formal framework, a central Diagnosis Registry, clinical databases and multicentre studies. The department assists about 120 users, mainly clinicians. Installation of a network software package with over 10 programs has strongly promoted internal standardization, reduced costs, and saved clinicians a great deal of time. The hospital is building up about 40 diagnosis-specific clinical databases with up to 200 variables registered. These databases are shared by the treatment group and seem to be important tools for quality assurance. We conclude that the clinical research process benefits from a firm infrastructure facilitating teamwork through extensive use of modern information technology. We are now ready for the next phase, which is to work for a better external technical framework for cooperation with other institutions throughout the world.
HITRAN2016: Part I. Line lists for H_2O, CO_2, O_3, N_2O, CO, CH_4, and O_2
NASA Astrophysics Data System (ADS)
Gordon, Iouli E.; Rothman, Laurence S.; Tan, Yan; Kochanov, Roman V.; Hill, Christian
2017-06-01
The HITRAN2016 database is now officially released. A plethora of experimental and theoretical molecular spectroscopic data was collected, evaluated, and vetted before compiling the new edition of the database. The database is now distributed through the dynamic user interface HITRANonline (available at www.hitran.org), which offers many flexible options for browsing and downloading the data. In addition, the HITRAN Application Programming Interface (HAPI) offers modern ways to download the HITRAN data and use it to carry out sophisticated calculations. The line-by-line lists for almost all of the 47 HITRAN molecules were updated in comparison with the previous compilation (HITRAN2012). Some of the most important updates for major atmospheric absorbers, such as H_2O, CO_2, O_3, N_2O, CO, CH_4, and O_2, will be presented in this talk, while the trace gases will be presented in the next talk by Y. Tan. The HITRAN2016 database now provides alternative line-shape representations for a number of molecules, as well as broadening by gases dominant in planetary atmospheres. In addition, substantial extension and improvement of cross-section data is featured, which will be described in a dedicated talk by R. V. Kochanov. The new edition of the database is a substantial step forward to improve retrievals of planetary atmospheric constituents in comparison with previous editions, while offering new ways of working with the data. The HITRAN database is supported by the NASA AURA and PDART program grants NNX14AI55G and NNX16AG51G. I. E. Gordon, L. S. Rothman, C. Hill, R. V. Kochanov, Y. Tan, et al. The HITRAN2016 Molecular Spectroscopic Database. JQSRT 2017; submitted. Many spectroscopists and atmospheric scientists worldwide have contributed data to the database or provided invaluable validations. C. Hill, I. E. Gordon, R. V. Kochanov, L. Barrett, J. S. Wilzewski, L. S. Rothman, JQSRT 177 (2016) 4-14. R. V. Kochanov, I. E. Gordon, L. S. Rothman, P. Wcislo, C. Hill, J. S. Wilzewski, JQSRT 177 (2016) 15-30. L. S. Rothman, I. E. Gordon, et al. The HITRAN2012 Molecular Spectroscopic Database. JQSRT 130 (2013) 4-50.
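A minimal sketch of the kind of programmatic access HAPI provides is shown below, assuming the hapi.py module distributed via hitran.org is installed; the molecule and isotopologue identifiers and the wavenumber range are arbitrary example values, and the authoritative interface is documented at hitran.org.

    # Sketch of programmatic HITRAN access via HAPI (assumes hapi.py is installed).
    from hapi import db_begin, fetch, getColumn

    db_begin("hitran_data")            # local folder that caches downloaded tables
    fetch("H2O", 1, 1, 3400, 4100)     # molecule number, isotopologue number, nu_min, nu_max

    nu = getColumn("H2O", "nu")        # line-center wavenumbers
    sw = getColumn("H2O", "sw")        # line intensities
    print(len(nu), nu[0], sw[0])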
Application of Modern Fortran to Spacecraft Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.
2018-01-01
In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sher, David J., E-mail: david_sher@rush.edu; Liptay, Michael J.; Fidler, Mary Jo
Purpose: The optimal locoregional therapy for stage IIIA non-small cell lung cancer (NSCLC) is controversial, with definitive chemoradiation therapy (CRT) and neoadjuvant therapy followed by surgery (NT-S) serving as competing strategies. In this study, we used the National Cancer Database to determine the prevalence and predictors of NT in a large, modern cohort of patients. Methods and Materials: Patients with stage IIIA NSCLC treated with CRT or NT-S between 2003 and 2010 at programs accredited by the Commission on Cancer were included. Predictors were categorized as clinical, time/geographic, socioeconomic, and institutional. In accord with the National Cancer Database, institutions were classified as academic/research programs and as comprehensive and noncomprehensive community cancer centers. Logistic regression and random effects multilevel logistic regression were performed for univariable and multivariable analyses, respectively. Results: The cohort consisted of 18,581 patients, 3,087 (16.6%) of whom underwent NT-S (10.6% induction CRT, 6% induction chemotherapy). The prevalence of NT-S was constant over time, but there were significant relative 31% and 30% decreases in pneumonectomy and right-sided pneumonectomy, respectively, over time (P trend <.02). In addition to younger age, lower T stage, and favorable comorbidity score, indicators of higher socioeconomic status were strong independent predictors of NT-S, including white race, higher income, and private/managed insurance. The type of institution (academic/research program vs comprehensive or noncomprehensive community cancer centers, odds ratio 1.54 and 2.08, respectively) strongly predicted NT-S, but treatment volume did not. Conclusions: Neoadjuvant therapy followed by surgery was an uncommon treatment approach in Commission on Cancer programs, and the prevalence of postinduction pneumonectomy decreased over time. Higher socioeconomic status and treatment at academic institutions were significant predictors of NT-S. Further research should be performed to enable a better understanding of these disparities.
Department of Defense Healthcare Management System Modernization (DHMSM)
2016-03-01
2016 Major Automated Information System Annual Report. Program Name: Department of Defense Healthcare Management System Modernization (DHMSM). DoD Component: DoD. Date Assigned: November 16, 2015. The acquiring DoD Component is the Program Executive Office (PEO), Department of Defense (DoD) Healthcare Management Systems (DHMS).
[Development and integration of the Oncological Documentation System ODS].
Raab, G; van Den Bergh, M
2001-08-01
The aims are to simplify clinical routine and to improve medical quality without exceeding existing resources, and to intensify communication and cooperation between all institutions involved in patient health care. The huge documentation workload of physicians can no longer be managed without modern tools for paperless data processing. The development of ODS was a close cooperation between physicians and technicians, which resulted in mutual understanding and led to a high level of user convenience. At present, all cases in gynecology, especially gynecologic oncology, can be documented and processed by ODS. Users will adopt the system easily, as data entry within different program areas follows the same rules. In addition, users can choose between individual data input and assistants guiding them through highly specific areas of documentation. ODS is a modern, modular, and very fast multiuser database environment for inpatient and outpatient documentation. It automatically generates many reports for day-to-day clinical business. Statistical routines help users reflect on their work and its quality. Documentation of clinical trials according to GCP guidelines can be done with ODS using the internet or offline data sharing. As ODS is the synthesis of a computer-based patient administration system and an oncological documentation database, it represents the basis for the construction of the electronic patient chart as well as the digital documentation of clinical trials. The introduction of this new technology to physicians and nurses has to be done slowly and carefully in order to increase motivation and improve results.
Method and computer program product for maintenance and modernization backlogging
Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M
2013-02-19
According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
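The relationship stated in the abstract can be written directly as a one-line function; the variable names and numbers below are purely illustrative.

    # Future facility conditions = maintenance cost + modernization factor + backlog factor,
    # each specific to the same time period (values here are made up for illustration).
    def future_facility_conditions(maintenance_cost, modernization_factor, backlog_factor):
        return maintenance_cost + modernization_factor + backlog_factor

    print(future_facility_conditions(120_000.0, 45_000.0, 30_000.0))  # 195000.0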
Logistics Modernization Program System Procure-to-Pay Process Did Not Correct Material Weaknesses
2012-05-29
Related audit reports cited include: "Prevalidation of DOD Commercial Payments," March 2, 2007; U.S. Army Audit Agency Report No. A-2007-0205-FFM, "Logistics Modernization Program…"; U.S. Army Audit Agency Report No. A-2007-0163-FFM, "FY 03–FY 05 Obligations Recorded in the Logistics Modernization Program," July 27, 2007; U.S. Army Audit Agency Report No.…
Intrusion Detection in Database Systems
NASA Astrophysics Data System (ADS)
Javidi, Mohammad M.; Sohrabi, Mina; Rafsanjani, Marjan Kuchaki
Data today represent a valuable asset for organizations and companies and must be protected. Ensuring the security and privacy of data assets is a crucial and very difficult problem in our modern networked world. Despite the necessity of protecting information stored in database systems (DBS), existing security models are insufficient to prevent misuse, especially insider abuse by legitimate users. One mechanism to safeguard the information in these databases is to use an intrusion detection system (IDS). The purpose of intrusion detection in database systems is to detect transactions that access data without permission. In this paper, several database intrusion detection approaches are evaluated.
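One simple flavor of such detection, shown here only as an illustration and not as one of the approaches evaluated in the paper, is to compare the tables touched by a transaction against a per-role access profile; all role and table names below are hypothetical.

    # Illustrative role-profile check for database transactions (not from the survey).
    ROLE_PROFILE = {
        "billing_clerk": {"invoices", "customers"},
        "nurse":         {"patients", "medications"},
    }

    def is_suspicious(user_role, accessed_tables):
        """Return True if the transaction touches any table not allowed for the role."""
        allowed = ROLE_PROFILE.get(user_role, set())
        return bool(set(accessed_tables) - allowed)

    print(is_suspicious("billing_clerk", ["invoices"]))              # False
    print(is_suspicious("billing_clerk", ["invoices", "salaries"]))  # True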
NASA Astrophysics Data System (ADS)
Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.
2014-07-01
"Prediflood" is a database of historical floods occurred in Catalonia (NE Iberian Peninsula), between 10th Century and 21th Century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, apt to be used in multidisciplinary flood analysis techniques, as meteorological or hydraullic reconstructions.
Breast tumor malignancy modelling using evolutionary neural logic networks.
Tsakonas, Athanasios; Dounias, Georgios; Panagi, Georgia; Panourgias, Evangelia
2006-01-01
The present work proposes a computer-assisted methodology for the effective modelling of the diagnostic decision on breast tumor malignancy. The suggested approach is based on innovative hybrid computational intelligence algorithms applied to related cytological data contained in past medical records. The experimental data used in this study were gathered in the early 1990s at the University of Wisconsin, based on post-diagnostic cytological observations performed by expert medical staff. The data were encoded in a computer database, and various alternative modelling techniques were then applied to them in an attempt to form diagnostic models. Previous methods included standard optimisation techniques as well as artificial intelligence approaches, and a variety of related publications exists in the modern literature on the subject. In this report, a hybrid computational intelligence approach is suggested that effectively combines modern mathematical logic principles, neural computation, and genetic programming. The approach proves promising both in terms of diagnostic accuracy and generalization capabilities and in terms of comprehensibility and practical importance for the medical staff involved.
1987-06-15
General Dynamics, Fort Worth Division, Industrial Technology Modernization Program, Phase 2 Final Project Report, Project 28: Automation of Receiving, Receiving… [Report front matter and table of contents; legible section headings include Project Assumptions, Preliminary/Final Design and Findings, System/Equipment/Machining Specifications, and Vendor/Industry Analysis.]
NASA Astrophysics Data System (ADS)
Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.
2014-12-01
"Prediflood" is a database of historical floods that occurred in Catalonia (NE Iberian Peninsula), between the 11th century and the 21st century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, suitable for use in multidisciplinary flood analysis techniques, such as meteorological or hydraulic reconstructions.
Modern & Classical Languages: K-12 Program Evaluation 1988-89.
ERIC Educational Resources Information Center
Martinez, Margaret Perea
This evaluation of the modern and classical languages programs, K-12, in the Albuquerque (New Mexico) public school system provides general information on the program's history, philosophy, recognition, curriculum development, teachers, and activities. Specific information is offered on the different program components, namely, the elementary…
A geo-spatial data management system for potentially active volcanoes—GEOWARN project
NASA Astrophysics Data System (ADS)
Gogu, Radu C.; Dietrich, Volker J.; Jenny, Bernhard; Schwandner, Florian M.; Hurni, Lorenz
2006-02-01
Integrated studies of active volcanic systems for the purpose of long-term monitoring and forecast and short-term eruption prediction require large numbers of data-sets from various disciplines. A modern database concept has been developed for managing and analyzing multi-disciplinary volcanological data-sets. The GEOWARN project (choosing the "Kos-Yali-Nisyros-Tilos volcanic field, Greece" and the "Campi Flegrei, Italy" as test sites) is oriented toward potentially active volcanoes situated in regions of high geodynamic unrest. This article describes the volcanological database of the spatial and temporal data acquired within the GEOWARN project. As a first step, a spatial database embedded in a Geographic Information System (GIS) environment was created. Digital data of different spatial resolution, and time-series data collected at different intervals or periods, were unified in a common, four-dimensional representation of space and time. The database scheme comprises various information layers containing geographic data (e.g. seafloor and land digital elevation model, satellite imagery, anthropogenic structures, land-use), geophysical data (e.g. from active and passive seismicity, gravity, tomography, SAR interferometry, thermal imagery, differential GPS), geological data (e.g. lithology, structural geology, oceanography), and geochemical data (e.g. from hydrothermal fluid chemistry and diffuse degassing features). As a second step based on the presented database, spatial data analysis has been performed using custom-programmed interfaces that execute query scripts resulting in a graphical visualization of data. These query tools were designed and compiled following scenarios of known "behavior" patterns of dormant volcanoes and first candidate signs of potential unrest. The spatial database and query approach is intended to facilitate scientific research on volcanic processes and phenomena, and volcanic surveillance.
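The scenario-driven query idea can be illustrated with a small, self-contained sketch; the table layout, threshold values, and SQL dialect below are assumptions made for illustration and do not reproduce the GEOWARN schema or its GIS-integrated query tools.

    # Toy "scenario" query over a monitoring table, in the spirit of pre-programmed
    # query scripts that scan time series for candidate signs of unrest.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE seismic_event (
                       station TEXT, t TEXT, depth_km REAL, magnitude REAL,
                       lat REAL, lon REAL)""")
    con.executemany("INSERT INTO seismic_event VALUES (?, ?, ?, ?, ?, ?)", [
        ("STA1", "2004-03-01", 4.2, 2.1, 36.59, 27.16),
        ("STA1", "2004-03-02", 3.8, 2.7, 36.60, 27.17),
        ("STA2", "2004-03-02", 5.1, 3.4, 36.61, 27.15),
    ])

    # Count shallow events above a magnitude threshold per station (illustrative criterion).
    for row in con.execute("""
            SELECT station, COUNT(*) AS n_events
            FROM seismic_event
            WHERE depth_km < 5.0 AND magnitude >= 2.5
            GROUP BY station"""):
        print(row)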
Biosequence Similarity Search on the Mercury System
Krishnamurthy, Praveen; Buhler, Jeremy; Chamberlain, Roger; Franklin, Mark; Gyang, Kwame; Jacob, Arpith; Lancaster, Joseph
2007-01-01
Biosequence similarity search is an important application in modern molecular biology. Search algorithms aim to identify sets of sequences whose extensional similarity suggests a common evolutionary origin or function. The most widely used similarity search tool for biosequences is BLAST, a program designed to compare query sequences to a database. Here, we present the design of BLASTN, the version of BLAST that searches DNA sequences, on the Mercury system, an architecture that supports high-volume, high-throughput data movement off a data store and into reconfigurable hardware. An important component of application deployment on the Mercury system is the functional decomposition of the application onto both the reconfigurable hardware and the traditional processor. Both the Mercury BLASTN application design and its performance analysis are described. PMID:18846267
MOSAIC: An organic geochemical and sedimentological database for marine surface sediments
NASA Astrophysics Data System (ADS)
Tavagna, Maria Luisa; Usman, Muhammed; De Avelar, Silvania; Eglinton, Timothy
2015-04-01
Modern ocean sediments serve as the interface between the biosphere and the geosphere, play a key role in biogeochemical cycles, and provide a window on how contemporary processes are written into the sedimentary record. Research over past decades has produced a wealth of information on the content and composition of organic matter in marine sediments, with ever more sophisticated techniques continuing to yield information of greater detail at an accelerating pace. However, there has been no attempt to synthesize this wealth of information. We are establishing a new database that incorporates information relevant to local, regional and global-scale assessment of the content, source and fate of organic materials accumulating in contemporary marine sediments. In the MOSAIC (Modern Ocean Sediment Archive and Inventory of Carbon) database, particular emphasis is placed on molecular and isotopic information, coupled with contextual information (e.g., sedimentological properties) relevant to elucidating factors that influence the efficiency and nature of organic matter burial. The main features of MOSAIC include: (i) emphasis on continental margin sediments as major loci of carbon burial and as the interface between terrestrial and oceanic realms; (ii) bulk to molecular-level organic geochemical properties and parameters, including concentrations and isotopic compositions; (iii) extensive contextual data on the depositional setting, in particular its sedimentological and redox characteristics. The ultimate goal is to create an open-access instrument, available on the web, to be used for research and education by the international community, which can both contribute to and interrogate the database. Submissions will be made by means of a pre-configured table available on the MOSAIC webpage. The information in the filled tables will be checked and then imported, via the Structured Query Language (SQL), into MOSAIC. MOSAIC is built on PostgreSQL, an open-source database management system. To locate the data geographically, each datum is associated with a latitude, longitude and depth, facilitating creation of a geospatial database that can be easily interfaced with a Geographic Information System (GIS). To make the database broadly accessible, an HTML/PHP-based website will ultimately be created and linked to the database. Consulting the website will allow both data visualization and export of data in txt format for use with common software (e.g. ODV, Excel, Matlab, Python, Word, PPT, Illustrator…). At this early stage, MOSAIC contains approximately 10,000 analyses conducted on more than 1800 samples collected from over 1600 different geographical locations around the world. Through participation of the international research community, MOSAIC will rapidly develop into a rich archive and versatile tool for investigating the distribution and composition of organic matter accumulating in seafloor sediments. The present contribution will outline the structure of MOSAIC, provide examples of data output, and solicit feedback on desirable features to be included in the database and associated software tools.
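The abstract above describes tying every measurement to a latitude, longitude and depth so the collection can later be handled as a geospatial database. The short Python sketch below is a hypothetical illustration of that idea only: the table and column names are invented, and SQLite is used purely to keep the example self-contained, whereas MOSAIC itself is described as being built on PostgreSQL.

import sqlite3

# Hypothetical, simplified schema illustrating the idea of tying every
# measurement to a latitude/longitude/depth record; not the MOSAIC schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sample (
    sample_id   INTEGER PRIMARY KEY,
    latitude    REAL NOT NULL,   -- decimal degrees
    longitude   REAL NOT NULL,   -- decimal degrees
    water_depth REAL,            -- metres below sea level
    setting     TEXT             -- e.g. 'continental margin'
);
CREATE TABLE measurement (
    sample_id  INTEGER REFERENCES sample(sample_id),
    property   TEXT,             -- e.g. 'TOC_percent', 'd13C_permil'
    value      REAL
);
""")

conn.execute("INSERT INTO sample VALUES (1, 36.7, -122.0, 3200, 'continental margin')")
conn.executemany("INSERT INTO measurement VALUES (?, ?, ?)",
                 [(1, "TOC_percent", 1.8), (1, "d13C_permil", -21.4)])

# Example query: organic-carbon content for margin samples in a lat/lon box,
# the kind of spatial selection a GIS front end would issue.
rows = conn.execute("""
SELECT s.latitude, s.longitude, m.value
FROM sample s JOIN measurement m USING (sample_id)
WHERE m.property = 'TOC_percent'
  AND s.latitude BETWEEN 30 AND 45 AND s.longitude BETWEEN -130 AND -115
""").fetchall()
print(rows)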
Pay for performance in thoracic surgery.
Varela, Gonzalo
2007-08-01
In the context of improving the quality of medical practice, pay-for-performance (PFP) programs have been developed to reward best medical practice. Early studies showed little gain in quality after implementing PFP programs in family practice, and some unintended consequences, such as excluding high-risk patients from medical services when good outcomes were linked to payment. To date, no PFP programs have been implemented in surgical practice, but the value-based purchasing philosophy is expected to extend to surgical specialties in the near future. Quality initiatives in surgery can be based on outcome or process measures. Outcomes-focused quality approaches rely on accurate information obtained from multi-institutional clinical databases for calculation of risk-adjusted models. Primary outcomes such as surgical mortality are uncommon in modern thoracic surgery, and outcome measures should rely on more prevalent intermediate outcomes such as specific postoperative morbidities or emergency readmission. Process-based quality approaches need to be based on scientific evidence linking process to outcomes. It is our responsibility to develop practice guidelines or an international practice consensus to define the parameters to be evaluated in the near future.
School Buildings: Remodeling; Rehabilitation; Modernization; Repair. Bulletin, 1950, No. 17
ERIC Educational Resources Information Center
Yiles, Nelson E.
1950-01-01
Adequate school plants are essential to a modern educational program. The school plant that is not properly maintained soon fails to provide the service for which it was intended. The total program of maintenance, including repairs, renovation, remodeling, rehabilitation, and modernization should be carefully planned. Some tasks will recur at…
NNDC Stand: Activities and Services of the National Nuclear Data Center
NASA Astrophysics Data System (ADS)
Pritychenko, B.; Arcilla, R.; Burrows, T. W.; Dunford, C. L.; Herman, M. W.; McLane, V.; Obložinský, P.; Sonzogni, A. A.; Tuli, J. K.; Winchell, D. F.
2005-05-01
The National Nuclear Data Center (NNDC) collects, evaluates, and disseminates nuclear physics data for basic nuclear research and for applied nuclear technologies including energy, shielding, medicine, and homeland security. In 2004, to answer the needs of the nuclear data user community, the NNDC completed a project to modernize the data storage and management of its databases and began offering new nuclear data Web services. The principles of database and Web application development, as well as the related nuclear reaction and structure database services, are briefly described.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
...; Comment Request Clinical Trials Reporting Program (CTRP) Database (NCI) Summary: Under the provisions of... Collection: Title: Clinical Trials Reporting Program (CTRP) Database. Type of Information Collection Request... Program (CTRP) Database, to serve as a single, definitive source of information about all NCI-supported...
Insufficient Governance Over Logistics Modernization Program System Development
2010-11-02
Controls Over the Prevalidation of DOD Commercial Payments,” March 2, 2007; Army USAAA Report No. A-2007-0205-FFM, “Logistics Modernization Program...0163-FFM, “FY 03–FY 05 Obligations Recorded in the Logistics Modernization Program,” July 27, 2007; USAAA Report No. A-2007-0154-ALR, “Follow up...Audit of Aged Accounts–U.S. Army Communications-Electronics Life Cycle Management Command,” July 2, 2007; USAAA Report No. A-2006-0234-FFM
ERIC Educational Resources Information Center
Bowman, Benjamin F.
For the past two decades the central Information Retrieval Services of the Max Planck Society has been providing database searches for scientists in Max Planck Institutes and Research Groups throughout Germany. As a supplement to traditional search services offered by professional intermediaries, they have recently fostered the introduction of a…
Nimz, Kathryn; Ramsey, David W.; Sherrod, David R.; Smith, James G.
2008-01-01
Since 1979, Earth scientists of the Geothermal Research Program of the U.S. Geological Survey have carried out multidisciplinary research in the Cascade Range. The goal of this research is to understand the geology, tectonics, and hydrology of the Cascades in order to characterize and quantify geothermal resource potential. A major goal of the program is compilation of a comprehensive geologic map of the entire Cascade Range that incorporates modern field studies and that has a unified and internally consistent explanation. This map is one of three in a series that shows Cascade Range geology by fitting published and unpublished mapping into a province-wide scheme of rock units distinguished by composition and age; map sheets of the Cascade Range in Washington (Smith, 1993) and California will complete the series. The complete series forms a guide to exploration and evaluation of the geothermal resources of the Cascade Range and will be useful for studies of volcano hazards, volcanology, and tectonics. This digital release contains all the information used to produce the geologic map published as U.S. Geological Survey Geologic Investigations Series I-2569 (Sherrod and Smith, 2000). The main component of this digital release is a geologic map database prepared using ArcInfo GIS. This release also contains files to view or print the geologic map and accompanying descriptive pamphlet from I-2569.
The NASA modern technology rotors program
NASA Technical Reports Server (NTRS)
Watts, M. E.; Cross, J. L.
1986-01-01
Existing data bases regarding helicopters are based on work conducted on 'old-technology' rotor systems. The Modern Technology Rotors (MTR) Program is intended to provide extensive data bases on rotor systems using present and emerging technology. The MTR is concerned with modern, four-bladed rotor systems presently being manufactured or under development. Aspects of the MTR philosophy are considered along with instrumentation, the MTR test program, the BV 360 Rotor, and the UH-60 Black Hawk. The program phases include computer modelling, shake test, model-scale test, minimally instrumented flight test, extensively pressure-instrumented-blade flight test, and full-scale wind tunnel test.
Cornish, Peter A; Berry, Gillian; Benton, Sherry; Barros-Gomes, Patricia; Johnson, Dawn; Ginsburg, Rebecca; Whelan, Beth; Fawcett, Emily; Romano, Vera
2017-11-01
A new stepped care model developed in North America reimagines the original United Kingdom model for the modern university campus environment. It integrates a range of established and emerging online mental health programs systematically along dimensions of treatment intensity and associated student autonomy. Program intensity can be either stepped up or down depending on level of client need. Because monitoring is configured to give both provider and client feedback on progress, the model empowers clients to participate actively in care options, decisions, and delivery. Not only is stepped care designed to be more efficient than traditional counseling services, early observations suggest it improves outcomes and access, including the elimination of service waitlists. This paper describes the new model in detail and outlines implementation experiences at 3 North American universities. While the experiences implementing the model have been positive, there is a need for development of technology that would facilitate more thorough evaluation. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
The Impact of Modernization Programs on Academic Teachers' Work: A Mexican Case Study
ERIC Educational Resources Information Center
Zavala, Blanca Arciga
2006-01-01
For more than ten years, academics of public universities in Mexico have endured modernization programs that promote individual productivity and operate as a mechanism of selection and assessment. The implementation of the programs has exposed a tension between the values implicit in the programs and the values of the academic teachers. There is a…
Through Kazan ASPERA to Modern Projects
NASA Astrophysics Data System (ADS)
Gusev, Alexander; Kitiashvili, Irina; Petrova, Natasha
The European Union is now forming the Sixth Framework Programme. One of the objectives of the EU Programme is to open up national research and training programmes. Russian PhD students and young astronomers face business and financial difficulties in accessing modern databases and astronomical projects, and so they have not been included in the European overview of priorities. Modern requirements for organizing observing projects on powerful telescopes assume painstaking scientific computer preparation of the application. Rigid competition for observation time requires preliminary computer modeling of the target object if the application is to succeed. Kazan AstroGeoPhysics Partnership
ERIC Educational Resources Information Center
Niehoff, Richard O.; Wilder, Bernard
Nonformal education programs operating in the modern sector in Ethiopia are described in a perspective relevant to the Ethiopian context. The modern sector is defined as those activities concerned with the manufacture of goods, extraction of raw materials, the processing of raw materials, the provision of services, and the creation and maintenance…
Implementation of a computer database testing and analysis program.
Rouse, Deborah P
2007-01-01
The author is the coordinator of a computer software database testing and analysis program implemented in an associate degree nursing program. Computer software database programs help support the test development and analysis process. Critical thinking can be measured and promoted through their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis is approached as a method to promote and evaluate nursing students' critical thinking skills and to prepare them for the National Council Licensure Examination.
Crustose coralline algae increased framework and diversity on ancient coral reefs.
Weiss, Anna; Martindale, Rowan C
2017-01-01
Crustose coralline algae (CCA) are key producers of carbonate sediment on reefs today. Despite their importance in modern reef ecosystems, the long-term relationship of CCA with reef development has not been quantitatively assessed in the fossil record. This study includes data from 128 Cenozoic coral reefs collected from the Paleobiology Database, the Paleoreefs Database, as well as the original literature and assesses the correlation of CCA abundance with taxonomic diversity (both corals and reef dwellers) and framework of fossil coral reefs. Chi-squared tests show reef type is significantly correlated with CCA abundance and post-hoc tests indicate higher involvement of CCA is associated with stronger reef structure. Additionally, general linear models show coral reefs with higher amounts of CCA had a higher diversity of reef-dwelling organisms. These data have important implications for paleoecology as they demonstrate that CCA increased building capacity, structural integrity, and diversity of ancient coral reefs. The analyses presented here demonstrate that the function of CCA on modern coral reefs is similar to their function on Cenozoic reefs; thus, studies of ancient coral reef collapse are even more meaningful as modern analogues.
Astronomy in the Russian Scientific-Educational Project: "KAZAN-GEONA-2010"
NASA Astrophysics Data System (ADS)
Gusev, A.; Kitiashvili, I.
2006-08-01
The European Union promotes the Sixth Framework Programme. One of the goals of the EU Programme is to open up national research and training programs. A special role in the history of Kazan University was played by the great mathematician Nikolai Lobachevsky, the founder of non-Euclidean geometry (1826). Historically, the thousand-year-old city of Kazan and the two-hundred-year-old Kazan University have served as the scientific, organizational, and cultural-educational center of the Volga region. For the continued successful development of educational and scientific-educational activity in the Russian Federation and the Republic of Tatarstan, a national project was proposed for Kazan: the International Center of Sciences and Internet Technologies "GeoNa" (Geometry of Nature - GeoNa - is wisdom, enthusiasm, pride, grandeur). This is a modern complex of conference halls including the Center for Internet Technologies, a 3D planetarium devoted to exploration of the Moon, PhysicsLand, an active museum of natural sciences, an oceanarium, and a training complex "Spheres of Knowledge". Center GeoNa provides a direct and effective channel of cooperation with scientific centers around the world. GeoNa will host conferences, congresses, sessions on fundamental scientific research of the Moon and planets, and scientific-educational events such as presentations of international scientific programs on lunar research and modern lunar databases. A more intensive program of exchange between scientific centers and organizations, for better knowledge and planning of their astronomical curricula and the introduction of the teaching of astronomy, is also proposed. Center GeoNa will enable scientists and teachers of Russian universities with advanced achievements in science and information technologies to join together and establish scientific communications with foreign colleagues on high-technology and educational projects with world scientific centers.
Housing first on a large scale: Fidelity strengths and challenges in the VA's HUD-VASH program.
Kertesz, Stefan G; Austin, Erika L; Holmes, Sally K; DeRussy, Aerin J; Van Deusen Lukas, Carol; Pollio, David E
2017-05-01
Housing First (HF) combines permanent supportive housing and supportive services for homeless individuals and removes traditional treatment-related preconditions for housing entry. There has been little research describing strengths and shortfalls of HF implementation outside of research demonstration projects. The U.S. Department of Veterans Affairs (VA) has transitioned to an HF approach in a supportive housing program serving over 85,000 persons. This offers a naturalistic window to study fidelity when HF is adopted on a large scale. We operationalized HF into 20 criteria grouped into 5 domains. We assessed 8 VA medical centers twice (1 year apart), scoring each criterion using a scale ranging from 1 (low fidelity) to 4 (high fidelity). There were 2 HF domains (no preconditions and rapidly offering permanent housing) for which high fidelity was readily attained. There was uneven progress in prioritizing the most vulnerable clients for housing support. Two HF domains (sufficient supportive services and a modern recovery philosophy) had considerably lower fidelity. Interviews suggested that operational issues such as shortfalls in staffing and training likely hindered performance in these 2 domains. In this ambitious national HF program, the largest to date, we found substantial fidelity in focusing on permanent housing and removal of preconditions to housing entry. Areas of concern included the adequacy of supportive services and adequacy in deployment of a modern recovery philosophy. Under real-world conditions, large-scale implementation of HF is likely to require significant additional investment in client service supports to assure that results are concordant with those found in research studies. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Ben Ayed, Rayda; Ben Hassen, Hanen; Ennouri, Karim; Ben Marzoug, Riadh; Rebai, Ahmed
2016-01-01
Olive (Olea europaea), whose importance is mainly due to its nutritional and health features, is one of the most economically significant oil-producing trees in the Mediterranean region. Unfortunately, the increasing market demand for virgin olive oil often results in its adulteration with less expensive oils, which is a serious problem for the public and for quality-control evaluators of virgin olive oil. Therefore, to avoid fraud, olive cultivar identification and virgin olive oil authentication have become a major issue for producers and consumers across the olive chain. Presently, genetic traceability using SSRs is a cost-effective and powerful marker technique that can be employed to resolve such problems. However, to identify an unknown monovarietal virgin olive oil cultivar, a reference system is necessary. Thus, an Olive Genetic Diversity Database (OGDD) (http://www.bioinfo-cbs.org/ogdd/) is presented in this work. It is a genetic, morphologic and chemical database of worldwide olive trees and oils with a double function. Besides being a reference system for the identification of unknown olive or virgin olive oil cultivars based on their microsatellite allele size(s), it provides users with additional morphological and chemical information for each identified cultivar. OGDD is designed to enable users to easily retrieve and visualize biologically important information (SSR markers, and olive tree and oil characteristics of about 200 cultivars worldwide) using a set of efficient query interfaces and analysis tools. It can be accessed through a web service from any modern programming language using a simple hypertext transfer protocol call. The web site is implemented in Java, JavaScript, PHP, HTML and Apache, with all major browsers supported. Database URL: http://www.bioinfo-cbs.org/ogdd/ PMID:26827236
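Since the abstract notes that OGDD can be queried over plain HTTP from any language, a minimal Python sketch of such a call is shown below. The parameter names (marker, allele_size, format) are hypothetical placeholders, not the documented OGDD API; the real query syntax should be taken from the OGDD site itself.

import requests

# Hypothetical query: the parameters below are illustrative only.
BASE_URL = "http://www.bioinfo-cbs.org/ogdd/"   # database URL from the abstract

params = {
    "marker": "DCA9",        # hypothetical SSR marker name
    "allele_size": 172,      # hypothetical allele size in base pairs
    "format": "json",        # hypothetical output-format switch
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # inspect the first part of the reply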
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuli, J.K.; Sonzogni,A.
The National Nuclear Data Center has provided remote access to some of its resources since 1986. The major databases and other resources available currently through the NNDC Web site are summarized. The National Nuclear Data Center (NNDC) has provided remote access to the nuclear physics databases it maintains and to other resources since 1986. With considerable innovation, access is now mostly through the Web. The NNDC Web pages have been modernized to provide a consistent, state-of-the-art style. The improved database services and other resources available from the NNDC site at www.nndc.bnl.gov will be described.
Belaid, Loubna; Dumont, Alexandre; Chaillet, Nils; De Brouwere, Vincent; Zertal, Amel; Hounton, Sennen; Ridde, Valéry
2015-09-28
Despite a global increase in contraception use, its prevalence remains low in low- and middle-income countries. One strategy to improve uptake and use of contraception, as an essential complement to policies and supply-side interventions, is demand generation. Demand generation interventions have reportedly produced positive effects on uptake and use of family planning services, but the evidence base remains poorly documented. To reduce this knowledge gap, we will conduct a systematic review on the impact of demand generation interventions on the use of modern contraception. The objectives of the review will be as follows: (1) to synthesize evidence on the impacts and costs of family planning demand generation interventions and on their effectiveness in improving modern contraceptive use and (2) to identify the indicators used to assess effectiveness, cost-effectiveness, and impacts of demand generation interventions. We will systematically review the public health and health promotion literature in several databases (e.g., CINAHL, Medline, EMBASE) as well as gray literature. We will select articles from 1970 to 2015, in French and in English. The review will include studies that assess the impact of family planning programs or interventions on changes in contraception use. The studied interventions will be those with a demand generation component, even if a supply component is implemented. Two members of the team will independently search, screen, extract data, and assess the quality of the studies selected. Different tools will be used to assess the quality of the studies depending on the study design. If appropriate, a meta-analysis will be conducted; the analysis will involve comparing odds ratios (ORs). The systematic review results will be disseminated to United Nations Population Fund program countries and will contribute to the development of a guidance document and programmatic tools for planning, implementing, and evaluating demand generation interventions in family planning. Improving the effectiveness of family planning programs is critical for empowering women and adolescent girls, improving human capital, reducing dependency ratios, reducing maternal and child mortality, and achieving demographic dividends in low- and middle-income countries. This protocol is registered in PROSPERO (CRD 42015017549).
Teacher's Guide to SERAPHIM Software III. Modern Chemistry.
ERIC Educational Resources Information Center
Bogner, Donna J.
Designed to assist chemistry teachers in selecting appropriate software programs, this publication is the third in a series of six teacher's guides from Project SERAPHIM, a program sponsored by the National Science Foundation. This guide is keyed to the chapters of the text "Modern Chemistry." Program suggestions are arranged in the same…
Teacher's Guide to SERAPHIM Software IV Chemistry: A Modern Course.
ERIC Educational Resources Information Center
Bogner, Donna J.
Designed to assist chemistry teachers in selecting appropriate software programs, this publication is the fourth in a series of six teacher's guides from Project SERAPHIM, a program sponsored by the National Science Foundation. This guide is keyed to the chapters of the text "Chemistry: A Modern Course." Program suggestions are arranged…
Teaching Strategies and Methods in Modern Environments for Learning of Programming
ERIC Educational Resources Information Center
Djenic, Slobodanka; Mitic, Jelena
2017-01-01
This paper presents teaching strategies and methods, applicable in modern blended environments for learning of programming. Given the fact that the manner of applying teaching strategies always depends on the specific requirements of a certain area of learning, the paper outlines the basic principles of teaching in programming courses, as well as…
NASA's Space Environments and Effects (SEE) Program: The Pursuit of Tomorrow's Space Technology
NASA Technical Reports Server (NTRS)
Pearson, Steven D.; Hardage, Donna M.
1998-01-01
A hazard to all spacecraft orbiting the Earth and exploring the unknown in deep space is the existence of a harsh and ever-changing environment and its subsequent effects. Some of these environmental hazards, such as plasma, extreme thermal excursions, meteoroids, and ionizing radiation, result from natural sources, whereas others, such as orbital debris and neutral contamination, are induced by the presence of the spacecraft themselves. The subsequent effects can damage or even disable a spacecraft, its materials, and its instruments. In partnership with industry, academia, and other government agencies, the National Aeronautics & Space Administration's (NASA's) Space Environments & Effects (SEE) Program defines the space environments and advocates technology development to accommodate or mitigate the effects of these harmful environments on spacecraft. The program provides a comprehensive and focused approach to understanding the space environment, defining the best techniques for both flight and ground-based experimentation, updating the models that predict both the environments and the environmental effects on spacecraft, and ensuring that this information is properly maintained and inserted into spacecraft design programs. This paper will provide an overview of the Program's purpose, goals, database management, and technical activities. In particular, the SEE Program has been very active in developing improved ionizing radiation models and related flight experiments, which should aid in determining the effect of the radiation environment on modern electronics.
Fan, Long; Hui, Jerome H L; Yu, Zu Guo; Chu, Ka Hou
2014-07-01
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. However, current, more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. Therefore, it is imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/. © 2014 John Wiley & Sons Ltd.
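The two-stage idea described above (alignment-free screening followed by an alignment-based K2P nearest-neighbour step) can be sketched in a few lines of Python. This is a toy illustration under simplifying assumptions, not the VIP Barcoding implementation: the reference sequences are invented, equal-length, and treated as already aligned, and the screening stage uses simple k-mer frequency vectors with cosine similarity.

import math
from collections import Counter

def kmer_vector(seq, k=3):
    # Alignment-free composition vector: normalised k-mer frequencies.
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def cosine(u, v):
    keys = set(u) | set(v)
    dot = sum(u.get(x, 0.0) * v.get(x, 0.0) for x in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

PURINES = {"A", "G"}

def k2p_distance(a, b):
    # Kimura 2-parameter distance for two aligned, equal-length sequences.
    pairs = [(x, y) for x, y in zip(a, b) if x in "ACGT" and y in "ACGT"]
    n = len(pairs)
    ts = sum(1 for x, y in pairs if x != y and (x in PURINES) == (y in PURINES))
    tv = sum(1 for x, y in pairs if x != y and (x in PURINES) != (y in PURINES))
    P, Q = ts / n, tv / n
    return -0.5 * math.log((1 - 2 * P - Q) * math.sqrt(1 - 2 * Q))

def identify(query, reference, keep=2):
    # Stage 1: CV screening; stage 2: K2P nearest neighbour on the survivors.
    qv = kmer_vector(query)
    screened = sorted(reference.items(),
                      key=lambda item: cosine(qv, kmer_vector(item[1])),
                      reverse=True)[:keep]
    return min(screened, key=lambda item: k2p_distance(query, item[1]))[0]

reference_db = {                      # toy "barcode" records of equal length
    "species_A": "ACGTACGTACGTACGTACGT",
    "species_B": "ACGTACGAACGTTCGTACGA",
    "species_C": "TTGTACGTACGAACGTACCT",
}
print(identify("ACGTACGTACGAACGTACGT", reference_db))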
NASA Astrophysics Data System (ADS)
Minnett, R.; Koppers, A. A. P.; Jarboe, N.; Jonestrask, L.; Tauxe, L.; Constable, C.
2016-12-01
The Magnetics Information Consortium (https://earthref.org/MagIC/) develops and maintains a database and web application for supporting the paleo-, geo-, and rock magnetic scientific community. Historically, this objective has been met with an Oracle database and a Perl web application at the San Diego Supercomputer Center (SDSC). The Oracle Enterprise Cluster at SDSC, however, was decommissioned in July of 2016 and the cost for MagIC to continue using Oracle became prohibitive. This provided MagIC with a unique opportunity to reexamine the entire technology stack and data model. MagIC has developed an open-source web application using the Meteor (http://meteor.com) framework and a MongoDB database. The simplicity of the open-source full-stack framework that Meteor provides has improved MagIC's development pace and the increased flexibility of the data schema in MongoDB encouraged the reorganization of the MagIC Data Model. As a result of incorporating actively developed open-source projects into the technology stack, MagIC has benefited from their vibrant software development communities. This has translated into a more modern web application that has significantly improved the user experience for the paleo-, geo-, and rock magnetic scientific community.
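The schema flexibility the abstract attributes to the MongoDB migration can be illustrated with a short pymongo sketch. This is hypothetical: it assumes a local MongoDB server, and the database, collection, and field names below are invented for illustration rather than taken from the MagIC Data Model.

from pymongo import MongoClient

# Illustrative only: requires a running local MongoDB instance.
client = MongoClient("localhost", 27017)
coll = client["magic_demo"]["measurements"]

# Schema-less documents: records may carry different fields, which is the
# kind of flexibility a document store gives compared with a fixed table.
coll.insert_many([
    {"specimen": "sp01", "method": "thermal", "treatment_c": 300, "moment_am2": 2.1e-9},
    {"specimen": "sp02", "method": "AF", "peak_field_mt": 40},
])

# Query by a field that only some documents contain.
for doc in coll.find({"method": "thermal"}):
    print(doc["specimen"], doc["treatment_c"])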
NASA Astrophysics Data System (ADS)
Thakore, Arun K.; Sauer, Frank
1994-05-01
The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level, object-oriented semantic association method to model information found in different databases into a conceptual global model that integrates the databases. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without compromising the autonomy of the underlying databases.
Intra-reach headwater fish assemblage structure
McKenna, James E.
2017-01-01
Large-scale conservation efforts can take advantage of modern large databases and regional modeling and assessment methods. However, these broad-scale efforts often assume uniform average habitat conditions and/or species assemblages within stream reaches.
Toward an integrated knowledge environment to support modern oncology.
Blake, Patrick M; Decker, David A; Glennon, Timothy M; Liang, Yong Michael; Losko, Sascha; Navin, Nicholas; Suh, K Stephen
2011-01-01
Around the world, teams of researchers continue to develop a wide range of systems to capture, store, and analyze data including treatment, patient outcomes, tumor registries, next-generation sequencing, single-nucleotide polymorphism, copy number, gene expression, drug chemistry, drug safety, and toxicity. Scientists mine, curate, and manually annotate growing mountains of data to produce high-quality databases, while clinical information is aggregated in distant systems. Databases are currently scattered, and relationships between variables coded in disparate datasets are frequently invisible. The challenge is to evolve oncology informatics from a "systems" orientation of standalone platforms and silos into an "integrated knowledge environment" that will connect "knowable" research data with patient clinical information. The aim of this article is to review progress toward an integrated knowledge environment to support modern oncology, with a focus on supporting scientific discovery and improving cancer care.
Tsunami.gov: NOAA's Tsunami Information Portal
NASA Astrophysics Data System (ADS)
Shiro, B.; Carrick, J.; Hellman, S. B.; Bernard, M.; Dildine, W. P.
2014-12-01
We present the new Tsunami.gov website, which delivers a single authoritative source of tsunami information for the public and emergency management communities. The site efficiently merges information from NOAA's Tsunami Warning Centers (TWC's) by way of a comprehensive XML feed called Tsunami Event XML (TEX). The resulting unified view allows users to quickly see the latest tsunami alert status in geographic context without having to understand complex TWC areas of responsibility. The new site provides for the creation of a wide range of products beyond the traditional ASCII-based tsunami messages. The publication of modern formats such as Common Alerting Protocol (CAP) can drive geographically aware emergency alert systems like FEMA's Integrated Public Alert and Warning System (IPAWS). Supported are other popular information delivery systems, including email, text messaging, and social media updates. The Tsunami.gov portal allows NOAA staff to easily edit content and provides the facility for users to customize their viewing experience. In addition to access by the public, emergency managers and government officials may be offered the capability to log into the portal for special access rights to decision-making and administrative resources relevant to their respective tsunami warning systems. The site follows modern HTML5 responsive design practices for optimized use on mobile as well as non-mobile platforms. It meets all federal security and accessibility standards. Moving forward, we hope to expand Tsunami.gov to encompass tsunami-related content currently offered on separate websites, including the NOAA Tsunami Website, National Tsunami Hazard Mitigation Program, NOAA Center for Tsunami Research, National Geophysical Data Center's Tsunami Database, and National Data Buoy Center's DART Program. This project is part of the larger Tsunami Information Technology Modernization Project, which is consolidating the software architectures of NOAA's existing TWC's into a single system. We welcome your feedback to help Tsunami.gov become an effective public resource for tsunami information and a medium to enable better global tsunami warning coordination.
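The abstract above mentions publishing alerts in modern formats such as the Common Alerting Protocol (CAP) so that geographically aware systems can consume them. The Python sketch below parses a tiny, invented CAP 1.2 fragment to show the kind of fields such a consumer would read; the sample message and its values are hypothetical, not an actual Tsunami Warning Center product.

import xml.etree.ElementTree as ET

# Toy CAP 1.2 alert (fields abridged and invented) illustrating how a
# downstream consumer might read a tsunami message.
CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"
sample = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>demo-0001</identifier>
  <info>
    <event>Tsunami Warning</event>
    <severity>Extreme</severity>
    <area><areaDesc>Coastal areas of Example Bay</areaDesc></area>
  </info>
</alert>"""

root = ET.fromstring(sample)
info = root.find(f"{CAP_NS}info")
print(info.findtext(f"{CAP_NS}event"),
      "-", info.findtext(f"{CAP_NS}severity"),
      "-", info.find(f"{CAP_NS}area").findtext(f"{CAP_NS}areaDesc"))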
A LTA flight research vehicle. [technology assessment, airships
NASA Technical Reports Server (NTRS)
Nebiker, F. R.
1975-01-01
An Airship Flight Research Program is proposed. Major program objectives are summarized, and a modernized Navy ZPG-3W airship is recommended as the flight test vehicle. The origin of the current interest in modern airship vehicles is briefly discussed, and the major benefits resulting from the flight research program are described. Airship configurations and specifications are included.
Increasing family planning in Myanmar: the role of the private sector and social franchise programs.
Aung, Tin; Hom, Nang Mo; Sudhinaraset, May
2017-07-01
This study examines the influence of a clinical social franchise program on modern contraceptive use. This was a cross-sectional survey of contraceptive use among 2390 currently married women across 25 townships in Myanmar in 2014. Social franchise program measures were drawn from programmatic records. Multivariable models show that women who lived in communities with at least 1-5 years of a clinical social franchise intrauterine device (IUD) program had 4.770 times the odds of using a modern contraceptive method compared to women living in communities with no IUD program [CI: 3.739-6.084]. Women in townships where the reproductive health program had existed for at least 10 years had 1.428 times the odds of reporting modern method use compared to women living in townships where the programs had existed for less than 10 years [CI: 1.016-2.008]. This study found consistent and robust evidence for an increase in family planning methods with program duration as well as with the intensity of social franchise programs.
Modern Art as Public Care: Alzheimer's and the Aesthetics of Universal Personhood.
Selberg, Scott
2015-12-01
This article is based on ethnographic research of the New York Museum of Modern Art's influential Alzheimer's access program, Meet Me at MoMA. The program belongs to an increasingly popular model of psychosocial treatment that promotes art as potentially therapeutic or beneficial to people experiencing symptoms of dementia as well as to their caregivers. Participant observation of the sessions and a series of interviews with museum staff and educators reveal broader assumptions about the relationship between modern art, dementia, and personhood. These assumptions indicate a museological investment in the capacity and perceived interiority of all participants. Ultimately, the program authorizes a narrative of universal personhood that harmonizes with the museum's longstanding focus on temporal and aesthetic modernism. © 2015 by the American Anthropological Association.
1986-05-01
Information System (DGIS) is being developed to provide the DoD community with a modern tool to access diverse databases and extract information products...this community with a modern tool for accessing these databases and extracting information products from them. Since the Defense Technical Information...adjunct to DROLS results. The study, therefore, centered around obtaining background information inside the unit on that unit's users who request DROLS
Examples of Use of SINBAD Database for Nuclear Data and Code Validation
NASA Astrophysics Data System (ADS)
Kodeli, Ivan; Žerovnik, Gašper; Milocco, Alberto
2017-09-01
The SINBAD database currently contains compilations and evaluations of over 100 shielding benchmark experiments. The SINBAD database is widely used for code and data validation. Materials covered include: air, N, O, H2O, Al, Be, Cu, graphite, concrete, Fe, stainless steel, Pb, Li, Ni, Nb, SiC, Na, W, V and mixtures thereof. Over 40 organisations from 14 countries and 2 international organisations have contributed data and work in support of SINBAD. Examples of the use of the database in the scope of different international projects, such as the Working Party on Evaluation Cooperation of the OECD and the European Fusion Programme, demonstrate the merit and possible usage of the database for the validation of modern nuclear data evaluations and new computer codes.
The ESID Online Database network.
Guzman, D; Veit, D; Knerr, V; Kindle, G; Gathmann, B; Eades-Perner, A M; Grimbacher, B
2007-03-01
Primary immunodeficiencies (PIDs) belong to the group of rare diseases. The European Society for Immunodeficiencies (ESID) is establishing an innovative European patient and research database network for continuous long-term documentation of patients, in order to improve the diagnosis, classification, prognosis and therapy of PIDs. The ESID Online Database is a web-based system aimed at data storage, data entry, reporting and the import of pre-existing data sources in an enterprise business-to-business (B2B) integration. The online database is based on the Java 2 Enterprise Edition (J2EE) platform with high-standard security features, which comply with data protection laws and the demands of a modern research platform. The ESID Online Database is accessible via the official website (http://www.esid.org/). Supplementary data are available at Bioinformatics online.
75 FR 33651 - Meetings of Humanities Panel
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-14
... Comparative Literature and Literary Theory in Fellowships, submitted to the Division of Research Programs at... meeting will review applications for Early Modern European History in Fellowships, submitted to the.... Room: 415. Program: This meeting will review applications for Modern European History I in Fellowships...
The Ongoing Impact of the U.S. Fast Reactor Integral Experiments Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
John D. Bess; Michael A. Pope; Harold F. McFarlane
2012-11-01
The creation of a large database of integral fast reactor physics experiments advanced nuclear science and technology in ways that were unachievable by less capital-intensive and operationally challenging approaches. The experiments enabled the compilation of integral physics benchmark data, validated (or not) analytical methods, and provided assurance for future reactor designs. The integral experiments performed at Argonne National Laboratory (ANL) represent decades of research performed to support fast reactor design and our understanding of neutronics behavior and reactor physics measurements. Experiments began in 1955 with the Zero Power Reactor No. 3 (ZPR-3) and terminated with the Zero Power Physics Reactor (ZPPR, originally the Zero Power Plutonium Reactor) in 1990 at the former ANL-West site in Idaho, which is now part of the Idaho National Laboratory (INL). Two additional critical assemblies, ZPR-6 and ZPR-9, operated at the ANL-East site in Illinois. A total of 128 fast reactor assemblies were constructed with these facilities [1]. The infrastructure and measurement capabilities are too expensive to be replicated in the modern era, making the integral database invaluable as the world pushes ahead with development of liquid metal cooled reactors.
GEMINI: Integrative Exploration of Genetic Variation and Genome Annotations
Paila, Umadevi; Chapman, Brad A.; Kirchner, Rory; Quinlan, Aaron R.
2013-01-01
Modern DNA sequencing technologies enable geneticists to rapidly identify genetic variation among many human genomes. However, isolating the minority of variants underlying disease remains an important, yet formidable challenge for medical genetics. We have developed GEMINI (GEnome MINIng), a flexible software package for exploring all forms of human genetic variation. Unlike existing tools, GEMINI integrates genetic variation with a diverse and adaptable set of genome annotations (e.g., dbSNP, ENCODE, UCSC, ClinVar, KEGG) into a unified database to facilitate interpretation and data exploration. Whereas other methods provide an inflexible set of variant filters or prioritization methods, GEMINI allows researchers to compose complex queries based on sample genotypes, inheritance patterns, and both pre-installed and custom genome annotations. GEMINI also provides methods for ad hoc queries and data exploration, a simple programming interface for custom analyses that leverage the underlying database, and both command line and graphical tools for common analyses. We demonstrate GEMINI's utility for exploring variation in personal genomes and family based genetic studies, and illustrate its ability to scale to studies involving thousands of human samples. GEMINI is designed for reproducibility and flexibility and our goal is to provide researchers with a standard framework for medical genomics. PMID:23874191
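The abstract above describes composing queries over sample genotypes and annotations against the unified GEMINI database. The Python wrapper below is a hedged sketch of such an invocation; the database file name, the sample name in the genotype filter, and the exact flag spellings are assumptions for illustration and may differ between GEMINI versions, so the GEMINI documentation should be treated as authoritative.

import subprocess

# Hedged sketch: queries a previously loaded GEMINI database ("study.db"
# is a placeholder). Flag spellings may differ across GEMINI versions.
query = (
    "SELECT chrom, start, end, ref, alt, gene, impact "
    "FROM variants WHERE impact_severity = 'HIGH'"
)
cmd = [
    "gemini", "query",
    "-q", query,
    # Restrict to variants where the affected child is heterozygous
    # (sample name 'child_1' is hypothetical).
    "--gt-filter", "gt_types.child_1 == HET",
    "study.db",
]
print(subprocess.run(cmd, capture_output=True, text=True).stdout)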
Data Services Upgrade: Perfecting the ISIS-I Topside Digital Ionogram Database
NASA Technical Reports Server (NTRS)
Wang, Yongli; Benson, Robert F.; Bilitza, Dieter; Fung, Shing. F.; Chu, Philip; Huang, Xueqin; Truhlik, Vladimir
2015-01-01
The ionospheric topside sounders of the International Satellites for Ionospheric Studies (ISIS) program were designed as analog systems. More than 16,000 of the original telemetry tapes from three satellites were used to produce topside digital ionograms, via an analog-to-digital (A/D) conversion process, suitable for modern analysis techniques. Unfortunately, many of the resulting digital topside ionogram files could not be auto-processed to produce topside Ne(h) profiles because of problems encountered during the A/D process. Software has been written to resolve these problems, and here we report on (1) the first application of this software to a significant portion of the ISIS-1 digital topside-ionogram database, (2) software improvements motivated by this activity, (3) Ne(h) profiles automatically produced from these corrected ISIS-1 digital ionogram files, and (4) the availability via the Virtual Wave Observatory (VWO) of the corrected ISIS-1 digital topside ionogram files for research. We will also demonstrate the use of these Ne(h) profiles for making refinements in the International Reference Ionosphere (IRI) and in the determination of transition heights from oxygen ions to hydrogen ions.
ERIC Educational Resources Information Center
Brown, Gerald R.
This document reports on a study tour of Canadian schools conducted by the Sri Lanka Ministry of Education. The purposes of the tour were to: develop an awareness of the scope of modern school library programming; investigate the aspects of implementation of a modern school library program including staffing, facilities, educational programming,…
The China Factor in Japanese Military Modernization for the 21st Century
1997-06-01
Japan’s ongoing modernization of its forces, which is directed under its National Defense Program Outline and Midterm Defense Program, does not, however, seem to be in reaction to any overt perception of a...It did so with the economic motive of acquiring a captive market for Japanese consumer goods, the strategic consideration of preempting Russian
Compilation of Abstracts of Theses Submitted by Candidates for Degrees
1987-09-30
Parallel, Multiple Backend Database Systems Feudo, C.V. Modern Hardware Technologies 88 MAJ, USA and Software Techniques for Online Database Storage...and its Application in the Wargaming, Research and Analysis (W.A.R.) Lab Waltensberger, G.M. On Limited War, Escalation 524 CPT, USAF Control, and...TECHNIQUES FOR ONLINE DATABASE STORAGE AND ACCESS Christopher V. Feudo, Major, United States Army, B.S., United States Military Academy, 1972
2008-12-01
projects have either resorted to partitioned smaller databases, or to a hybrid scheme where metadata are stored in the database, along with pointers to...comes from the briefing of Dr. Mark Duchaineau from LLNL. If we assume that a pixel from a modern airborne sensor covers a square meter, then one can... airborne platform. After surveillance is complete, the data (in fact the disks themselves) are sent to a ground station for processing. Despite the
Power-plant modernization program in Latvia. Desk Study Report No. 1. Export trade information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-08-01
The Government of Latvia has requested the U.S. Trade and Development Program's (TDP's) assistance in financing the cost of a feasibility study to develop a modernization program for its thermal power stations aimed at improving their performance and efficiency. The consultant will work with engineers and managers of Latvenergo, Latvia's power utility, to review the performance of the country's two thermal power stations and carry out a detailed study for the rehabilitation and modernization of the TEC-2 thermal power station in Riga. The overall goal of the program will be to maximize the output capacity of the country's two power stations through the implementation of economically efficient rehabilitation projects.
Big data analytics for early detection of breast cancer based on machine learning
NASA Astrophysics Data System (ADS)
Ivanova, Desislava
2017-12-01
This paper presents the concept of and modern advances in personalized medicine that rely on technology, and reviews existing tools for early detection of breast cancer. Breast cancer types and their distribution worldwide are discussed. Time is spent explaining the importance of identifying normality and specifying the main classes of breast cancer, benign or malignant. The main purpose of the paper is to propose a conceptual model for early detection of breast cancer based on machine learning for processing and analysis of medical big data and further knowledge discovery for personalized treatment. The proposed conceptual model is realized using a Naive Bayes classifier. The software is written in the Python programming language, and the Wisconsin breast cancer database is used for the experiments. Finally, the experimental results are presented and discussed.
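A minimal sketch of the kind of experiment described above (a Naive Bayes classifier trained on the Wisconsin breast cancer data) can be reproduced with scikit-learn. The train/test split, the Gaussian variant of Naive Bayes, and the evaluation metrics below are assumptions chosen for illustration, since the abstract does not specify them.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, confusion_matrix

# Wisconsin breast cancer data: the two classes are malignant and benign.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

model = GaussianNB().fit(X_train, y_train)   # Naive Bayes classifier
pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, pred))
print(confusion_matrix(y_test, pred))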
Speizer, Ilene S; Corroon, Meghan; Calhoun, Lisa; Lance, Peter; Montana, Livia; Nanda, Priya; Guilkey, David
2014-01-01
Family planning is crucial for preventing unintended pregnancies and for improving maternal and child health and well-being. In urban areas where there are large inequities in family planning use, particularly among the urban poor, programs are needed to increase access to and use of contraception among those most in need. This paper presents the midterm evaluation findings of the Urban Reproductive Health Initiative (Urban RH Initiative) programs, funded by the Bill & Melinda Gates Foundation, that are being implemented in 4 countries: India (Uttar Pradesh), Kenya, Nigeria, and Senegal. Between 2010 and 2013, the Measurement, Learning & Evaluation (MLE) project collected baseline and 2-year longitudinal follow-up data from women in target study cities to examine the role of demand generation activities undertaken as part of the Urban RH Initiative programs. Evaluation results demonstrate that, in each country where it was measured, outreach by community health or family planning workers as well as local radio programs were significantly associated with increased use of modern contraceptive methods. In addition, in India and Nigeria, television programs had a significant effect on modern contraceptive use, and in Kenya and Nigeria, the program slogans and materials that were blanketed across the cities (e.g., leaflets/brochures distributed at health clinics and the program logo placed on all forms of materials, from market umbrellas to health facility signs and television programs) were also significantly associated with modern method use. Our results show that targeted, multilevel demand generation activities can make an important contribution to increasing modern contraceptive use in urban areas and could impact Millennium Development Goals for improved maternal and child health and access to reproductive health for all. PMID:25611476
Ecological selectivity of the emerging mass extinction in the oceans.
Payne, Jonathan L; Bush, Andrew M; Heim, Noel A; Knope, Matthew L; McCauley, Douglas J
2016-09-16
To better predict the ecological and evolutionary effects of the emerging biodiversity crisis in the modern oceans, we compared the association between extinction threat and ecological traits in modern marine animals to associations observed during past extinction events using a database of 2497 marine vertebrate and mollusc genera. We find that extinction threat in the modern oceans is strongly associated with large body size, whereas past extinction events were either nonselective or preferentially removed smaller-bodied taxa. Pelagic animals were victimized more than benthic animals during previous mass extinctions but are not preferentially threatened in the modern ocean. The differential importance of large-bodied animals to ecosystem function portends greater future ecological disruption than that caused by similar levels of taxonomic loss in past mass extinction events. Copyright © 2016, American Association for the Advancement of Science.
NASA Technical Reports Server (NTRS)
1984-01-01
Management of the data within a planetary data system (PDS) is addressed. Principles of modern data management are described and several large NASA scientific data base systems are examined. Data management in PDS is outlined and the major data management issues are introduced.
Air Force Intercontinental Ballistic Missile Fuze Modernization (ICBM Fuze Mod)
2015-12-01
Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-498. Air Force Intercontinental Ballistic Missile Fuze Modernization (ICBM Fuze Mod), as of FY 2015 SAR, December 2015. Table of Contents: Common Acronyms and Abbreviations for MDAP Programs; Program Information...Acquisition Unit Cost. PB - President's Budget; PE - Program Element; PEO - Program
Adding EUNIS and VAULT rocket data to the VSO with Modern Perl frameworks
NASA Astrophysics Data System (ADS)
Mansky, Edmund
2017-08-01
A new Perl code is described that uses the modern object-oriented Moose framework to add EUNIS and VAULT rocket data to the Virtual Solar Observatory website. The code permits easy fixing of FITS header fields in cases where required FITS fields are missing from the original data files. The code makes novel use of the Moose extensions "before" and "after" to build in dependencies, so that the creation of database tables occurs before the loading of data and the validation of file-dependent tables occurs after the loading is completed. Also described is the computation and loading of the deferred FITS field CHECKSUM into the database following the loading and validation of the file-dependent tables. The loading of the EUNIS 2006 and 2007 flight data and the VAULT 2.0 flight data is described in detail as illustrative examples.
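The dependency pattern described above (create tables before loading, validate tables after loading) corresponds to Moose's "before"/"after" method modifiers in Perl. The sketch below is a Python analogy of that pattern, written only to illustrate the ordering; the class, method, and file names are hypothetical and it is not the VSO code.

import functools

def before(hook):
    # Run `hook` before the decorated method (analogue of Moose's 'before').
    def wrap(method):
        @functools.wraps(method)
        def inner(self, *args, **kwargs):
            hook(self)
            return method(self, *args, **kwargs)
        return inner
    return wrap

def after(hook):
    # Run `hook` after the decorated method (analogue of Moose's 'after').
    def wrap(method):
        @functools.wraps(method)
        def inner(self, *args, **kwargs):
            result = method(self, *args, **kwargs)
            hook(self)
            return result
        return inner
    return wrap

class Loader:
    def create_tables(self):
        print("creating database tables")

    def validate_tables(self):
        print("validating file-dependent tables")

    @before(create_tables)      # tables must exist before any data is loaded
    @after(validate_tables)     # validation only runs once loading finishes
    def load(self, files):
        print(f"loading {len(files)} FITS files")

Loader().load(["eunis_2006.fits", "vault_2.fits"])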
The Role of Modern Control Theory in the Design of Controls for Aircraft Turbine Engines
NASA Technical Reports Server (NTRS)
Zeller, J.; Lehtinen, B.; Merrill, W.
1982-01-01
Accomplishments in applying Modern Control Theory to the design of controls for advanced aircraft turbine engines were reviewed. The results of successful research programs are discussed. Ongoing programs as well as planned or recommended future thrusts are also discussed.
Teaching Modern Dance: A Conceptual Approach
ERIC Educational Resources Information Center
Enghauser, Rebecca Gose
2008-01-01
A conceptual approach to teaching modern dance can broaden the awareness and deepen the understanding of modern dance in the educational arena in general, and in dance education specifically. This article describes a unique program that dance teachers can use to introduce modern dance to novice dancers, as well as more experienced dancers,…
Identification of key ancestors of modern germplasm in a breeding program of maize.
Technow, F; Schrag, T A; Schipprack, W; Melchinger, A E
2014-12-01
Probabilities of gene origin computed from the genomic kinship matrix can accurately identify key ancestors of modern germplasm. Identifying the key ancestors of modern plant breeding populations can provide valuable insights into the history of a breeding program and provide reference genomes for next-generation whole genome sequencing. In an animal breeding context, a method was developed that employs probabilities of gene origin, computed from the pedigree-based additive kinship matrix, for identifying key ancestors. Because reliable and complete pedigree information is often not available in plant breeding, we replaced the additive kinship matrix with the genomic kinship matrix. As a proof of concept, we applied this approach to simulated data sets with known ancestries. The relative contribution of the ancestral lines to later generations could be determined with high accuracy, with and without selection. Our method was subsequently used for identifying the key ancestors of the modern Dent germplasm of the public maize breeding program of the University of Hohenheim. We found that the modern germplasm can be traced back to six or seven key ancestors, with one or two of them having a disproportionately large contribution. These results largely corroborated conjectures based on early records of the breeding program. We conclude that probabilities of gene origin computed from the genomic kinship matrix can be used for identifying key ancestors in breeding programs and for estimating the proportion of genes contributed by them.
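As a rough illustration of the ingredients involved, the Python sketch below builds a VanRaden-type genomic kinship matrix from a toy 0/1/2 marker matrix and then uses average relatedness as a crude proxy for ancestral contribution. The toy data, the choice of the VanRaden formulation, and the contribution proxy are all assumptions made for illustration; the paper's actual method derives probabilities of gene origin from the kinship matrix rather than this simple average.

import numpy as np

# Toy marker data: 8 lines (rows) x 500 SNPs (columns), coded 0/1/2.
rng = np.random.default_rng(1)
M = rng.integers(0, 3, size=(8, 500))

p = M.mean(axis=0) / 2.0                        # allele frequencies per marker
Z = M - 2.0 * p                                 # centre by twice the frequency
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))     # genomic kinship (VanRaden, 2008)

# Crude proxy for "contribution": average relationship of each candidate
# ancestor (rows 0-2) to the later lines (rows 3-7), normalised to sum to 1.
contrib = np.clip(G[:3, 3:].mean(axis=1), 0, None)
print(contrib / contrib.sum())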
A Tony Thomas-Inspired Guide to INSPIRE
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Connell, Heath B.; /Fermilab
2010-04-01
The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.
A Statewide Information Databases Program: What Difference Does It Make to Academic Libraries?
ERIC Educational Resources Information Center
Lester, June; Wallace, Danny P.
2004-01-01
The Oklahoma Department of Libraries (ODL) launched Oklahoma's statewide database program in 1997. For the state's academic libraries, the program extended access to information, increased database use, and fostered positive relationships among ODL, academic libraries, and Oklahoma State Regents for Higher Education (OSRHE), creating a more…
78 FR 49126 - Modernizing the FCC Form 477 Data Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-13
... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Parts 0, 1, and 43 [WC Docket No. 11-10; FCC 13-87] Modernizing the FCC Form 477 Data Program AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY: The Report and Order revises the Federal Communications Commission's Form 477 collection to...
Personal and Family Survival. Civil Defense Adult Education; Teacher's Manual.
ERIC Educational Resources Information Center
Office of Civil Defense (DOD), Washington, DC.
A manual intended as an instructor's aid in presenting a Civil Defense Adult Education Course is presented. It contains 10 lesson plans: Course Introduction, Modern Weapons and Radioactive Fallout (Effects), Modern Weapons and Radioactive Fallout (Protection), National Civil Defense Program, National Shelter Program (Community Shelters), National…
76 FR 10827 - Modernizing the FCC Form 477 Data Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-28
... Congress, the Commission, other policy makers, and consumers.'' The Commission designed the program as a... Initiative, designed to modernize and streamline how the Commission collects, uses, and disseminates data. As... burden for filers, for example, in the design of the collection or in tools the Commission can provide...
Classification of Ancient Mammal Individuals Using Dental Pulp MALDI-TOF MS Peptide Profiling
Tran, Thi-Nguyen-Ny; Aboudharam, Gérard; Gardeisen, Armelle; Davoust, Bernard; Bocquet-Appel, Jean-Pierre; Flaudrops, Christophe; Belghazi, Maya; Raoult, Didier; Drancourt, Michel
2011-01-01
Background: The classification of ancient animal corpses at the species level remains a challenging task for forensic scientists and anthropologists. Severe damage and mixed, tiny pieces originating from several skeletons may render morphological classification virtually impossible. Standard approaches are based on sequencing mitochondrial and nuclear targets. Methodology/Principal Findings: We present a method that can accurately classify mammalian species using dental pulp and mass spectrometry peptide profiling. Our work was organized into three successive steps. First, after extracting proteins from the dental pulp collected from 37 modern individuals representing 13 mammalian species, trypsin-digested peptides were used for matrix-assisted laser desorption/ionization time-of-flight mass spectrometry analysis. The resulting peptide profiles accurately classified every individual at the species level, in agreement with the parallel cytochrome b gene sequencing gold standard. Second, using a database of 279 modern spectra, we blindly classified 33 of 37 teeth collected from 37 modern individuals (89.1%). Third, we classified 10 of 18 teeth (56%) collected from 15 ancient individuals representing five mammal species, including humans, from five burial sites dating back 8,500 years. Further comparison with an upgraded database comprising ancient specimen profiles yielded 100% classification in ancient teeth. Peptide sequencing yielded 4 and 16 different non-keratin proteins, including collagen (alpha-1 type I and alpha-2 type I), in ancient and modern human dental pulp, respectively. Conclusions/Significance: Mass spectrometry peptide profiling of the dental pulp is a new approach that can be added to the arsenal of species classification tools for forensics and anthropology as a complementary method to DNA sequencing. The dental pulp is a new source of collagen and other proteins for the species classification of modern and ancient mammal individuals. PMID:21364886
Reliability database development for use with an object-oriented fault tree evaluation program
NASA Technical Reports Server (NTRS)
Heger, A. Sharif; Harringtton, Robert J.; Koen, Billy V.; Patterson-Hine, F. Ann
1989-01-01
A description is given of the development of a fault-tree analysis method using object-oriented programming. In addition, the authors discuss the programs that have been developed or are under development to connect a fault-tree analysis routine to a reliability database. To assess the performance of the routines, a relational database simulating one of the nuclear power industry databases has been constructed. For a realistic assessment of the results of this project, the use of one of the existing nuclear power reliability databases is planned.
Contraception supply chain challenges: a review of evidence from low- and middle-income countries.
Mukasa, Bakali; Ali, Moazzam; Farron, Madeline; Van de Weerdt, Renee
2017-10-01
To identify and assess factors determining the functioning of supply chain systems for modern contraception in low- and middle-income countries (LMICs), and to identify challenges contributing to contraception stockouts that may lead to unmet need. Scientific databases and grey literature were searched, including the Database of Abstracts of Reviews of Effectiveness (DARE), PubMed, MEDLINE, POPLINE, CINAHL, Academic Search Complete, Science Direct, Web of Science, Cochrane Central, Google Scholar, WHO databases and websites of key international organisations. Studies indicated that supply chain system inefficiencies significantly affect the availability of modern family planning and contraception commodities in LMICs, especially in rural public facilities where distribution barriers may be acute. Supply chain failures or bottlenecks may be attributed to: weak and poorly institutionalized logistics management information systems (LMIS), poor physical infrastructure in LMICs, lack of trained and dedicated staff for supply chain management, inadequate funding, and rigid government policies on task sharing. However, there is evidence that implementing effective LMISs and involving public and private providers in distribution channels resulted in reductions in stockout rates for medical commodities. Supply chain bottlenecks contribute significantly to persistent high stockout rates for modern contraceptives in LMICs. Interventions aimed at enhancing uptake of contraceptives to reduce the problem of unmet need in LMICs should make strong commitments towards strengthening these countries' health commodity supply chain management systems. Current evidence is limited, and additional well-designed implementation research on contraception supply chain systems is warranted to gain further understanding and insights on the determinants of supply chain bottlenecks and their impact on stockouts of contraception commodities.
NASA Astrophysics Data System (ADS)
Nyberg, B.; Helland-Hansen, W.
2017-12-01
Long-term preservation of alluvial sediments depends on hydrological processes depositing sediments solely within an area that has available accommodation space and net subsidence, known as a sedimentary basin. An understanding of the river processes contributing to terrestrial sedimentary basins is essential to fundamentally constrain and quantify controls on the modern terrestrial sink. Furthermore, the terrestrial source-to-sink controls place constraints on the entire coastal, shelf and deep marine sediment routing systems. In addition, the geographical importance of modern terrestrial sedimentary basins for agriculture and human settlements has resulted in significant upstream anthropogenic catchment modification for irrigation and energy needs. Yet to our knowledge, a global catchment model depicting the drainage patterns to modern terrestrial sedimentary basins has not previously been established that may be used to address these challenging issues. Here we present a new database of 180,737 global catchments that shows the surface drainage patterns to modern terrestrial sedimentary basins. This is achieved by using high-resolution river networks derived from digital elevation models in relation to newly acquired maps of global modern sedimentary basins to identify terrestrial sinks. The results show that active tectonic regimes are typically characterized by larger terrestrial sedimentary basins, numerous smaller source catchments and a high source-to-sink relief ratio. By contrast, passive margins drain to smaller terrestrial sedimentary basins, are fed by fewer but relatively larger source catchments, and have a lower source-to-sink relief ratio. The different geomorphological characteristics of source catchments by tectonic setting influence the spatial and temporal patterns of fluvial architecture within sedimentary basins and the anthropogenic methods of exploiting those rivers. The new digital database is intended to help the geoscientific community contribute further to our quantitative understanding of source-to-sink systems and their allogenic and autogenic controls, geomorphological characteristics, terrestrial sediment transit times and the anthropogenic impact on those systems.
ERIC Educational Resources Information Center
Alliance for Excellent Education, 2014
2014-01-01
In comments submitted April 7, 2014, the Alliance for Excellent Education called on the Federal Communications Commission (FCC) to modernize the federal E-rate program in order to lay the foundation for expanding the program through increased funding. These comments are in response to an E-rate Public Notice issued by the FCC on March 6, 2014…
The JANA calibrations and conditions database API
NASA Astrophysics Data System (ADS)
Lawrence, David
2010-04-01
Calibrations and conditions databases can be accessed from within the JANA Event Processing framework through the API defined in its JCalibration base class. The API is designed to support everything from databases to web services to flat files as the backend. A Web Service backend using the gSOAP toolkit has been implemented, which is particularly interesting since it addresses many modern cybersecurity issues, including support for SSL. The API allows constants to be retrieved through a single line of C++ code, with most of the context, including the transport mechanism, implied by the run currently being analyzed and by the environment, relieving developers from implementing such details.
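To make the "single line of code" idea concrete, the following stand-in sketches the retrieval pattern described above: constants are requested by a namepath string and the backend (database, web service, or flat file) sits behind a common interface. This is only an illustration under assumed names; it is not the real JANA/JCalibration API.

// Illustrative stand-in for a JCalibration-like interface; all names here are
// assumptions for this sketch, not the documented JANA API.
#include <map>
#include <string>
#include <vector>
#include <iostream>

class Calibration {
public:
    virtual ~Calibration() = default;
    // Fill 'vals' with the constants stored under 'namepath'.
    virtual bool Get(const std::string &namepath, std::vector<double> &vals) = 0;
};

// One possible backend: an in-memory table standing in for a flat file.
class FlatFileCalibration : public Calibration {
public:
    bool Get(const std::string &namepath, std::vector<double> &vals) override {
        auto it = table_.find(namepath);
        if (it == table_.end()) return false;
        vals = it->second;
        return true;
    }
    std::map<std::string, std::vector<double>> table_;
};

int main() {
    FlatFileCalibration calib;
    calib.table_["CAL/pedestals"] = {1.2, 0.9, 1.1};   // hypothetical namepath
    std::vector<double> pedestals;
    calib.Get("CAL/pedestals", pedestals);             // the single-line retrieval
    std::cout << pedestals.size() << " constants retrieved\n";
    return 0;
}

In the framework itself, the run number and transport mechanism would be supplied by the surrounding analysis context rather than hard-coded as above.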
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
2016-02-01
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
Computational Thermochemistry of Jet Fuels and Rocket Propellants
NASA Technical Reports Server (NTRS)
Crawford, T. Daniel
2002-01-01
The design of new high-energy-density molecules as candidates for jet and rocket fuels is an important goal of modern chemical thermodynamics. The NASA Glenn Research Center is home to a database of thermodynamic data for over 2000 compounds related to this goal, in the form of least-squares fits of heat capacities, enthalpies, and entropies as functions of temperature over the range of 300-6000 K. The Chemical Equilibrium with Applications (CEA) program, written and maintained by researchers at NASA Glenn over the last fifty years, makes use of this database for modeling the performance of potential rocket propellants. During its long history, the NASA Glenn database has been developed based on experimental results and data published in the scientific literature, such as the standard JANAF tables. The recent development of efficient computational techniques based on quantum chemical methods provides an alternative source of information for expansion of such databases. For example, it is now possible to model dissociation or combustion reactions of small molecules to high accuracy using techniques such as coupled cluster theory or density functional theory. Unfortunately, the current applicability of reliable computational models is limited to relatively small molecules containing only around a dozen (non-hydrogen) atoms. We propose to extend the applicability of coupled cluster theory, often referred to as the 'gold standard' of quantum chemical methods, to molecules containing 30-50 non-hydrogen atoms. The centerpiece of this work is the concept of local correlation, in which the description of the electron interactions, known as electron correlation effects, is reduced to only its most important localized components. Such an advance has the potential to greatly expand the current reach of computational thermochemistry and thus to have a significant impact on the theoretical study of jet and rocket propellants.
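For reference, least-squares fits of this kind are usually stored as polynomial coefficients per temperature interval. The classic 7-coefficient NASA polynomial form is shown below as an illustration; the Glenn database and CEA have also used an extended 9-coefficient parameterization, so the exact form stored there is an assumption here:

\frac{C_p(T)}{R} = a_1 + a_2 T + a_3 T^2 + a_4 T^3 + a_5 T^4

\frac{H(T)}{R T} = a_1 + \frac{a_2}{2} T + \frac{a_3}{3} T^2 + \frac{a_4}{4} T^3 + \frac{a_5}{5} T^4 + \frac{a_6}{T}

\frac{S(T)}{R} = a_1 \ln T + a_2 T + \frac{a_3}{2} T^2 + \frac{a_4}{3} T^3 + \frac{a_5}{4} T^4 + a_7

Separate coefficient sets cover separate temperature ranges (for example, below and above 1000 K), which is how a single compound's data can span 300-6000 K.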
An Improved Database System for Program Assessment
ERIC Educational Resources Information Center
Haga, Wayne; Morris, Gerard; Morrell, Joseph S.
2011-01-01
This research paper presents a database management system for tracking course assessment data and reporting related outcomes for program assessment. It improves on a database system previously presented by the authors and in use for two years. The database system presented is specific to assessment for ABET (Accreditation Board for Engineering and…
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman; House, Marcia; Mydlarz, Laura D.; Prager, Katherine C.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease. PMID:26880839
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.
Ecological Modernization and the US Farm Bill: The Case of the Conservation Security Program
ERIC Educational Resources Information Center
Lenihan, Martin H.; Brasier, Kathryn J.
2010-01-01
This paper examines the debate surrounding the inception of the Conservation Security Program (CSP) under the 2002 US Farm Bill as a possible expression of ecological modernization by examining the discursive contributions made by official actors, social movement organizations, and producer organizations. Based on this analysis, the CSP embodies…
Understanding the Impacts of the Medicare Modernization Act: Concerns of Congressional Staff
ERIC Educational Resources Information Center
Mueller, Keith J.; Coburn, Andrew F.; MacKinney, Clinton; McBride, Timothy D.; Slifkin, Rebecca T.; Wakefield, Mary K.
2005-01-01
Sweeping changes to the Medicare program embodied in the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA), including a new prescription drug benefit, changes in payment policies, and reform of the Medicare managed-care program, have major implications for rural health care. The most efficient mechanism for research to…
New Graduate Programs in Modern Foreign Languages, Why They Are Needed.
ERIC Educational Resources Information Center
TURNER, DAYMOND E., JR.
Additional doctoral programs are needed in modern foreign languages. Current production of graduate degrees appears scarcely adequate for replacing faculty who annually leave teaching because of death, illness, retirement, or change of vocation. The supply will hardly keep pace with the demand created by the establishment of new institutions of…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-24
... Student Database AGENCY: Office of Elementary and Secondary Education, Department of Education. ACTION... entitled ``Migrant Education Bypass Program Student Database (MEBPSD)'' (18-14-06). The Secretary has...
Burnham, J F; Shearer, B S; Wall, J C
1992-01-01
Librarians have used bibliometrics for many years to assess collections and to provide data for making selection and deselection decisions. With the advent of new technology--specifically, CD-ROM databases and reprint file database management programs--new cost-effective procedures can be developed. This paper describes a recent multidisciplinary study conducted by two library faculty members and one allied health faculty member to test a bibliometric method that used the MEDLINE and CINAHL databases on CD-ROM and the Papyrus database management program to produce a new collection development methodology. PMID:1600424
2009-04-01
Available Military Speech Databases: FELIN Database (overview, technical specifications, limitations). …emotion, confusion due to conflicting information, psychological tension, pain, and other typical conditions encountered in the modern battlefield… too, the number of possible language combinations scales with N3. It is clear that in a field of research that has only recently started and with so
Implementation of Three Text to Speech Systems for Kurdish Language
NASA Astrophysics Data System (ADS)
Bahrampour, Anvar; Barkhoda, Wafa; Azami, Bahram Zahir
Nowadays, the concatenative method is used in most modern TTS systems to produce artificial speech. The most important challenge in this method is choosing an appropriate unit for creating the database. This unit must guarantee smooth, high-quality speech, and creating a database for it must be reasonable and inexpensive. For example, the syllable, phoneme, allophone, and diphone are appropriate units for all-purpose systems. In this paper, we implemented three synthesis systems for the Kurdish language based on the syllable, allophone, and diphone, and compared their quality using subjective testing.
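As a minimal illustration of the concatenative idea summarized above, the sketch below stores one recorded waveform per unit in a database keyed by unit name and simply appends units in order; the unit names and sample values are invented for illustration, and a real synthesizer would also smooth the joins between units.

// Toy sketch of concatenative synthesis: look up each unit (e.g. a diphone)
// in the database and append its samples to the output waveform.
#include <iostream>
#include <map>
#include <string>
#include <vector>

using Waveform = std::vector<short>;   // PCM samples

Waveform Synthesize(const std::vector<std::string> &units,
                    const std::map<std::string, Waveform> &db) {
    Waveform out;
    for (const auto &u : units) {
        auto it = db.find(u);
        if (it == db.end()) continue;   // missing unit; a real system would back off
        out.insert(out.end(), it->second.begin(), it->second.end());
    }
    return out;
}

int main() {
    // Invented diphone entries; a real database holds recorded speech segments.
    std::map<std::string, Waveform> db = {
        {"s-a", {10, 12, 9}}, {"a-l", {7, 8}}, {"l-aw", {5, 6, 4}}};
    Waveform speech = Synthesize({"s-a", "a-l", "l-aw"}, db);
    std::cout << speech.size() << " samples generated\n";
    return 0;
}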
Alcoa Massena Modernization Project and Request for a Single Source Determination
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.
1985-12-01
…of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an… distribution (tape replacement) systems, database distribution, on-line mass storage, videogame ROM (juke-box)… media cost… training of the great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level
Offset Requirements for U.S. Steel's Fairfield Modernization
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
TOWARDS A CORE DATA SET FOR LANDSCAPE ASSESSMENTS
One of the primary goals of the NATO Committee on Challenges to Modern Society (CCMS) Landscape Pilot Study is to further develop, apply, and share landscape assessment technologies and spatial databases among participating countries, with the ultimate aim of sustaining environme...
The role of non-technical skills in surgery
Agha, Riaz A.; Fowler, Alexander J.; Sevdalis, Nick
2015-01-01
Non-technical skills are of increasing importance in surgery and surgical training. A traditional focus on technical skills acquisition and competence is no longer enough for the delivery of a modern, safe surgical practice. This review discusses the importance of non-technical skills and the values that underpin successful modern surgical practice. This narrative review used a number of sources, both written and online; there was no specific search strategy of defined databases. Modern surgical practice requires technical and non-technical skills, evidence-based practice, an emphasis on lifelong learning, monitoring of outcomes, and a supportive institutional and health service framework. Finally, these requirements need to be combined with a number of personal and professional values, including integrity, professionalism and compassionate, patient-centred care. PMID:26904193
Verheggen, Kenneth; Raeder, Helge; Berven, Frode S; Martens, Lennart; Barsnes, Harald; Vaudel, Marc
2017-09-13
Sequence database search engines are bioinformatics algorithms that identify peptides from tandem mass spectra using a reference protein sequence database. Two decades of development, notably driven by advances in mass spectrometry, have provided scientists with more than 30 published search engines, each with its own properties. In this review, we present the common paradigm behind the different implementations, and its limitations for modern mass spectrometry datasets. We also detail how the search engines attempt to alleviate these limitations, and provide an overview of the different software frameworks available to the researcher. Finally, we highlight alternative approaches for the identification of proteomic mass spectrometry datasets, either as a replacement for, or as a complement to, sequence database search engines. © 2017 Wiley Periodicals, Inc.
Applications of hypermedia systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lennon, J.; Maurer, H.
1995-05-01
In this paper, we consider several new aspects of modern hypermedia systems. The applications discussed include: (1) General Information and Communication Systems: Distributed information systems for businesses, schools and universities, museums, libraries, health systems, etc. (2) Electronic orientation and information displays: Electronic guided tours, public information kiosks, and publicity dissemination with archive facilities. (3) Lecturing: A system going beyond the traditional to empower both teachers and learners. (4) Libraries: A further step towards fully electronic library systems. (5) Directories of all kinds: Staff, telephone, and all sorts of generic directories. (6) Administration: A fully integrated system such as the one proposed will mean efficient data processing and valuable statistical data. (7) Research: Material can now be accessed from databases all around the world. The effects of networking and computer-supported collaborative work are discussed, and examples of new scientific visualization programs are quoted. The paper concludes with a section entitled "Future Directions".
Introduction to Cosmology, Proceedings of the Polish Astronomical Society volume 4
NASA Astrophysics Data System (ADS)
Biernacka, Monika; Bajan, Katarzyna; Stachowski, Grzegorz; Pollo, Agnieszka
2016-07-01
On 11-23 July 2016, Jan Kochanowski University in Kielce was the host of the Second Cosmological School "Introduction to Cosmology". The main purpose of the School was to provide an introduction to a selection of the most interesting topics in modern cosmology, both in theory and observations. The program included a series of mini-workshops on cosmological simulations, Virtual Observatory databases and tools, and Spectral Energy Distribution fitting. The School was intended for undergraduate, MSc and PhD students, as well as young postdoctoral researchers. The School was co-organized by the Polish Astronomical Society, the Jan Kochanowski University in Kielce, the Jagiellonian University in Cracow, the National Centre for Nuclear Research and the N. Copernicus Astronomical Center in Warsaw. The Interdisciplinary Centre for Mathematical and Computational Modeling kindly provided us with the possibility to remotely use their computing facilities.
[A new concept for integration of image databanks into a comprehensive patient documentation].
Schöll, E; Holm, J; Eggli, S
2001-05-01
Image processing and archiving are of increasing importance in the practice of modern medicine. Particularly due to the introduction of computer-based investigation methods, physicians are dealing with a wide variety of analogue and digital picture archives. On the other hand, clinical information is stored in various text-based information systems without integration of image components. The link between such traditional medical databases and picture archives is a prerequisite for efficient data management as well as for continuous quality control and medical education. At the Department of Orthopedic Surgery, University of Berne, a software program was developed to create a complete multimedia electronic patient record. The client-server system contains all patients' data, questionnaire-based quality control, and a digital picture archive. Different interfaces guarantee the integration into the hospital's data network. This article describes our experiences in the development and introduction of a comprehensive image archiving system at a large orthopedic center.
Ben Ayed, Rayda; Ben Hassen, Hanen; Ennouri, Karim; Ben Marzoug, Riadh; Rebai, Ahmed
2016-01-01
Olive (Olea europaea), whose importance is mainly due to nutritional and health features, is one of the most economically significant oil-producing trees in the Mediterranean region. Unfortunately, the increasing market demand for virgin olive oil can often result in its adulteration with less expensive oils, which is a serious problem for the public and for quality control evaluators of virgin olive oil. Therefore, to avoid fraud, olive cultivar identification and virgin olive oil authentication have become a major issue for the producers and consumers of quality control in the olive chain. Presently, genetic traceability using SSRs is a cost-effective and powerful marker technique that can be employed to resolve such problems. However, to identify an unknown monovarietal virgin olive oil cultivar, a reference system has become necessary. Thus, an Olive Genetic Diversity Database (OGDD) (http://www.bioinfo-cbs.org/ogdd/) is presented in this work. It is a genetic, morphologic and chemical database of worldwide olive trees and oils with a double function. In fact, besides being a reference system generated for the identification of unknown olive or virgin olive oil cultivars based on their microsatellite allele size(s), it provides users with additional morphological and chemical information for each identified cultivar. Currently, OGDD is designed to enable users to easily retrieve and visualize biologically important information (SSR markers, and olive tree and oil characteristics of about 200 cultivars worldwide) using a set of efficient query interfaces and analysis tools. It can be accessed through a web service from any modern programming language using a simple hypertext transfer protocol call. The web site is implemented in Java, JavaScript, PHP, HTML and Apache, with all major browsers supported. Database URL: http://www.bioinfo-cbs.org/ogdd/. © The Author(s) 2016. Published by Oxford University Press.
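Since the abstract notes that OGDD can be queried with a plain HTTP call from any modern language, a minimal sketch using libcurl is given below. The base URL is the one listed above, but the query string is a hypothetical placeholder; the real endpoint and parameter names would need to be taken from the OGDD documentation.

// Minimal HTTP GET against the OGDD web service using libcurl.
// Build with: g++ ogdd_query.cpp -lcurl
// The "?query=..." part is an invented placeholder, not the documented API.
#include <curl/curl.h>
#include <cstdio>

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;
    curl_easy_setopt(curl, CURLOPT_URL,
                     "http://www.bioinfo-cbs.org/ogdd/?query=allele_size");
    CURLcode rc = curl_easy_perform(curl);   // response body goes to stdout by default
    if (rc != CURLE_OK)
        std::fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return rc == CURLE_OK ? 0 : 1;
}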
Modern pollen data from North America and Greenland for multi-scale paleoenvironmental applications
Whitmore, J.; Gajewski, K.; Sawada, M.; Williams, J.W.; Shuman, B.; Bartlein, P.J.; Minckley, T.; Viau, A.E.; Webb, T.; Shafer, S.; Anderson, P.; Brubaker, L.
2005-01-01
The modern pollen network in North America and Greenland is presented as a database for use in quantitative calibration studies and paleoenvironmental reconstructions. The georeferenced database includes 4634 samples from all regions of the continent and 134 pollen taxa that range from ubiquitous to regionally diagnostic taxa. Climate data and vegetation characteristics were assigned to every site. Automated and manual procedures were used to verify the accuracy of geographic coordinates and identify duplicate records among datasets, incomplete pollen sums, and other potential errors. Data are currently available for almost all of North America, with variable density. Pollen taxonomic diversity, as measured by the Shannon-Wiener coefficient, varies as a function of location, as some vegetation regions are dominated by one or two major pollen producers, while other regions have a more even composition of pollen taxa. Squared-chord distances computed between samples show that most modern pollen samples find analogues within their own vegetation zone. Both temperature and precipitation inferred from best analogues are highly correlated with observed values, but temperature exhibits the strongest relation. Maps of the contemporary distribution of several pollen types in relation to the range of the plant taxon illustrate the correspondence between plant and pollen ranges. © 2005 Elsevier Ltd. All rights reserved.
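For reference, the squared-chord distance used here for analogue matching is the standard dissimilarity measure of the modern analogue technique in palynology:

d_{jk} = \sum_{i=1}^{n} \left( \sqrt{p_{ij}} - \sqrt{p_{ik}} \right)^{2},

where p_{ij} and p_{ik} are the proportions of pollen taxon i in samples j and k. Small values indicate close analogues; the square-root transformation of proportions damps the influence of abundant taxa relative to rarer, often more diagnostic ones.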
NASA Astrophysics Data System (ADS)
Cook, Ellyn J.; van der Kaars, Sander
2006-10-01
We review attempts to derive quantitative climatic estimates from Australian pollen data, including the climatic envelope, climatic indicator and modern analogue approaches, and outline the need to pursue alternatives for use as input to, or validation of, simulations by models of past, present and future climate patterns. To this end, we have constructed and tested modern pollen-climate transfer functions for mainland southeastern Australia and Tasmania using the existing southeastern Australian pollen database, and for northern Australia using a new pollen database we are developing. After testing for statistical significance, 11 parameters were selected for mainland southeastern Australia, seven for Tasmania and six for northern Australia. The functions are based on weighted-averaging partial least squares regression, and their predictive ability was evaluated against modern observational climate data using leave-one-out cross-validation. Functions for summer, annual and winter rainfall and temperatures are most robust for southeastern Australia, while in Tasmania functions for minimum temperature of the coldest period, mean winter and mean annual temperature are the most reliable. In northern Australia, annual and summer rainfall and annual and summer moisture indexes are the strongest. The validation of all functions means all can be applied to Quaternary pollen records from these three areas with confidence.
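To indicate the form such transfer functions take: in simple weighted averaging, each taxon's climatic optimum is first estimated from the modern calibration set, and a sample's climate is then reconstructed as the abundance-weighted average of those optima. WA-PLS extends this with additional orthogonal components and a deshrinking step, omitted here, so the expressions below are only the underlying weighted-averaging core:

\hat{u}_{i} = \frac{\sum_{j} y_{ij} x_{j}}{\sum_{j} y_{ij}}, \qquad \hat{x}_{j} = \frac{\sum_{i} y_{ij} \hat{u}_{i}}{\sum_{i} y_{ij}},

where y_{ij} is the abundance of pollen taxon i in sample j, x_j is the observed climate parameter for sample j, \hat{u}_i is the inferred taxon optimum, and \hat{x}_j is the reconstructed value. Leave-one-out cross-validation repeats this with each modern sample withheld in turn to assess predictive ability.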
ERIC Educational Resources Information Center
Rosenberg, Michael S.; Boyer, K. Lynn; Sindelar, Paul T.; Misra, Sunil K.
2007-01-01
This study describes special education alternative route (AR) teacher preparation programs. The authors developed a national database of programs and collected information on program sponsorship, length and intensity, features, and participant demographics. Most of the 235 programs in the database were in states that had significant shortages of…
BIOPEP database and other programs for processing bioactive peptide sequences.
Minkiewicz, Piotr; Dziuba, Jerzy; Iwaniak, Anna; Dziuba, Marta; Darewicz, Małgorzata
2008-01-01
This review presents the potential for application of computational tools in peptide science based on a sample BIOPEP database and program as well as other programs and databases available via the World Wide Web. The BIOPEP application contains a database of biologically active peptide sequences and a program enabling construction of profiles of the potential biological activity of protein fragments, calculation of quantitative descriptors as measures of the value of proteins as potential precursors of bioactive peptides, and prediction of bonds susceptible to hydrolysis by endopeptidases in a protein chain. Other bioactive and allergenic peptide sequence databases are also presented. Programs enabling the construction of binary and multiple alignments between peptide sequences, the construction of sequence motifs attributed to a given type of bioactivity, searching for potential precursors of bioactive peptides, and the prediction of sites susceptible to proteolytic cleavage in protein chains are available via the Internet as are other approaches concerning secondary structure prediction and calculation of physicochemical features based on amino acid sequence. Programs for prediction of allergenic and toxic properties have also been developed. This review explores the possibilities of cooperation between various programs.
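One of the quantitative descriptors mentioned above that is commonly reported for BIOPEP-style analyses is the frequency of occurrence of fragments with a given activity in a protein chain; the notation below follows the usual description of this measure, though whether additional descriptors are computed by the program is not detailed here:

A = \frac{a}{N},

where a is the number of fragments with the given activity found within the protein sequence and N is the total number of amino acid residues in that protein. Higher values mark proteins that are richer potential precursors of bioactive peptides of that type.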
76 FR 12308 - Modernizing the FCC Form 477 Data Program; Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... FEDERAL COMMUNICATIONS COMMISSION 47 CFR Parts 1, 20, and 43 [WCB: WC Docket Nos. 07-38, 09-190, 10-132, 11-10; FCC 11-14] Modernizing the FCC Form 477 Data Program; Correction AGENCY: Federal..., Deputy Manager. [FR Doc. 2011-5095 Filed 3-4-11; 8:45 am] BILLING CODE 6712-01-P ...
Federal incentives for industrial modernization: Historical review and future opportunities
NASA Technical Reports Server (NTRS)
Coleman, Sandra C.; Batson, Robert G.
1987-01-01
Concerns over the aging of the U.S. aerospace industrial base led DOD to introduce first its Technology Modernization (Tech Mod) Program, and more recently the Industrial Modernization Incentive Program (IMIP). These incentives include productivity shared savings rewards, contractor investment protection to allow for amortization of plant and equipment, and subcontractor/vendor participation. The purpose here is to review DOD IMIP and to evaluate whether a similar program is feasible for NASA and other non-DOD agencies. The IMIP methodology is of interest to industrial engineers because it provides a structured, disciplined approach to identifying productivity improvement opportunities and documenting their expected benefit. However, it is shown that more research on predicting and validating cost avoidance is needed.
Owens, John
2009-01-01
Technological advances in the acquisition of DNA and protein sequence information and the resulting onrush of data can quickly overwhelm the scientist unprepared for the volume of information that must be evaluated and carefully dissected to discover its significance. Few laboratories have the luxury of dedicated personnel to organize, analyze, or consistently record a mix of arriving sequence data. A methodology based on a modern relational-database manager is presented that is both a natural storage vessel for antibody sequence information and a conduit for organizing and exploring sequence data and accompanying annotation text. The expertise necessary to implement such a plan is equal to that required by electronic word processors or spreadsheet applications. Antibody sequence projects maintained as independent databases are selectively unified by the relational-database manager into larger database families that contribute to local analyses, reports, and interactive HTML pages, or are exported to facilities dedicated to sophisticated sequence analysis techniques. Database files are transposable among current versions of Microsoft, Macintosh, and UNIX operating systems.
Generation of an Aerothermal Data Base for the X33 Spacecraft
NASA Technical Reports Server (NTRS)
Roberts, Cathy; Huynh, Loc
1998-01-01
The X-33 experimental program is a cooperative program between industry and NASA, managed by Lockheed-Martin Skunk Works to develop an experimental vehicle to demonstrate new technologies for a single-stage-to-orbit, fully reusable launch vehicle (RLV). One of the new technologies to be demonstrated is an advanced Thermal Protection System (TPS) being designed by BF Goodrich (formerly Rohr, Inc.) with support from NASA. The calculation of an aerothermal database is crucial to identifying the critical design environment data for the TPS. The NASA Ames X-33 team has generated such a database using Computational Fluid Dynamics (CFD) analyses, engineering analysis methods and various programs to compare and interpolate the results from the CFD and the engineering analyses. This database, along with a program used to query the database, is used extensively by several X-33 team members to help them in designing the X-33. This paper will describe the methods used to generate this database, the program used to query the database, and will show some of the aerothermal analysis results for the X-33 aircraft.
MetaMapR: pathway independent metabolomic network analysis incorporating unknowns.
Grapov, Dmitry; Wanichthanarak, Kwanjeera; Fiehn, Oliver
2015-08-15
Metabolic network mapping is a widely used approach for integration of metabolomic experimental results with biological domain knowledge. However, current approaches can be limited by biochemical domain or pathway knowledge which results in sparse disconnected graphs for real world metabolomic experiments. MetaMapR integrates enzymatic transformations with metabolite structural similarity, mass spectral similarity and empirical associations to generate richly connected metabolic networks. This open source, web-based or desktop software, written in the R programming language, leverages KEGG and PubChem databases to derive associations between metabolites even in cases where biochemical domain or molecular annotations are unknown. Network calculation is enhanced through an interface to the Chemical Translation System, which allows metabolite identifier translation between >200 common biochemical databases. Analysis results are presented as interactive visualizations or can be exported as high-quality graphics and numerical tables which can be imported into common network analysis and visualization tools. Freely available at http://dgrapov.github.io/MetaMapR/. Requires R and a modern web browser. Installation instructions, tutorials and application examples are available at http://dgrapov.github.io/MetaMapR/. ofiehn@ucdavis.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Troutman, Sandra M.; Stanley, Richard G.
2003-01-01
This database and accompanying text depict historical and modern reported occurrences of petroleum both in wells and at the surface within the boundaries of the Central Alaska Province. These data were compiled from previously published and unpublished sources and were prepared for use in the 2002 U.S. Geological Survey petroleum assessment of Central Alaska, Yukon Flats region. Indications of petroleum are described as oil or gas shows in wells, oil or gas seeps, or outcrops of oil shale or oil-bearing rock and include confirmed and unconfirmed reports. The scale of the source map limits the spatial resolution (scale) of the database to 1:2,500,000 or smaller.
The purpose of this SOP is to describe the database storage organization, and to describe the sources of data for each database used during the Arizona NHEXAS project and the Border study. Keywords: data; database; organization.
The U.S.-Mexico Border Program is sponsored by t...
NBIC: National Ballast Information Clearinghouse
Smithsonian Environmental Research Center / US Coast Guard. Staff: Database Manager, Tami Huber; Senior Analyst/Ecologist, Mark Minton; Data Managers, Ashley Arnwine, Jessica Hardee, Amanda Reynolds; Database Design and Application Programming, Paul Winterbauer.
A storage scheme for the real-time database supporting the on-line commitment
NASA Astrophysics Data System (ADS)
Dai, Hong-bin; Jing, Yu-jian; Wang, Hui
2013-07-01
Modern SCADA (Supervisory Control and Data Acquisition) systems have been applied to various aspects of everyday life. As time goes on, the requirements of the applications of these systems vary, and the data structure of the real-time database, which is the core of a SCADA system, often needs modification. As a result, a commitment consisting of a sequence of configuration operations modifying the data structure of the real-time database is performed from time to time. Although it is simple to perform the commitment off-line, by first stopping and then restarting the system, during which all the data in the real-time database are reconstructed, it is much preferred, and in some cases even necessary, to perform the commitment on-line, during which the real-time database can still provide real-time service and the system continues working normally. In this paper, a storage scheme for the data in the real-time database is proposed. It helps the real-time database support on-line commitment, during which real-time service is still available.
The WSMR Timing System: Toward New Horizons
NASA Technical Reports Server (NTRS)
Gilbert, William A.; Stimets, Bob
1996-01-01
In 1991, White Sands Missile Range (WSMR) initiated a modernization program for its range timing system. The main focus of this modernization program was to develop a system that was highly accurate, easy to maintain, and portable. The logical decision at the time was to develop a system based solely on Global Positioning System (GPS) technology. Since that time, WSMR has changed its philosophy on how GPS would be utilized for the timing system. This paper will describe WSMR's initial modernization plans for its range timing system and how certain events have led to a modification of these plans.
Fulbright-Hays Summer Seminars Abroad Program 1989. Egypt: Transition to the Modern World.
ERIC Educational Resources Information Center
Office of International Education (ED), Washington, DC.
This document consists of four papers on various aspects of development in Egypt prepared by participants in the Fulbright-Hays Seminars Abroad Program in Egypt in 1989. Four of the papers are descriptive, one is a lesson plan. The papers included are: (1) "Egypt: Transition to Modern Times" (Katherine Jensen) focuses on the role of…
Europe Report, Science and Technology
1986-09-30
…to certain basic products of the food industry such as beer, vinegar, spirits, starches, etc. It is also assumed that modern biotechnologies… Czechoslovak food production. This is also the objective of innovative and modernizing programs in the fermented food sectors. The program for the… cattle and improves fodder utilization, assuming balanced doses of fodder. The development of fermentation techniques of production will occur within
NASA STI Program Coordinating Council Eleventh Meeting: NASA STI Modernization Plan
NASA Technical Reports Server (NTRS)
1993-01-01
The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was the modernization of the STI Program. Topics covered included the activities of the Engineering Review Board in the creation of the Infrastructure Upgrade Plan, the progress of the RECON Replacement Project, the use and status of Electronic SCAN (Selected Current Aerospace Notices), the Machine Translation Project, multimedia, electronic document interchange, the NASA Access Mechanism, computer network upgrades, and standards in the architectural effort.
Improving robustness and computational efficiency using modern C++
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paterno, M.; Kowalkowski, J.; Green, C.
2014-01-01
For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.
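The comparison below is a generic illustration of the kind of pattern contrast described in the abstract, not an example taken from the paper: an older C-style loop over a raw pointer next to a modern C++ version that uses a standard container, an algorithm, and a lambda, with no loss of efficiency expected from a current optimizing compiler.

// Generic illustration (not from the paper) of currently-typical versus
// modern C++ style for the same computation.
#include <cstddef>
#include <iostream>
#include <numeric>
#include <vector>

// Older style: raw pointer, explicit index bookkeeping.
double sum_of_squares_old(const double *data, std::size_t n) {
    double s = 0.0;
    for (std::size_t i = 0; i < n; ++i) s += data[i] * data[i];
    return s;
}

// Modern style: standard container, algorithm, and lambda (C++11 and later).
double sum_of_squares_new(const std::vector<double> &data) {
    return std::accumulate(data.begin(), data.end(), 0.0,
                           [](double acc, double x) { return acc + x * x; });
}

int main() {
    const std::vector<double> v = {1.0, 2.0, 3.0};
    std::cout << sum_of_squares_old(v.data(), v.size()) << " "
              << sum_of_squares_new(v) << "\n";   // both print 14
    return 0;
}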
Accessibility and quality of online information for pediatric orthopaedic surgery fellowships.
Davidson, Austin R; Murphy, Robert F; Spence, David D; Kelly, Derek M; Warner, William C; Sawyer, Jeffrey R
2014-12-01
Pediatric orthopaedic fellowship applicants commonly use online-based resources for information on potential programs. Two primary sources are the San Francisco Match (SF Match) database and the Pediatric Orthopaedic Society of North America (POSNA) database. We sought to determine the accessibility and quality of information that could be obtained by using these 2 sources. The online databases of the SF Match and POSNA were reviewed to determine the availability of embedded program links or external links for the included programs. If not available in the SF Match or POSNA data, Web sites for listed programs were located with a Google search. All identified Web sites were analyzed for accessibility, content volume, and content quality. At the time of online review, 50 programs, offering 68 positions, were listed in the SF Match database. Although 46 programs had links included with their information, 36 (72%) of them simply listed http://www.sfmatch.org as their unique Web site. Ten programs (20%) had external links listed, but only 2 (4%) linked directly to the fellowship web page. The POSNA database does not list any links to the 47 programs it lists, which offer 70 positions. On the basis of a Google search of the 50 programs listed in the SF Match database, web pages were found for 35. Of programs with independent web pages, all had a description of the program and 26 (74%) described their application process. Twenty-nine (83%) listed research requirements, 22 (63%) described the rotation schedule, and 12 (34%) discussed the on-call expectations. A contact telephone number and/or email address was provided by 97% of programs. Twenty (57%) listed both the coordinator and fellowship director, 9 (26%) listed the coordinator only, 5 (14%) listed the fellowship director only, and 1 (3%) had no contact information given. The SF Match and POSNA databases provide few direct links to fellowship Web sites, and individual program Web sites either do not exist or do not effectively convey information about the programs. Improved accessibility and accurate information online would allow potential applicants to obtain information about pediatric fellowships in a more efficient manner.
3MdB: the Mexican Million Models database
NASA Astrophysics Data System (ADS)
Morisset, C.; Delgado-Inglada, G.
2014-10-01
The 3MdB is an original effort to construct a large multipurpose database of photoionization models. It is a more modern version of a previous attempt based on Cloudy3D and IDL tools, and is accessed by MySQL requests. The models are obtained using the well-known and widely used Cloudy photoionization code (Ferland et al., 2013). The database is intended to host grids of models with different references to identify each project and to facilitate the extraction of the desired data. We present here a description of the way the database is managed and some of the projects that use 3MdB. Anybody can ask for a grid to be run and stored in 3MdB, to increase the visibility of the grid and the potential side applications of it.
Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, H.
2015-01-01
Existing DoD and NASA satellite breakup models are based on a key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve the breakup models and the NASA Size Estimation Model (SEM) for events involving more modern satellite designs, the NASA Orbital Debris Program Office has worked in collaboration with the University of Florida to replicate a hypervelocity impact using a satellite built with modern-day spacecraft materials and construction techniques. The spacecraft, called DebriSat, was intended to be representative of modern LEO satellites, and all major design decisions were reviewed and approved by subject matter experts at Aerospace Corporation. DebriSat is composed of 7 major subsystems including the attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. All fragments down to 2 mm in size will be characterized by material, size, shape, and bulk density, and the associated data will be stored in a database for multiple users to access. Laboratory radar and optical measurements will be performed on a subset of fragments to provide a better understanding of the data products from orbital debris acquired from ground-based radars and telescopes. The resulting data analysis from DebriSat will be used to update breakup models and to develop the first optical SEM in conjunction with updates to the current NASA SEM. The characterization of the fragmentation will be discussed in the subsequent presentation.
Language-Agnostic Reproducible Data Analysis Using Literate Programming.
Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa
2016-01-01
A modern biomedical research project can easily contain hundreds of analysis steps, and the lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid-transport-related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir. PMID:27711123
Bohl, Daniel D; Russo, Glenn S; Basques, Bryce A; Golinvaux, Nicholas S; Fu, Michael C; Long, William D; Grauer, Jonathan N
2014-12-03
There has been an increasing use of national databases to conduct orthopaedic research. Questions regarding the validity and consistency of these studies have not been fully addressed. The purpose of this study was to test for similarity in reported measures between two national databases commonly used for orthopaedic research. A retrospective cohort study of patients undergoing lumbar spinal fusion procedures during 2009 to 2011 was performed in two national databases: the Nationwide Inpatient Sample and the National Surgical Quality Improvement Program. Demographic characteristics, comorbidities, and inpatient adverse events were directly compared between databases. The total numbers of patients included were 144,098 from the Nationwide Inpatient Sample and 8434 from the National Surgical Quality Improvement Program. There were only small differences in demographic characteristics between the two databases. There were large differences between databases in the rates at which specific comorbidities were documented. Non-morbid obesity was documented at rates of 9.33% in the Nationwide Inpatient Sample and 36.93% in the National Surgical Quality Improvement Program (relative risk, 0.25; p < 0.05). Peripheral vascular disease was documented at rates of 2.35% in the Nationwide Inpatient Sample and 0.60% in the National Surgical Quality Improvement Program (relative risk, 3.89; p < 0.05). Similarly, there were large differences between databases in the rates at which specific inpatient adverse events were documented. Sepsis was documented at rates of 0.38% in the Nationwide Inpatient Sample and 0.81% in the National Surgical Quality Improvement Program (relative risk, 0.47; p < 0.05). Acute kidney injury was documented at rates of 1.79% in the Nationwide Inpatient Sample and 0.21% in the National Surgical Quality Improvement Program (relative risk, 8.54; p < 0.05). As database studies become more prevalent in orthopaedic surgery, authors, reviewers, and readers should view these studies with caution. This study shows that two commonly used databases can identify demographically similar patients undergoing a common orthopaedic procedure; however, the databases document markedly different rates of comorbidities and inpatient adverse events. The differences are likely the result of the very different mechanisms through which the databases collect their comorbidity and adverse event data. Findings highlight concerns regarding the validity of orthopaedic database research. Copyright © 2014 by The Journal of Bone and Joint Surgery, Incorporated.
The Perfect Marriage: Integrated Word Processing and Data Base Management Programs.
ERIC Educational Resources Information Center
Pogrow, Stanley
1983-01-01
Discussion of database integration and how it operates includes recommendations on compatible brand name word processing and database management programs, and a checklist for evaluating essential and desirable features of the available programs. (MBR)
SUPERSITES INTEGRATED RELATIONAL DATABASE (SIRD)
As part of EPA's Particulate Matter (PM) Supersites Program (Program), the University of Maryland designed and developed the Supersites Integrated Relational Database (SIRD). Measurement data in SIRD include comprehensive air quality data from the 7 Supersite program locations f...
Transaction Processing Performance Council (TPC): State of the Council 2010
NASA Astrophysics Data System (ADS)
Nambiar, Raghunath; Wakou, Nicholas; Carman, Forrest; Majdalany, Michael
The Transaction Processing Performance Council (TPC) is a non-profit corporation founded to define transaction processing and database benchmarks and to disseminate objective, verifiable performance data to the industry. Established in August 1988, the TPC has been integral in shaping the landscape of modern transaction processing and database benchmarks over the past twenty-two years. This paper provides an overview of the TPC's existing benchmark standards and specifications, introduces two new TPC benchmarks under development, and examines the TPC's active involvement in the early creation of additional future benchmarks.
NASA Astrophysics Data System (ADS)
Tyupikova, T. V.; Samoilov, V. N.
2003-04-01
Modern information technologies drive the natural sciences toward further development. This development goes hand in hand with the evaluation of infrastructures, highlighting the conditions, both scientific and financial, needed to substantiate and legally protect new research. Any scientific development entails accounting and legal protection. In this report, we consider a new direction in the software, organization, and control of shared databases, using as an example the electronic document handling system that operates in several departments of the Joint Institute for Nuclear Research.
Managing vulnerabilities and achieving compliance for Oracle databases in a modern ERP environment
NASA Astrophysics Data System (ADS)
Hölzner, Stefan; Kästle, Jan
In this paper, we summarize good practices for achieving compliance for an Oracle database in combination with an ERP system. We use an integrated approach to cover both the management of vulnerabilities (preventive measures) and the use of logging and auditing features (detective controls). This concise overview focuses on the combination of Oracle and SAP and its dependencies, but also outlines security issues that arise with other ERP systems. Using practical examples, we demonstrate common vulnerabilities and countermeasures as well as guidelines for the use of auditing features.
76 FR 77504 - Notice of Submission for OMB Review
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
... of Review: Extension. Title of Collection: Charter Schools Program Grant Award Database. OMB Control... collect data necessary for the Charter Schools Program (CSP) Grant Award Database. The CSP is authorized... award information from grantees (State agencies and some schools) for a database of current CSP-funded...
Developing Modern Information Systems and Services: Africa's Challenges for the Future.
ERIC Educational Resources Information Center
Chowdhury, G. G.
1996-01-01
Discusses the current state of information systems and services in Africa, examines future possibilities, and suggests areas for improvement. Topics include the lack of automation; CD-ROM databases for accessibility to information sources; developing low-cost electronic communication facilities; Internet connectivity; dependence on imported…
Lin, Yun; Melby, Daniel P; Krishnan, Balaji; Adabag, Selcuk; Tholakanahalli, Venkatakrishna; Li, Jian-Ming
2017-08-01
The aim of this study is to investigate the frequency of electrosurgery-related pacemaker malfunction. A retrospective study was conducted to investigate electrosurgery-related pacemaker malfunction in consecutive patients undergoing pulse generator (PG) replacement or upgrade at two large hospitals in Minneapolis, MN between January 2011 and January 2014. The occurrence of this pacemaker malfunction was then studied using the MAUDE database for all four major device vendors. A total of 1398 consecutive patients from 2 large tertiary referral centers in Minneapolis, MN undergoing PG replacement or upgrade surgery were retrospectively studied. Four patients (0.3% of all patients), all with pacemakers from St Jude Medical (2.8%, 4 of 142), had output failure or an inappropriately low pacing rate below 30 bpm during electrosurgery, despite being programmed in an asynchronous mode. During the same period, 1174 cases of pacemaker malfunction were reported for the same models in the MAUDE database, 37 of which (3.2%) were electrosurgery-related. Twenty-four cases (65%) had output failure or an inappropriately low pacing rate. The distribution of adverse events was loss of pacing (59.5%), reversion to backup pacing (32.4%), inappropriately low pacing rate (5.4%), and ventricular fibrillation (2.7%). The majority of these (78.5%) occurred during PG replacement at ERI or upgrade surgery. No electrosurgery-related malfunction was found in the MAUDE database among the 862 pacemaker malfunction cases reported for other vendors during the same period. Electrosurgery during PG replacement or upgrade surgery can trigger output failure or an inappropriately low pacing rate in certain models of modern pacemakers. Caution should be exercised for pacemaker-dependent patients.
Discovering Knowledge from Noisy Databases Using Genetic Programming.
ERIC Educational Resources Information Center
Wong, Man Leung; Leung, Kwong Sak; Cheng, Jack C. Y.
2000-01-01
Presents a framework that combines Genetic Programming and Inductive Logic Programming, two approaches in data mining, to induce knowledge from noisy databases. The framework is based on a formalism of logic grammars and is implemented as a data mining system called LOGENPRO (Logic Grammar-based Genetic Programming System). (Contains 34…
The Steward Observatory asteroid relational database
NASA Technical Reports Server (NTRS)
Sykes, Mark V.; Alvarezdelcastillo, Elizabeth M.
1991-01-01
The Steward Observatory Asteroid Relational Database (SOARD) was created as a flexible tool for undertaking studies of asteroid populations and sub-populations, to probe the biases intrinsic to asteroid databases, to ascertain the completeness of data pertaining to specific problems, to aid in the development of observational programs, and to develop pedagogical materials. To date, SOARD has compiled an extensive list of data available on asteroids and made it accessible through a single menu-driven database program. Users may obtain tailored lists of asteroid properties for any subset of asteroids or output files which are suitable for plotting spectral data on individual asteroids. The program has online help as well as user and programmer documentation manuals. The SOARD already has provided data to fulfill requests by members of the astronomical community. The SOARD continues to grow as data is added to the database and new features are added to the program.
Chinas Future SSBN Command and Control Structure
2016-11-01
China's ongoing modernization program is transforming the country's nuclear arsenal from one consisting of a few...also be mediated by bureaucratic politics, including the time-honored tradition of interservice rivalry. The emergent SSBN fleet may represent a prime...represent a reliable source of resources. China has dedicated substantial resources to undertaking a nuclear modernization program designed to ensure
NASA Astrophysics Data System (ADS)
Kelbert, A.; Blum, C.
2015-12-01
Magnetotelluric Transfer Functions (MT TFs) represent most of the information about Earth electrical conductivity found in the raw electromagnetic data, providing inputs for further inversion and interpretation. To be useful for scientific interpretation, they must also contain carefully recorded metadata. Making these data available in a discoverable and citable fashion would provide the most benefit to the scientific community, but such a development requires that the metadata is not only present in the file but is also searchable. The most commonly used MT TF format to date, the historical Society of Exploration Geophysicists Electromagnetic Data Interchange Standard 1987 (EDI), no longer supports some of the needs of modern magnetotellurics, most notably the accurate recording of error bars. Moreover, the inherent heterogeneity of EDIs and other historic MT TF formats has mostly kept the community away from healthy data sharing practices. Recently, the MT team at Oregon State University, in collaboration with the IRIS Data Management Center, developed a new, XML-based format for MT transfer functions, and an online system for long-term storage, discovery, and sharing of MT TF data worldwide (IRIS SPUD; www.iris.edu/spud/emtf). The system provides a query page where all of the MT transfer functions collected within the USArray MT experiment and other field campaigns can be searched for and downloaded; an automatic on-the-fly conversion to the historic EDI format is also included. To facilitate conversion to the new, more comprehensive and sustainable XML format for MT TFs, and to streamline inclusion of historic data into the online database, we developed a set of open source format conversion tools, which can be used for rotation of MT TFs and include a general XML <-> EDI converter (https://seiscode.iris.washington.edu/projects/emtf-fcu). Here, we report on the newly established collaboration between the USGS Geomagnetism Program and Oregon State University to gather and convert both historic and modern-day MT or related transfer functions into the searchable database at the IRIS DMC. The more complete and free access to these previously collected MT TFs will be of great value to MT scientists, both in planning future surveys and in leveraging the value of the new data at the inversion and interpretation stage.
Web-Based Environment for Maintaining Legacy Software
NASA Technical Reports Server (NTRS)
Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard
2007-01-01
Advanced Tool Integration Environment (ATIE) is the name of both a software system and a Web-based environment created by the system for maintaining an archive of legacy software and the expertise involved in developing the legacy software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery by use of powerful search engines. ATIE also accommodates the embedment of supporting software that users require for their work, and even enables access to supporting commercial-off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.
17 CFR 38.552 - Elements of an acceptable audit trail program.
Code of Federal Regulations, 2014 CFR
2014-04-01
... of the order shall also be captured. (b) Transaction history database. A designated contract market's audit trail program must include an electronic transaction history database. An adequate transaction history database includes a history of all trades executed via open outcry or via entry into an electronic...
17 CFR 38.552 - Elements of an acceptable audit trail program.
Code of Federal Regulations, 2013 CFR
2013-04-01
... of the order shall also be captured. (b) Transaction history database. A designated contract market's audit trail program must include an electronic transaction history database. An adequate transaction history database includes a history of all trades executed via open outcry or via entry into an electronic...
Correlates of Access to Business Research Databases
ERIC Educational Resources Information Center
Gottfried, John C.
2010-01-01
This study examines potential correlates of business research database access through academic libraries serving top business programs in the United States. Results indicate that greater access to research databases is related to enrollment in graduate business programs, but not to overall enrollment or status as a public or private institution.…
A Relational Algebra Query Language for Programming Relational Databases
ERIC Educational Resources Information Center
McMaster, Kirby; Sambasivam, Samuel; Anderson, Nicole
2011-01-01
In this paper, we describe a Relational Algebra Query Language (RAQL) and Relational Algebra Query (RAQ) software product we have developed that allows database instructors to teach relational algebra through programming. Instead of defining query operations using mathematical notation (the approach commonly taken in database textbooks), students…
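As a generic illustration of teaching relational algebra through programming (this Python sketch is not the RAQL/RAQ software described above, and the example relations are made up), the core operators can be written as ordinary functions over lists of dictionaries:

    # Illustrative sketch only: relational algebra operators as plain
    # Python functions over lists of dicts (each dict is one tuple).
    def select(relation, predicate):
        """sigma: keep tuples satisfying the predicate."""
        return [t for t in relation if predicate(t)]

    def project(relation, attrs):
        """pi: keep only the named attributes, removing duplicates."""
        seen, out = set(), []
        for t in relation:
            key = tuple((a, t[a]) for a in attrs)
            if key not in seen:
                seen.add(key)
                out.append(dict(key))
        return out

    def natural_join(r, s):
        """Join on all attribute names the two relations share."""
        common = set(r[0]) & set(s[0]) if r and s else set()
        return [{**t, **u} for t in r for u in s
                if all(t[a] == u[a] for a in common)]

    # Hypothetical example relations.
    student = [{"sid": 1, "name": "Ada"}, {"sid": 2, "name": "Alan"}]
    enrolled = [{"sid": 1, "course": "DB101"}, {"sid": 2, "course": "DB102"}]
    print(project(select(natural_join(student, enrolled),
                         lambda t: t["course"] == "DB101"), ["name"]))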
Modern Design of Resonant Edge-Slot Array Antennas
NASA Technical Reports Server (NTRS)
Gosselin, R. B.
2006-01-01
Resonant edge-slot (slotted-waveguide) array antennas can now be designed very accurately following a modern computational approach like that followed for some other microwave components. This modern approach makes it possible to design superior antennas at lower cost than was previously possible. Heretofore, the physical and engineering knowledge of resonant edge-slot array antennas had remained immature since they were introduced during World War II. This is because despite their mechanical simplicity, high reliability, and potential for operation with high efficiency, the electromagnetic behavior of resonant edge-slot antennas is very complex. Because engineering design formulas and curves for such antennas are not available in the open literature, designers have been forced to implement iterative processes of fabricating and testing multiple prototypes to derive design databases, each unique for a specific combination of operating frequency and set of waveguide tube dimensions. The expensive, time-consuming nature of these processes has inhibited the use of resonant edge-slot antennas. The present modern approach reduces costs by making it unnecessary to build and test multiple prototypes. As an additional benefit, this approach affords a capability to design an array of slots having different dimensions to taper the antenna illumination to reduce the amplitudes of unwanted side lobes. The heart of the modern approach is the use of the latest commercially available microwave-design software, which implements finite-element models of electromagnetic fields in and around waveguides, antenna elements, and similar components. Instead of building and testing prototypes, one builds a database and constructs design curves from the results of computational simulations for sets of design parameters. The figure shows a resonant edge-slot antenna designed following this approach. Intended for use as part of a radiometer operating at a frequency of 10.7 GHz, this antenna was fabricated from dimensions defined exclusively by results of computational simulations. The final design was found to be well optimized and to yield performance exceeding that initially required.
ERIC Educational Resources Information Center
Myint-U, Athi; O'Donnell, Lydia; Phillips, Dawna
2012-01-01
This technical brief describes updates to a database of dropout prevention programs and policies in 2006/07 created by the Regional Education Laboratory (REL) Northeast and Islands and described in the Issues & Answers report, "Piloting a searchable database of dropout prevention programs in nine low-income urban school districts in the…
... develops and applies tools of modern toxicology and molecular biology to identify substances in the environment that may ... application of new technologies for modern toxicology and molecular biology. A world leader in toxicology research, NTP has ...
Digital Geodata Traces--New Challenges for Geographic Education
ERIC Educational Resources Information Center
Hohnle, Steffen; Michel, Boris; Glasze, Georg; Uphues, Rainer
2013-01-01
Young people in modern societies consciously (e.g. Facebook) or unconsciously (e.g. some Google services) produce a vast amount of geodata. Using relational databases, private companies are capable of creating very precise profiles of the individual user and his/her spatial practices from this data. This almost inevitably prompts questions…
On Research Methodology in Applied Linguistics in 2002-2008
ERIC Educational Resources Information Center
Martynychev, Andrey
2010-01-01
This dissertation examined the status of data-based research in applied linguistics through an analysis of published research studies in nine peer-reviewed applied linguistics journals ("Applied Language Learning, The Canadian Modern Language Review / La Revue canadienne des langues vivantes, Current Issues in Language Planning, Dialog on Language…
Library Users: How They Adapt to Changing Roles.
ERIC Educational Resources Information Center
Miido, Helis
Traditional library tasks, for example database searching, are increasingly performed by library users, forcing both the librarian and the user to assume at times dichotomous roles of teacher and student. Modern librarians install new software and guide organizations in multimedia applications. Librarians need to be cognizant of the human factor,…
Height modernization program and subsidence study in northern Ohio.
DOT National Transportation Integrated Search
2013-11-01
This study is an initiative focused on establishing accurate, reliable heights using Global Navigation Satellite System (GNSS) technology in conjunction with traditional leveling, gravity, and modern remote sensing information. The traditional method...
NASA Astrophysics Data System (ADS)
Wang, Lusheng; Yang, Yong; Lin, Guohui
Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects can be very time consuming; for example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pair-wise distance between two objects in the database is known and we want to minimize the number of distances computed on-line between the query and objects in the database in order to find the closest object. We have designed two randomized approaches for indexing metric space databases, where objects are purely described by their distances to each other. Analysis and experiments show that our approaches need to compute distances to only O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
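The paper's specific randomized indexing schemes are not reproduced here, but the general idea of spending a few precomputed pivot distances to prune expensive on-line distance computations via the triangle inequality can be sketched in Python as follows (the metric and the data are toy placeholders):

    # Sketch of triangle-inequality pruning with random pivots (illustrative
    # only, not the paper's index). distance() stands in for an expensive
    # comparison such as a whole-chromosome edit distance.
    import random

    def distance(a, b):                      # placeholder metric
        return abs(a - b)

    def build_index(objects, num_pivots=3):
        pivots = random.sample(objects, num_pivots)
        # Precompute (off-line) distances from every object to each pivot.
        table = {o: [distance(o, p) for p in pivots] for o in objects}
        return pivots, table

    def nearest(query, objects, pivots, table):
        q_to_p = [distance(query, p) for p in pivots]   # on-line distances
        best, best_d, computed = None, float("inf"), len(pivots)
        for o in objects:
            # Lower bound on d(query, o) from the triangle inequality.
            lower = max(abs(q_to_p[i] - table[o][i]) for i in range(len(pivots)))
            if lower >= best_d:
                continue                      # pruned: no distance computed
            d = distance(query, o)
            computed += 1
            if d < best_d:
                best, best_d = o, d
        return best, best_d, computed

    objs = list(range(0, 1000, 7))
    pv, tb = build_index(objs)
    print(nearest(303, objs, pv, tb))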
A curated database of cyanobacterial strains relevant for modern taxonomy and phylogenetic studies.
Ramos, Vitor; Morais, João; Vasconcelos, Vitor M
2017-04-25
The dataset herein described lays the groundwork for an online database of relevant cyanobacterial strains, named CyanoType (http://lege.ciimar.up.pt/cyanotype). It is a database that includes categorized cyanobacterial strains useful for taxonomic, phylogenetic or genomic purposes, with associated information obtained by means of a literature-based curation. The dataset lists 371 strains and represents the first version of the database (CyanoType v.1). Information for each strain includes strain synonymy and/or co-identity, strain categorization, habitat, accession numbers for molecular data, taxonomy and nomenclature notes according to three different classification schemes, hierarchical automatic classification, phylogenetic placement according to a selection of relevant studies (including this), and important bibliographic references. The database will be updated periodically, namely by adding new strains meeting the criteria for inclusion and by revising and adding up-to-date metadata for strains already listed. A global 16S rDNA-based phylogeny is provided in order to assist users when choosing the appropriate strains for their studies. PMID:28440791
PHASE I MATERIALS PROPERTY DATABASE DEVELOPMENT FOR ASME CODES AND STANDARDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Lin, Lianshan
2013-01-01
To support the ASME Boiler and Pressure Vessel Codes and Standards (BPVC) in the modern information era, development of a web-based materials property database has been initiated under the supervision of the ASME Committee on Materials. To achieve efficiency, the project draws heavily upon experience from development of the Gen IV Materials Handbook and the Nuclear System Materials Handbook. The effort is divided into two phases. Phase I is planned to deliver a materials data file warehouse that offers a depository for various files containing raw data and background information, and Phase II will provide a relational digital database with advanced features facilitating digital data processing and management. Population of the database will start with materials property data for nuclear applications and expand to data covering the entire ASME Codes and Standards, including the piping codes, as the database structure is continuously optimized. The ultimate goal of the effort is to establish a sound cyberinfrastructure that supports ASME Codes and Standards development and maintenance.
NASA Astrophysics Data System (ADS)
Kong, Xiang-Zhao; Tutolo, Benjamin M.; Saar, Martin O.
2013-02-01
SUPCRT92 is a widely used software package for calculating the standard thermodynamic properties of minerals, gases, aqueous species, and reactions. However, it is labor-intensive and error-prone to use it directly to produce databases for geochemical modeling programs such as EQ3/6, the Geochemist's Workbench, and TOUGHREACT. DBCreate is a SUPCRT92-based software program written in FORTRAN90/95 and was developed in order to produce the required databases for these programs in a rapid and convenient way. This paper describes the overall structure of the program and provides detailed usage instructions.
NASA Astrophysics Data System (ADS)
Battersby, Cara
2016-01-01
Many students graduate high school having never learned about the process and people behind modern science research. The BiteScis program addresses this gap by providing easily implemented lesson plans that incorporate the whos, whats, and hows of today's scientific discoveries. We bring together practicing scientists (motivated graduate students from the selective communicating science conference, ComSciCon) with K-12 science teachers to produce, review, and disseminate K-12 lesson plans based on modern science research. These lesson plans vary in topic from environmental science to neurobiology to astrophysics, and involve a range of activities from laboratory exercises to art projects, debates, or group discussion. An integral component of the program is a series of short, "bite-size" articles on modern science research written for K-12 students. The "bite-size" articles and lesson plans will be made freely available online in an easily searchable web interface that includes association with a variety of curriculum standards. This ongoing program is in its first year, with about 15 lesson plans produced to date.
Speizer, Ilene S; Corroon, Meghan; Calhoun, Lisa; Lance, Peter; Montana, Livia; Nanda, Priya; Guilkey, David
2014-11-06
Family planning is crucial for preventing unintended pregnancies and for improving maternal and child health and well-being. In urban areas where there are large inequities in family planning use, particularly among the urban poor, programs are needed to increase access to and use of contraception among those most in need. This paper presents the midterm evaluation findings of the Urban Reproductive Health Initiative (Urban RH Initiative) programs, funded by the Bill & Melinda Gates Foundation, that are being implemented in 4 countries: India (Uttar Pradesh), Kenya, Nigeria, and Senegal. Between 2010 and 2013, the Measurement, Learning & Evaluation (MLE) project collected baseline and 2-year longitudinal follow-up data from women in target study cities to examine the role of demand generation activities undertaken as part of the Urban RH Initiative programs. Evaluation results demonstrate that, in each country where it was measured, outreach by community health or family planning workers as well as local radio programs were significantly associated with increased use of modern contraceptive methods. In addition, in India and Nigeria, television programs had a significant effect on modern contraceptive use, and in Kenya and Nigeria, the program slogans and materials that were blanketed across the cities (eg, leaflets/brochures distributed at health clinics and the program logo placed on all forms of materials, from market umbrellas to health facility signs and television programs) were also significantly associated with modern method use. Our results show that targeted, multilevel demand generation activities can make an important contribution to increasing modern contraceptive use in urban areas and could impact Millennium Development Goals for improved maternal and child health and access to reproductive health for all. © Speizer et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly cited. To view a copy of the license, visit http://creativecommons.org/licenses/by/3.0/. When linking to this article, please use the following permanent link: http://dx.doi.org/10.9745/GHSP-D-14-00109.
Modern Gemini-Approach to Technology Development for Human Space Exploration
NASA Technical Reports Server (NTRS)
White, Harold
2010-01-01
In NASA's plan to put men on the moon, there were three sequential programs: Mercury, Gemini, and Apollo. The Gemini program was used to develop and integrate the technologies that would be necessary for the Apollo program to successfully put men on the moon. We would like to present an analogous modern approach that leverages legacy ISS hardware designs and integrates newly developed technologies into a flexible architecture. This new architecture is scalable and sustainable and can be used to establish human exploration infrastructure beyond low Earth orbit and into deep space.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waddell, Lucas; Muldoon, Frank; Henry, Stephen Michael
In order to effectively plan the management and modernization of their large and diverse fleets of vehicles, Program Executive Office Ground Combat Systems (PEO GCS) and Program Executive Office Combat Support and Combat Service Support (PEO CS&CSS) commissioned the development of a large-scale portfolio planning optimization tool. This software, the Capability Portfolio Analysis Tool (CPAT), creates a detailed schedule that optimally prioritizes the modernization or replacement of vehicles within the fleet - respecting numerous business rules associated with fleet structure, budgets, industrial base, research and testing, etc., while maximizing overall fleet performance through time. This paper contains a thorough documentation of the terminology, parameters, variables, and constraints that comprise the fleet management mixed integer linear programming (MILP) mathematical formulation. This paper, which is an update to the original CPAT formulation document published in 2015 (SAND2015-3487), covers the formulation of important new CPAT features.
Network-based statistical comparison of citation topology of bibliographic databases
Šubelj, Lovro; Fiala, Dalibor; Bajec, Marko
2014-01-01
Modern bibliographic databases provide the basis for scientific research and its evaluation. While their content and structure differ substantially, there exist only informal notions on their reliability. Here we compare the topological consistency of citation networks extracted from six popular bibliographic databases including Web of Science, CiteSeer and arXiv.org. The networks are assessed through a rich set of local and global graph statistics. We first reveal statistically significant inconsistencies between some of the databases with respect to individual statistics. For example, the introduced field bow-tie decomposition of DBLP Computer Science Bibliography substantially differs from the rest due to the coverage of the database, while the citation information within arXiv.org is the most exhaustive. Finally, we compare the databases over multiple graph statistics using the critical difference diagram. The citation topology of DBLP Computer Science Bibliography is the least consistent with the rest, while, not surprisingly, Web of Science is significantly more reliable from the perspective of consistency. This work can serve either as a reference for scholars in bibliometrics and scientometrics or a scientific evaluation guideline for governments and research agencies. PMID:25263231
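As a hedged illustration of comparing citation networks on a few local and global graph statistics (a toy Python/NetworkX sketch with made-up edge lists, far simpler than the rich statistic set and bow-tie decomposition used in the study above):

    # Illustrative sketch: summarize two toy citation networks with a few
    # simple statistics. Edge lists are invented placeholders.
    import networkx as nx

    def summarize(edges):
        g = nx.DiGraph(edges)
        return {
            "nodes": g.number_of_nodes(),
            "edges": g.number_of_edges(),
            "mean_out_degree": sum(d for _, d in g.out_degree()) / g.number_of_nodes(),
            "clustering": nx.average_clustering(g.to_undirected()),
            "largest_wcc": len(max(nx.weakly_connected_components(g), key=len)),
        }

    db_a = [(1, 2), (2, 3), (3, 1), (4, 2), (5, 2)]
    db_b = [(1, 2), (2, 3), (4, 5), (5, 6), (6, 4)]
    for name, edges in [("db_a", db_a), ("db_b", db_b)]:
        print(name, summarize(edges))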
Database of Mechanical Properties of Textile Composites
NASA Technical Reports Server (NTRS)
Delbrey, Jerry
1996-01-01
This report describes the approach followed to develop a database for mechanical properties of textile composites. The data in this database is assembled from NASA Advanced Composites Technology (ACT) programs and from data in the public domain. This database meets the data documentation requirements of MIL-HDBK-17, Section 8.1.2, which describes in detail the type and amount of information needed to completely document composite material properties. The database focuses on mechanical properties of textile composite. Properties are available for a range of parameters such as direction, fiber architecture, materials, environmental condition, and failure mode. The composite materials in the database contain innovative textile architectures such as the braided, woven, and knitted materials evaluated under the NASA ACT programs. In summary, the database contains results for approximately 3500 coupon level tests, for ten different fiber/resin combinations, and seven different textile architectures. It also includes a limited amount of prepreg tape composites data from ACT programs where side-by-side comparisons were made.
Taking control of your digital library: how modern citation managers do more than just referencing.
Mahajan, Amit K; Hogarth, D Kyle
2013-12-01
Physicians are constantly navigating the overwhelming body of medical literature available on the Internet. Although early citation managers were capable of limited searching of index databases and tedious bibliography production, modern versions of citation managers such as EndNote, Zotero, and Mendeley are powerful web-based tools for searching, organizing, and sharing medical literature. Effortless point-and-click functions provide physicians with the ability to develop robust digital libraries filled with literature relevant to their fields of interest. In addition to easily creating manuscript bibliographies, various citation managers allow physicians to readily access medical literature, share references for teaching purposes, collaborate with colleagues, and even participate in social networking. If physicians are willing to invest the time to familiarize themselves with modern citation managers, they will reap great benefits in the future.
From experimental imaging techniques to virtual embryology.
Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis
2004-01-01
Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To meet the technical requirements, new methods for highly detailed in vivo imaging, as well as for the generation of high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, have recently been developed or are under construction. These methods profoundly change the scientific applicability, appearance, and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise, and administer embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.
Enabling On-Demand Database Computing with MIT SuperCloud Database Management System
2015-09-15
arc.liv.ac.uk/trac/SGE) provides these services and is independent of programming language (C, Fortran, Java, Matlab, etc.) or parallel programming...a MySQL database to store DNS records. The DNS records are controlled via a simple web service interface that allows records to be created
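A minimal sketch of the kind of record table and lookup described above (the system itself uses MySQL behind a web service; the schema and names below are hypothetical, and SQLite is used here only to keep the example self-contained):

    # Hypothetical sketch of a DNS-record table with create/lookup helpers.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE dns_records (
                        name TEXT PRIMARY KEY,
                        type TEXT NOT NULL,
                        value TEXT NOT NULL)""")

    def create_record(name, rtype, value):
        # Insert or update a record (the real system exposes this via a web service).
        conn.execute("INSERT OR REPLACE INTO dns_records VALUES (?, ?, ?)",
                     (name, rtype, value))
        conn.commit()

    def lookup(name):
        return conn.execute("SELECT type, value FROM dns_records WHERE name = ?",
                            (name,)).fetchone()

    create_record("db-node-01.example.org", "A", "10.0.0.17")
    print(lookup("db-node-01.example.org"))   # ('A', '10.0.0.17')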
A Tutorial in Creating Web-Enabled Databases with Inmagic DB/TextWorks through ODBC.
ERIC Educational Resources Information Center
Breeding, Marshall
2000-01-01
Explains how to create Web-enabled databases. Highlights include Inmagic's DB/Text WebPublisher product called DB/TextWorks; ODBC (Open Database Connectivity) drivers; Perl programming language; HTML coding; Structured Query Language (SQL); Common Gateway Interface (CGI) programming; and examples of HTML pages and Perl scripts. (LRW)
Enabling GEODSS for Space Situational Awareness (SSA)
NASA Astrophysics Data System (ADS)
Wootton, S.
2016-09-01
The Ground-Based Electro-Optical Deep Space Surveillance (GEODSS) System has been in operation since the mid-1980's. While GEODSS has been the Space Surveillance Network's (SSN's) workhorse in terms of deep space surveillance, it has not undergone a significant modernization since the 1990's. This means GEODSS continues to operate under a mostly obsolete, legacy data processing baseline. The System Program Office (SPO) responsible for GEODSS, SMC/SYGO, has a number of advanced Space Situational Awareness (SSA)-related efforts in progress, in the form of innovative optical capabilities, data processing algorithms, and hardware upgrades. Each of these efforts is in various stages of evaluation and acquisition. These advanced capabilities rely upon a modern computing environment in which to integrate, but GEODSS does not have one—yet. The SPO is also executing a Service Life Extension Program (SLEP) to modernize the various subsystems within GEODSS, along with a parallel effort to implement a complete, modern software re-architecture. The goal is to use a modern, service-based architecture to provide expedient integration as well as easier and more sustainable expansion. This presentation will describe these modernization efforts in more detail and discuss how adopting such modern paradigms and practices will help ensure the GEODSS system remains relevant and sustainable far beyond 2027.
The future of medical diagnostics: large digitized databases.
Kerr, Wesley T; Lau, Edward P; Owens, Gwen E; Trefler, Aaron
2012-09-01
The electronic health record mandate within the American Recovery and Reinvestment Act of 2009 will have a far-reaching effect on medicine. In this article, we provide an in-depth analysis of how this mandate is expected to stimulate the production of large-scale, digitized databases of patient information. There is evidence to suggest that millions of patients and the National Institutes of Health will fully support the mining of such databases to better understand the process of diagnosing patients. This data mining likely will reaffirm and quantify known risk factors for many diagnoses. This quantification may be leveraged to further develop computer-aided diagnostic tools that weigh risk factors and provide decision support for health care providers. We expect that creation of these databases will stimulate the development of computer-aided diagnostic support tools that will become an integral part of modern medicine.
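As a purely illustrative sketch of decision support that weighs risk factors (a toy logistic score with invented factor names and weights, not a clinical model and not derived from any real database):

    # Toy sketch: combine hypothetical risk-factor weights into a logistic
    # probability. Weights and intercept are invented for illustration.
    import math

    weights = {"age_over_65": 1.2, "smoker": 0.9, "hypertension": 0.7}
    intercept = -3.0

    def risk_probability(patient):
        z = intercept + sum(weights[f] for f, present in patient.items() if present)
        return 1.0 / (1.0 + math.exp(-z))

    patient = {"age_over_65": True, "smoker": False, "hypertension": True}
    print(f"estimated risk: {risk_probability(patient):.2f}")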
The quest for the perfect gravity anomaly: Part 1 - New calculation standards
Li, X.; Hildenbrand, T.G.; Hinze, W. J.; Keller, Gordon R.; Ravat, D.; Webring, M.
2006-01-01
The North American gravity database, together with databases from Canada, Mexico, and the United States, is being revised to improve its coverage, versatility, and accuracy. An important part of this effort is the revision of procedures and standards for calculating gravity anomalies, taking into account our enhanced computational power, modern satellite-based positioning technology, improved terrain databases, and increased interest in more accurately defining different anomaly components. The most striking revision is the use of a single internationally accepted reference ellipsoid for the horizontal and vertical datums of gravity stations as well as for the computation of the theoretical gravity. The new standards hardly affect the interpretation of local anomalies but do improve regional anomalies. Most importantly, the new standards can be applied consistently to gravity database compilations of nations, continents, and even the entire world. © 2005 Society of Exploration Geophysicists.
Delshad, Elahe; Yousefi, Mahdi; Sasannezhad, Payam; Rakhshandeh, Hasan; Ayati, Zahra
2018-04-01
Carthamus tinctorius L., known as Kafesheh (Persian) and safflower (English), is widely used in traditional medicine for various medical conditions, namely dysmenorrhea, amenorrhea, postpartum abdominal pain and mass, trauma, and joint pain. It is also largely used for flavoring and coloring purposes among the local population. Recent reviews have addressed the uses of the plant in various ethnomedical systems. This review provides an updated summary of the botanical features, uses in Iranian folk medicine, and modern medical applications of safflower. A main database on the ethnopharmacology and modern pharmacology of C. tinctorius was established from important early texts written in Persian together with electronic papers. The literature review covered the years 1937 to 2016 in Web of Science, PubMed, the Scientific Information Database, Google Scholar, and Scopus, using the terms "Kafesheh", "safflower", "Carthamus tinctorius", and so forth. Safflower is an indispensable element of Iranian folk medicine, with a variety of applications due to its laxative effects. It was also recommended as a treatment for rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, melancholy humor, and the like. According to modern pharmacological and clinical examinations, safflower provides promising opportunities for the amelioration of myocardial ischemia, coagulation, thrombosis, inflammation, toxicity, cancer, and so forth. However, there have been some reports of undesirable effects on male and female fertility. Most of these beneficial therapeutic effects have been attributed to hydroxysafflor yellow A. Attention should also be drawn to the lack of a thorough phytochemical investigation. The potential implications of safflower based on Persian traditional medicine, such as the treatment of rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, and melancholy humor, warrant further consideration.
Network Configuration of Oracle and Database Programming Using SQL
NASA Technical Reports Server (NTRS)
Davis, Melton; Abdurrashid, Jibril; Diaz, Philip; Harris, W. C.
2000-01-01
A database can be defined as a collection of information organized in such a way that it can be retrieved and used. A database management system (DBMS) can further be defined as the tool that enables us to manage and interact with the database. The Oracle 8 Server is a state-of-the-art information management environment. It is a repository for very large amounts of data, and gives users rapid access to that data. The Oracle 8 Server allows for sharing of data between applications; the information is stored in one place and used by many systems. My research will focus primarily on SQL (Structured Query Language) programming. SQL is the way you define and manipulate data in Oracle's relational database. SQL is the industry standard adopted by all database vendors. When programming with SQL, you work on sets of data (i.e., information is not processed one record at a time).
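To illustrate the set-at-a-time style of SQL described above (a generic sketch using Python's built-in sqlite3 rather than Oracle 8; the table and data are invented for the example):

    # Illustration of SQL's set-at-a-time style: one declarative statement
    # operates on the whole set of rows, not one record at a time.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT, salary REAL);
        INSERT INTO employees VALUES (1, 'Lee',   'ENG', 70000),
                                     (2, 'Patel', 'ENG', 82000),
                                     (3, 'Kim',   'OPS', 65000);
    """)
    for dept, n, avg in conn.execute(
            "SELECT dept, COUNT(*), AVG(salary) FROM employees GROUP BY dept"):
        print(dept, n, round(avg, 1))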
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
Implementation of a data management software system for SSME test history data
NASA Technical Reports Server (NTRS)
Abernethy, Kenneth
1986-01-01
The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST, to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could use INSIGHT2 as well and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.
Morgan, Perri; Humeniuk, Katherine M; Everett, Christine M
2015-09-01
As physician assistant (PA) roles expand and diversify in the United States and around the world, there is a pressing need for research that illuminates how PAs may best be selected, educated, and used in health systems to maximize their potential contributions to health. Physician assistant education programs are well positioned to advance this research by collecting and organizing data on applicants, students, and graduates. Our PA program is creating a permanent longitudinal education database for research that contains extensive student-level data. This database will allow us to conduct research on all phases of PA education, from admission processes through the professional practice of our graduates. In this article, we describe our approach to constructing a longitudinal student-level research database and discuss the strengths and limitations of longitudinal databases for research on education and the practice of PAs. We hope to encourage other PA programs to initiate similar projects so that, in the future, data can be combined for use in multi-institutional research that can contribute to improved education for PA students across programs.
NASA Technical Reports Server (NTRS)
Cotter, Gladys A.
1993-01-01
Foreign competitors are challenging the world leadership of the U.S. aerospace industry, and increasingly tight budgets everywhere make international cooperation in aerospace science necessary. The NASA STI Program has as part of its mission to support NASA R&D, and to that end has developed a knowledge base of aerospace-related information known as the NASA Aerospace Database. The NASA STI Program is already involved in international cooperation with NATO/AGARD/TIP, CENDI, ICSU/ICSTI, and the U.S. Japan Committee on STI. With the new more open political climate, the perceived dearth of foreign information in the NASA Aerospace Database, and the development of the ESA database and DELURA, the German databases, the NASA STI Program is responding by sponsoring workshops on foreign acquisitions and by increasing its cooperation with international partners and with other U.S. agencies. The STI Program looks to the future of improved database access through networking and a GUI; new media; optical disk, video, and full text; and a Technology Focus Group that will keep the NASA STI Program current with technology.
Uzunovic, Slavoljub; Kostic, Radmila; Zivkovic, Dobrica
2010-09-01
This study aimed to determine the effects of two different programs of modern sports dancing on coordination, strength, and speed in 60 beginner-level female dancers, aged 13 and 14 yrs. The subjects were divided into two experimental groups (E1 and E2), each numbering 30 subjects, drawn from local dance clubs. In order to determine motor coordination, strength, and speed, we used 15 measurements. The groups were tested before and after the experimental programs. Both experimental programs lasted for 18 wks, with training sessions twice a week for 60 minutes. The subjects in the E1 group trained according to a new experimental disco dance (DD) program of modern sports dance, and the E2 group trained according to the classic DD program of the same kind for beginner-level dancers. The results were assessed by statistical analysis: a paired-samples t-test and MANCOVA/ANCOVA. The results indicated that following the experimental programs, both groups showed a statistically significant improvement in the evaluated skills, but the changes among the E1 group subjects were more pronounced. The basic assumption of this research, that the new experimental DD program has a significant influence on coordination, strength, and speed, was confirmed. In light of these changes, the new DD program is recommended for beginner dancers.
U.S. Army Modernizes Munitions Plants
ERIC Educational Resources Information Center
Environmental Science and Technology, 1972
1972-01-01
Headquartered at Joliet, Illinois, the Army Ammunition Procurement and Supply Agency aims to mechanize and clean up its manufacturing facilities. Six go-co (government owned - contractor operated) plants involved in the modernization program are described. (BL)
Rural Water Quality Database: Educational Program to Collect Information.
ERIC Educational Resources Information Center
Lemley, Ann; Wagenet, Linda
1993-01-01
A New York State project created a water quality database for private drinking water supplies, using the statewide educational program to collect the data. Another goal was to develop this program so rural residents could increase their knowledge of water supply management. (Author)
Osteoporosis therapies: evidence from health-care databases and observational population studies.
Silverman, Stuart L
2010-11-01
Osteoporosis is a well-recognized disease with severe consequences if left untreated. Randomized controlled trials are the most rigorous method for determining the efficacy and safety of therapies. Nevertheless, randomized controlled trials underrepresent the real-world patient population and are costly in both time and money. Modern technology has enabled researchers to use information gathered from large health-care or medical-claims databases to assess the practical utilization of available therapies in appropriate patients. Observational database studies lack randomization but, if carefully designed and successfully completed, can provide valuable information that complements results obtained from randomized controlled trials and extends our knowledge to real-world clinical patients. Randomized controlled trials comparing fracture outcomes among osteoporosis therapies are difficult to perform. In this regard, large observational database studies could be useful in identifying clinically important differences among therapeutic options. Database studies can also provide important information with regard to osteoporosis prevalence, health economics, and compliance and persistence with treatment. This article describes the strengths and limitations of both randomized controlled trials and observational database studies, discusses considerations for observational study design, and reviews a wealth of information generated by database studies in the field of osteoporosis.
Carvajal-Rodríguez, Antonio
2012-07-01
Mutate is a program developed for teaching purposes to impart a virtual laboratory class for undergraduate students of Genetics in Biology. The program emulates the so-called fluctuation test, whose aim is to distinguish between spontaneous and adaptive mutation hypotheses in bacteria. The plan is to train students in certain key multidisciplinary aspects of current genetics, such as sequence databases, DNA mutations, and hypothesis testing, while introducing the fluctuation test. This seminal experiment was originally performed studying Escherichia coli resistance to infection by bacteriophage T1. The fluctuation test initiated modern bacterial genetics, which 25 years later ushered in the era of recombinant DNA. Nowadays we know that some deletions in fhuA, the gene responsible for the E. coli membrane receptor of T1, can cause E. coli resistance to this phage. For the sake of simplicity, we will introduce the assumption that a single mutation generates resistance to T1. During the practical, the students use the program to download some fhuA gene sequences, manually introduce some stop codon mutations, and design a fluctuation test to obtain data for distinguishing between preadaptive (spontaneous) and induced (adaptive) mutation hypotheses. The program can be launched from a browser or, if preferred, its executable file can be downloaded from http://webs.uvigo.es/acraaj/MutateWeb/Mutate.html. It requires the Java 5.0 (or higher) Runtime Environment (freely available at http://www.java.com). Copyright © 2012 Wiley Periodicals, Inc.
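The statistical idea behind the fluctuation test can be sketched in a few lines of code. The following Python snippet is an illustration only (not the Mutate program, which is a Java application): it simulates parallel cultures under the two hypotheses. Under the preadaptive (spontaneous) hypothesis, mutants arise at random during growth and their descendants keep doubling, producing occasional "jackpot" cultures and a variance far exceeding the mean; under the induced (adaptive) hypothesis, resistance appears only on exposure to phage T1, so counts are roughly Poisson and the variance stays close to the mean. All parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def spontaneous_culture(n0=100, generations=20, mu=1e-7):
    """Preadaptive hypothesis: mutants can arise at any division during
    growth and then keep doubling, so occasional early 'jackpots' occur."""
    sensitive, resistant = n0, 0
    for _ in range(generations):
        sensitive *= 2
        resistant *= 2
        new_mutants = rng.binomial(sensitive, mu)  # mutations at this division
        sensitive -= new_mutants
        resistant += new_mutants
    return resistant

def induced_culture(n0=100, generations=20, mu=1e-7):
    """Adaptive hypothesis: resistance is induced only on exposure to T1,
    so mutant counts across cultures are roughly Poisson."""
    return rng.binomial(n0 * 2 ** generations, mu)

def variance_to_mean(counts):
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

spont = [spontaneous_culture() for _ in range(50)]
induc = [induced_culture() for _ in range(50)]
print("spontaneous var/mean:", round(variance_to_mean(spont), 2))  # typically >> 1
print("induced     var/mean:", round(variance_to_mean(induc), 2))  # close to 1
```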
Nuclear weapons modernizations
NASA Astrophysics Data System (ADS)
Kristensen, Hans M.
2014-05-01
This article reviews the nuclear weapons modernization programs underway in the world's nine nuclear weapons states. It concludes that despite significant reductions in overall weapons inventories since the end of the Cold War, the pace of reductions is slowing - four of the nuclear weapons states are even increasing their arsenals, and all the nuclear weapons states are busy modernizing their remaining arsenals in what appears to be a dynamic and counterproductive nuclear competition. The author questions whether perpetual modernization combined with no specific plan for the elimination of nuclear weapons is consistent with the nuclear Non-Proliferation Treaty and concludes that new limits on nuclear modernizations are needed.
24 CFR 968.210 - Procedures for obtaining approval of a modernization program.
Code of Federal Regulations, 2011 CFR
2011-04-01
... of Modernization capability as defined at § 968.205. (c) ACC amendment. HUD and the PHA shall enter into an ACC amendment in order for the PHA to draw down modernization funds. The ACC amendment shall require low-income use of the housing for not less than 20 years from the date of the ACC amendment...
24 CFR 968.210 - Procedures for obtaining approval of a modernization program.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of Modernization capability as defined at § 968.205. (c) ACC amendment. HUD and the PHA shall enter into an ACC amendment in order for the PHA to draw down modernization funds. The ACC amendment shall require low-income use of the housing for not less than 20 years from the date of the ACC amendment...
ANALOG: a program for estimating paleoclimate parameters using the method of modern analogs
Schweitzer, Peter N.
1994-01-01
Beginning in the 1970s with CLIMAP, paleoclimatologists have been trying to derive quantitative estimates of climatic parameters from the sedimentary record. In general the procedure is to observe the modern distribution of some component of surface sediment that depends on climate, find an empirical relationship between climate and the character of sediments, then extrapolate past climate by studying older sediments in the same way. Initially the empirical relationship between climate and components of the sediment was determined using a multiple regression technique (Imbrie and Kipp, 1971). In these studies sea-floor sediments were examined to determine the percentage of various species of planktonic foraminifera present in them. Supposing that the distribution of foraminiferal assemblages depended strongly on the extremes of annual sea-surface temperature (SST), the foraminiferal assemblages (refined through use of varimax factor analysis) were regressed against the average SST during the coolest and warmest months of the year. The result was a set of transfer functions, equations that could be used to estimate cool and warm SST from the faunal composition of a sediment sample. Assuming that the ecological preference of the species had remained constant throughout the last several hundred thousand years, these transfer functions could be used to estimate SSTs during much of the late Pleistocene. Hutson (1980) and Overpeck, Webb, and Prentice (1985) proposed an alternative approach to estimating paleoclimatic parameters. Their 'method of modern analogs' revolved not around the existence of a few climatically-sensitive faunal assemblages but rather on the expectation that similar climatic regimes should foster similar faunal and floral assemblages. From a large pool of modern samples, those few are selected whose faunal compositions are most similar to a given fossil sample. Paleoclimate estimates are derived using the climatic character of only the most similar modern samples, the modern analogs of the fossil sample. This report describes how to use the program ANALOG to carry out the method of modern analogs. It is assumed that the user has faunal census estimates of one or more fossil samples, and one or more sets of faunal data from modern samples. Furthermore, the user must understand the taxonomic categories represented in the data sets, and be able to recognize taxa that are or may be considered equivalent in the analysis. ANALOG provides the user with flexibility in input data format, output data content, and choice of distance measure, and allows the user to determine which taxa from each modern and fossil data file are compared. Most of the memory required by the program is allocated dynamically, so that, on systems that permit program segments to grow, the program consumes only as many system resources as are needed to accomplish its task.
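The core of the method of modern analogs is a dissimilarity search followed by a weighted average. The sketch below is illustrative only and is not ANALOG's interface or file handling: it uses the squared chord distance, one dissimilarity measure commonly applied to faunal census data, finds the k most similar modern samples, and estimates the fossil sample's sea-surface temperature as an inverse-distance-weighted mean of their modern values. The toy assemblages and SSTs are invented.

```python
import numpy as np

def squared_chord_distance(a, b):
    """Dissimilarity between two assemblages given as relative abundances."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

def modern_analog_estimate(fossil, modern_fauna, modern_sst, k=5):
    """Estimate paleotemperature for one fossil sample from its k most
    similar modern samples (inverse-distance-weighted mean)."""
    d = np.array([squared_chord_distance(fossil, m) for m in modern_fauna])
    nearest = np.argsort(d)[:k]                   # the k modern analogs
    weights = 1.0 / np.maximum(d[nearest], 1e-9)  # avoid division by zero
    return np.average(modern_sst[nearest], weights=weights), d[nearest]

# Toy data: 4 taxa, 6 modern core-top samples with known warm-season SSTs.
modern_fauna = np.array([
    [0.60, 0.20, 0.15, 0.05],
    [0.55, 0.25, 0.15, 0.05],
    [0.30, 0.30, 0.30, 0.10],
    [0.10, 0.20, 0.40, 0.30],
    [0.05, 0.15, 0.45, 0.35],
    [0.50, 0.30, 0.10, 0.10],
])
modern_sst = np.array([27.0, 26.5, 21.0, 14.0, 12.5, 25.0])
fossil = np.array([0.52, 0.28, 0.14, 0.06])

sst_est, dists = modern_analog_estimate(fossil, modern_fauna, modern_sst, k=3)
print(f"estimated warm-season SST: {sst_est:.1f} degC")
```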
The North American Bird Banding Program: Into the 21st century
Buckley, P.A.; Francis, C.M.; Blancher, P.; DeSante, D.F.; Robbins, C.S.; Smith, G.; Cannell, P.
1998-01-01
The authors examined the legal, scientific, and philosophical underpinnings of the North American Bird Banding Program [BBP], with emphasis on the U.S. Bird Banding Laboratory [BBL], but also considering the Canadian Bird Banding Office [BBO]. In this report, we review the value of banding data, enumerate and expand on the principles under which any modern BBP should operate, and from them derive our recommendations. These are cast into a Mission Statement, a Role and Function Statement, and a series of specific recommendations addressing five areas: (1) permitting procedures and practices; (2) operational issues; (3) data management; (4) BBL organization and staffing; and (5) implementation. Our major tenets and recommendations are as follows: banding provides valuable data for numerous scientific, management, and educational purposes, and its benefits far outweigh necessary biological and fiscal costs, especially those incurred by the BBL and BBO; because of the value of banding data for management of avian resources, including both game and nongame birds, government support of the program is fully justified and appropriate; all banding data, if collected to appropriate standards, are potentially valuable; there are many ways to increase the value of banding data such as by endorsing, promoting, and applying competence and/or training standards for permit issuance; promoting bander participation in well-designed projects; and by encouraging the use of banding data for meta-analytical approaches; the BBL should apply, promote, and encourage such standards, participation, and approaches; the BBP should be driven by the needs of users, including scientists and managers; all exchange of data and most communication between banders and the BBL should become electronic in the near future; the computer system at the BBL should be modernized to one designed for a true client-server relationship and storage of data in on-line relational databases; the BBL should continue to maintain high quality control and editing standards and should strive to bring all data in the database up to current standards; however, the BBL should transfer a major portion of the responsibility for editing banding data to the bander by providing software that will permit the bander to edit his/her own data electronically before submission to the BBL; the BBL should build the capacity to store additional data tied to original band records able to be pre-edited and submitted electronically, such as recapture data, appropriate data from auxiliary marking (e.g. resightings of color-marked birds), and other data that gain value when pooled from many banders (e.g., measurements); however, the BBL should only accept such data if they are collected using standardized methods and as part of an established program designed to utilize such data; now is the time to consider options for implementing a Western Hemisphere banding program, with leadership from the BBL; the Patuxent Electronic Data Processing Section should become part of the BBL; additional scientific and technical staff must be added to the BBL; an Implementation Team should be formed to expedite our recommendations, following timetables outlined in this document.
NASA Astrophysics Data System (ADS)
Husseini, B.; Bali, Z.
2015-08-01
Architectural heritage is a strong witness to a people's history that symbolizes their identity. The Old City of Jerusalem, a UNESCO World Heritage Site, is a living city, especially with its great wealth of historic structures, including places of worship for the three monotheistic religions, significant monuments, and whole historic residential neighbourhoods. In spite of the prevailing political conditions, the difficulties that Palestinians encounter in Jerusalem, and the demands of modern life and an ever-growing population, several attempts have been made to protect this heritage. A specialized program, the Old City of Jerusalem Revitalization Program (OCJRP), has been working since 1994. The program was established by the Welfare Association to help protect Jerusalem's cultural heritage, applying international conventions and the highest professional standards for the direct benefit of residents, building users, and visitors to the Old City, as well as for future generations. This paper aims to describe the various activities and main findings of the Technical Office of the OCJRP over the last twenty years, as well as the problems encountered by the team. It relies on the experience accumulated during the implementation of the projects and on the research, surveys, and studies undertaken by the team, which helped in the creation of the database and its ongoing development.
2016-03-24
Corporation found that increases in schedule effort tend to be the reason for increases in the cost of acquiring a new weapons system due to, at a minimum...in-depth finance and schedule data for selected programs (Brown et al., 2015). We also give extra focus on Research Development Test & Evaluation...we create and employ an entirely new database. The database we utilize for our research is a database originally built by the RAND Corporation for
ERIC Educational Resources Information Center
Liu, Xia; Liu, Lai C.; Koong, Kai S.; Lu, June
2003-01-01
Analysis of 300 information technology job postings in two Internet databases identified the following skill categories: programming languages (Java, C/C++, and Visual Basic were most frequent); website development (57% sought SQL and HTML skills); databases (nearly 50% required Oracle); networks (only Windows NT or wide-area/local-area networks);…
BRONX HEALTH EDUCATION PROJECT FOR WEST AFRICAN IMMIGRANTS.
Wilson, Rebecca Dover; Elgoghail, Nadia
2016-01-01
The transition from a traditional West African diet and lifestyle to a modern diet has a significant impact on health and the risk of chronic disease. The objective was to implement a health education program for West African immigrants in the U.S. to address health risks associated with the modern diet. A health education program model targeted at West African immigrants in the Bronx was developed based on existing health education programs, with educational materials, group education sessions, and targeted individual counseling. A health education program was successfully implemented at a clinic serving West African immigrant patients in the Bronx. This project demonstrates an example of a targeted health education program for West African immigrants to address health risks related to diet.
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications, dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be efficiently stored and accessed using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on extraction of landing site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
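As the abstract notes, several definitions of terrain roughness co-exist. A minimal sketch of one common choice, the standard deviation of elevation in a moving window, together with local relief, is given below; it is not the authors' toolset, and the DEM, window size, and threshold are purely illustrative.

```python
import numpy as np
from scipy.ndimage import generic_filter

def local_roughness(dem, window=5):
    """One common roughness proxy: standard deviation of elevation in a
    sliding window (other definitions, e.g. RMS slope, are also in use)."""
    return generic_filter(dem, np.std, size=window, mode="nearest")

def local_relief(dem, window=5):
    """Relief: elevation range (max - min) within the same window."""
    return (generic_filter(dem, np.max, size=window, mode="nearest")
            - generic_filter(dem, np.min, size=window, mode="nearest"))

# Toy DEM: a gentle ramp with a rough patch in the middle.
rng = np.random.default_rng(1)
dem = np.fromfunction(lambda r, c: 0.1 * c, (50, 50))
dem[20:30, 20:30] += rng.normal(0.0, 2.0, size=(10, 10))

roughness = local_roughness(dem)
relief = local_relief(dem)
smooth_enough = roughness < 0.5  # illustrative threshold for candidate cells
print("fraction of cells passing the roughness criterion:", smooth_enough.mean())
print("maximum local relief:", relief.max())
```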
The Steward Observatory asteroid relational database
NASA Technical Reports Server (NTRS)
Sykes, Mark V.; Alvarezdelcastillo, Elizabeth M.
1992-01-01
The Steward Observatory Asteroid Relational Database (SOARD) was created as a flexible tool for undertaking studies of asteroid populations and sub-populations, to probe the biases intrinsic to asteroid databases, to ascertain the completeness of data pertaining to specific problems, to aid in the development of observational programs, and to develop pedagogical materials. To date SOARD has compiled an extensive list of data available on asteroids and made it accessible through a single menu-driven database program. Users may obtain tailored lists of asteroid properties for any subset of asteroids or output files which are suitable for plotting spectral data on individual asteroids. A browse capability allows the user to explore the contents of any data file. SOARD offers, also, an asteroid bibliography containing about 13,000 references. The program has online help as well as user and programmer documentation manuals. SOARD continues to provide data to fulfill requests by members of the astronomical community and will continue to grow as data is added to the database and new features are added to the program.
The challenge of designing a database for auditing surgical in-patients.
Branday, J M; Crandon, I; Carpenter, R; Rhoden, A; Meeks-Aitken, N
1999-12-01
Surgical audit is imperative in modern practice, particularly in the developing world where resources are limited and efficient allocation important. The structure, process and outcome of surgical care can be determined for quality assurance or for research. Improved efficiency and reduction of morbidity and mortality are additional goals which may be accomplished. However, computerization, medical staff cooperation and the availability of dedicated staff are among the hurdles which may be encountered. We report the challenge of designing and establishing a database for auditing surgical inpatients in a developing country and the difficulties which were encountered.
NASA Technical Reports Server (NTRS)
Ebersole, M. M.
1983-01-01
JPL's management and administrative support systems have been developed piecemeal and without consistency in design approach over the past twenty years. These systems are now proving to be inadequate to support effective management of tasks and administration of the Laboratory. New approaches are needed. Modern database management technology has the potential to provide the foundation for more effective administrative tools for JPL managers and administrators. Plans for upgrading JPL's management and administrative systems over a six-year period, revolving around the development of an integrated management and administrative database, are discussed.
[Medical support on human resources and clinical laboratory in Myanmar].
Koide, Norio
2012-03-01
I have been involved in medical cooperation programs between Myanmar and Japan for over 10 years. The purpose of my first visit to Myanmar was to investigate hepatitis C spreading among thalassemia patients. I learned that the medical system was underdeveloped in this country, and have since initiated several cooperation programs together with Professor Shigeru Okada, such as "Protection against hepatitis C in Myanmar", "Scientist exchange between the Ministry of Health, Myanmar and Okayama University", and "Various activities sponsored by a Non-Profit Organization". As for clinical laboratories, the laboratory system itself is still at a pre-construction stage, and the benefits of a clinical laboratory in modern medicine do not yet reach patients in Myanmar. The donation of drugs and reagents for laboratory tests is helpful, but it will be more helpful to help future leaders learn modern medicine and develop their own systems to support modern medicine. Our activities within the cooperation program are described.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Proposed Collection; Comment Request (60-Day FRN); The Clinical Trials Reporting Program (CTRP) Database (NCI) SUMMARY: In compliance... publication. Proposed Collection: The Clinical Trials Reporting Program (CTRP) Database, 0925-0600, Expiration...
Do Apprentices' Communities of Practice Block Unwelcome Knowledge?
ERIC Educational Resources Information Center
Sligo, Frank; Tilley, Elspeth; Murray, Niki
2011-01-01
Purpose: This study aims to examine how well print-literacy support being provided to New Zealand Modern Apprentices (MAs) is supporting their study and practical work. Design/methodology/approach: The authors undertook a qualitative analysis of a database of 191 MAs in the literacy programme, then in 14 case studies completed 46 interviews with…
Learners' Reflections in Technological Learning Environments: Why To Promote and How To Evaluate.
ERIC Educational Resources Information Center
Rimor, Rikki; Kozminsky, Ely
In this study, 24 9th-grade students investigated several issues related to modern Israeli society. In their investigation, students were engaged in activities such as data search, data sorting, making inquiries, project writing, and construction of a new computerized database related to the subjects of their investigations. Students were…
The Design and Realization of Net Testing System on Campus Network
ERIC Educational Resources Information Center
Ren, Zhanying; Liu, Shijie
2005-01-01
According to the requirements of modern teaching theory and technology, and based on software engineering, database theory, network information security techniques, and system integration, a net testing system on a local network was designed and realized. The system benefits the separation of testing and teaching and settles the problems of random…
Educational System Efficiency Improvement Using Knowledge Discovery in Databases
ERIC Educational Resources Information Center
Lukaš, Mirko; Leškovic, Darko
2007-01-01
This study describes one possible way of using ICT in an education system. We treated the educational system like a business company and developed an appropriate model for clustering the student population. Modern educational systems are forced to extract the most necessary and purposeful information from a large amount of available data. Clustering…
NASA Astrophysics Data System (ADS)
Ishkov, V. N.; Zabarinskaya, L. P.; Sergeeva, N. A.
2017-11-01
The development of studies of solar sources and their effects on the state of the near-Earth space required systematization of the corresponding information in the form of databases and catalogs for the entire time of observation of any geoeffective phenomenon that includes, if possible at the time of creation, all of the characteristics of the phenomena themselves and the sources of these phenomena on the Sun. A uniform presentation of information in the form of a series of similar catalogs that cover long time intervals is of particular importance. The large amount of information collected in such catalogs makes it necessary to use modern methods of its organization and presentation that allow a transition between individual parts of the catalog and a quick search for necessary events and their characteristics, which is implemented in the presented Catalog of Solar Proton Events in the 23rd Cycle of Solar Activity of the sequence of catalogs (six separate issues) that cover the period from 1970 to 2009 (20th-23rd solar cycles).
Geoinformatics paves the way for a zoo information system
NASA Astrophysics Data System (ADS)
Michel, Ulrich
2008-10-01
The use of modern electronic media offers new ways of (environmental) knowledge transfer. All kind of information can be made quickly available as well as queryable and can be processed individually. The Institute for Geoinformatics and Remote Sensing (IGF) in collaboration with the Osnabrueck Zoo, is developing a zoo information system, especially for new media (e.g. mobile devices), which provides information about the animals living there, their natural habitat and endangerment status. Thereby multimedia information is being offered to the zoo visitors. The implementation of the 2D/3D components is realized by modern database and Mapserver technologies. Among other technologies, the VRML (Virtual Reality Modeling Language) standard is used for the realization of the 3D visualization so that it can be viewed in every conventional web browser. Also, a mobile information system for Pocket PCs, Smartphones and Ultra Mobile PCs (UMPC) is being developed. All contents, including the coordinates, are stored in a PostgreSQL database. The data input, the processing and other administrative operations are executed by a content management system (CMS).
NASA Astrophysics Data System (ADS)
Kuzma, H. A.; Boyle, K.; Pullman, S.; Reagan, M. T.; Moridis, G. J.; Blasingame, T. A.; Rector, J. W.; Nikolaou, M.
2010-12-01
A Self Teaching Expert System (SeTES) is being developed for the analysis, design and prediction of gas production from shales. An Expert System is a computer program designed to answer questions or clarify uncertainties that its designers did not necessarily envision which would otherwise have to be addressed by consultation with one or more human experts. Modern developments in computer learning, data mining, database management, web integration and cheap computing power are bringing the promise of expert systems to fruition. SeTES is a partial successor to Prospector, a system to aid in the identification and evaluation of mineral deposits developed by Stanford University and the USGS in the late 1970s, and one of the most famous early expert systems. Instead of the text dialogue used in early systems, the web user interface of SeTES helps a non-expert user to articulate, clarify and reason about a problem by navigating through a series of interactive wizards. The wizards identify potential solutions to queries by retrieving and combining together relevant records from a database. Inferences, decisions and predictions are made from incomplete and noisy inputs using a series of probabilistic models (Bayesian Networks) which incorporate records from the database, physical laws and empirical knowledge in the form of prior probability distributions. The database is mainly populated with empirical measurements, however an automatic algorithm supplements sparse data with synthetic data obtained through physical modeling. This constitutes the mechanism for how SeTES self-teaches. SeTES’ predictive power is expected to grow as users contribute more data into the system. Samples are appropriately weighted to favor high quality empirical data over low quality or synthetic data. Finally, a set of data visualization tools digests the output measurements into graphical outputs.
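A minimal illustration of the kind of inference a Bayesian network performs, updating a prior belief with whatever evidence happens to be available, is sketched below. It is not SeTES's actual model; the variables, conditional probabilities, and the assumption that the two indicators are conditionally independent are all invented for the example.

```python
# Minimal illustration (not SeTES's actual model): a tiny discrete Bayesian
# network that updates the probability a shale interval is "productive"
# given noisy, possibly missing evidence.  All numbers are hypothetical.

PRIOR_PRODUCTIVE = 0.30

# P(observation | productive?) for two (assumed independent) indicators.
P_HIGH_TOC = {True: 0.80, False: 0.25}   # P(high TOC | productive?)
P_MATURE   = {True: 0.70, False: 0.30}   # P(thermally mature | productive?)

def posterior_productive(high_toc=None, mature=None):
    """Exact inference by enumeration; evidence set to None is ignored,
    which is how incomplete inputs are handled."""
    def likelihood(productive):
        p = 1.0
        if high_toc is not None:
            p *= P_HIGH_TOC[productive] if high_toc else 1 - P_HIGH_TOC[productive]
        if mature is not None:
            p *= P_MATURE[productive] if mature else 1 - P_MATURE[productive]
        return p

    num = PRIOR_PRODUCTIVE * likelihood(True)
    den = num + (1 - PRIOR_PRODUCTIVE) * likelihood(False)
    return num / den

print(posterior_productive())                            # prior only: 0.30
print(posterior_productive(high_toc=True))               # one indicator observed
print(posterior_productive(high_toc=True, mature=True))  # both indicators observed
```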
Strength training for the warfighter.
Kraemer, William J; Szivak, Tunde K
2012-07-01
Optimizing strength training for the warfighter is challenged by past training philosophies that no longer serve the modern warfighter facing the "anaerobic battlefield." Training approaches for integration of strength with other needed physical capabilities have been shown to require a periodization model that has the flexibility for changes and is able to adapt to ever-changing circumstances affecting the quality of workouts. Additionally, sequencing of workouts to limit over-reaching and development of overtraining syndromes that end in loss of duty time and injury are paramount to long-term success. Allowing adequate time for rest and recovery and recognizing the negative influences of extreme exercise programs and excessive endurance training will be vital in moving physical training programs into a more modern perspective as used by elite strength-power anaerobic athletes in sports today. Because the warfighter is an elite athlete, it is time that training approaches that are scientifically based are updated within the military to match the functional demands of modern warfare and are given greater credence and value at the command levels. A needs analysis, development of periodized training modules, and individualization of programs are needed to optimize the strength of the modern warfighter. We now have the knowledge, professional coaches and nonprofit organization certifications with continuing education units, and modern training technology to allow this to happen. Ultimately, it only takes command decisions and implementation to make this possible.
Enhancement to Hitran to Support the NASA EOS Program
NASA Technical Reports Server (NTRS)
Kirby, Kate P.; Rothman, Laurence S.
1998-01-01
The HITRAN molecular database has been enhanced with the object of providing improved capabilities for the EOS program scientists. HITRAN itself is the database of high-resolution line parameters of gaseous species expected to be observed by the EOS program in its remote sensing activities. The database is part of a larger compilation that includes IR cross-sections, aerosol indices of refraction, and software for filtering and plotting portions of the database. These properties have also been improved. The software has been advanced in order to work on multiple platforms. Besides the delivery of the compilation on CD-ROM, the effort has been directed toward making timely access of data and software on the world wide web.
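A minimal sketch of the kind of filtering the compilation's software supports is shown below; it is not the bundled software itself. It assumes the classic fixed-width HITRAN .par record layout (2-character molecule ID, 1-character isotopologue, 12-character vacuum wavenumber in cm-1, 10-character line intensity) and an illustrative local file name.

```python
# Sketch only: select lines from a HITRAN-style fixed-width .par file by
# wavenumber window and (optionally) molecule ID.
def filter_lines(path, nu_min, nu_max, molecule_id=None):
    selected = []
    with open(path) as f:
        for record in f:
            mol = int(record[0:2])        # molecule number
            nu = float(record[3:15])      # vacuum wavenumber, cm^-1
            if nu_min <= nu <= nu_max and (molecule_id is None or mol == molecule_id):
                selected.append((mol, nu, float(record[15:25])))  # + intensity
    return selected

# e.g. H2O (molecule 1) lines between 1500 and 1600 cm^-1 from a local copy:
# lines = filter_lines("hitran_sample.par", 1500.0, 1600.0, molecule_id=1)
```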
Enhancement to HITRAN to Support the NASA EOS Program
NASA Technical Reports Server (NTRS)
Kirby, Kate P.; Rothman, Laurence S.
1999-01-01
The HITRAN molecular database has been enhanced with the object of providing improved capabilities for the EOS program scientists. HITRAN itself is the database of high-resolution line parameters of gaseous species expected to be observed by the EOS program in its remote sensing activities. The database is part of a larger compilation that includes IR cross-sections, aerosol indices of refraction, and software for filtering and plotting portions of the database. These properties have also been improved. The software has been advanced in order to work on multiple platforms. Besides the delivery of the compilation on CD-ROM, the effort has been directed toward making timely access of data and software on the world wide web.
A VBA Desktop Database for Proposal Processing at National Optical Astronomy Observatories
NASA Astrophysics Data System (ADS)
Brown, Christa L.
National Optical Astronomy Observatories (NOAO) has developed a relational Microsoft Windows desktop database using Microsoft Access and the Microsoft Office programming language, Visual Basic for Applications (VBA). The database is used to track data relating to observing proposals from original receipt through the review process, scheduling, observing, and final statistical reporting. The database has automated proposal processing and distribution of information. It allows NOAO to collect and archive data so as to query and analyze information about our science programs in new ways.
Technologies and standards in the information systems of the soil-geographic database of Russia
NASA Astrophysics Data System (ADS)
Golozubov, O. M.; Rozhkov, V. A.; Alyabina, I. O.; Ivanov, A. V.; Kolesnikova, V. M.; Shoba, S. A.
2015-01-01
The achievements, problems, and challenges of the modern stage of the development of the Soil-Geographic Database of Russia (SGDBR) and the history of this project are outlined. The structure of the information system of the SGDBR as an internet-based resource to collect data on soil profiles and to integrate the geographic and attribute databases on the same platform is described. The pilot project in Rostov oblast illustrates the inclusion of regional information in the SGDBR and its application for solving practical problems. For the first time in Russia, the GeoRSS standard based on the structured hypertext representation of the geographic and attribute information has been applied in the state system for the agromonitoring of agricultural lands in Rostov oblast and information exchange through the internet.
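For readers unfamiliar with GeoRSS, the sketch below shows what the GeoRSS-Simple convention looks like for a single georeferenced record: an ordinary feed item whose location travels in a georss:point element as "latitude longitude". The field names and the soil-profile example are illustrative and are not the SGDBR schema.

```python
import xml.etree.ElementTree as ET

GEORSS_NS = "http://www.georss.org/georss"
ET.register_namespace("georss", GEORSS_NS)

def soil_profile_item(profile_id, name, lat, lon, soil_type):
    """Build one RSS item carrying a soil-profile location as GeoRSS-Simple.
    Field names here are illustrative, not the SGDBR schema."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = f"Soil profile {profile_id}: {name}"
    ET.SubElement(item, "description").text = f"Soil type: {soil_type}"
    # GeoRSS-Simple encodes a point as "latitude longitude"
    ET.SubElement(item, f"{{{GEORSS_NS}}}point").text = f"{lat} {lon}"
    return item

item = soil_profile_item("RO-0042", "Rostov oblast test site", 47.23, 39.72, "chernozem")
print(ET.tostring(item, encoding="unicode"))
```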
Migration of legacy mumps applications to relational database servers.
O'Kane, K C
2001-07-01
An extended implementation of the Mumps language is described that facilitates vendor-neutral migration of legacy Mumps applications to SQL-based relational database servers. Implemented as a compiler, this system translates Mumps programs to operating-system-independent, standard C code for subsequent compilation to fully stand-alone, binary executables. Added built-in functions and support modules extend the native hierarchical Mumps database with access to industry-standard, networked, relational database management servers (RDBMS), thus freeing Mumps applications from dependence upon vendor-specific, proprietary, unstandardized database models. Unlike Mumps systems that have added captive, proprietary RDBMS access, the programs generated by this development environment can be used with any RDBMS system that supports common network access protocols. Additional features include a built-in web server interface and the ability to interoperate directly with programs and functions written in other languages.
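The general idea of bridging a hierarchical Mumps global to a relational server can be illustrated very simply: each node of a global becomes a row keyed by its subscript path, which any SQL engine can then store, index, and join. The Python/SQLite sketch below is a conceptual illustration only; it is not the compiler or the built-in functions described in the article.

```python
import sqlite3

# Conceptual sketch: map a hierarchical Mumps global such as
# ^PATIENT(id, field) = value onto a relational table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE globals (
        name  TEXT NOT NULL,   -- global name, e.g. PATIENT
        sub1  TEXT NOT NULL,   -- first subscript
        sub2  TEXT NOT NULL,   -- second subscript
        value TEXT,
        PRIMARY KEY (name, sub1, sub2)
    )""")

def global_set(name, sub1, sub2, value):   # analogue of SET ^name(sub1,sub2)=value
    conn.execute("INSERT OR REPLACE INTO globals VALUES (?,?,?,?)",
                 (name, sub1, sub2, value))

def global_get(name, sub1, sub2):          # analogue of $GET(^name(sub1,sub2))
    row = conn.execute("SELECT value FROM globals WHERE name=? AND sub1=? AND sub2=?",
                       (name, sub1, sub2)).fetchone()
    return row[0] if row else ""

global_set("PATIENT", "1001", "NAME", "DOE,JANE")
global_set("PATIENT", "1001", "DOB", "1962-04-17")
print(global_get("PATIENT", "1001", "NAME"))
```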
2008-06-01
Charlie Chaplin, in 1936, to make the aptly titled film Modern Times. Watch the movie to see what we mean. Postmodernism: The Humanist Reaction Along...PM, plus more. Further, because Pomo PMs do not insist on standardization to the degree Modern PMs do, they spend much less time producing the...voluminous, detailed documentation that Modern PMs require to ensure precise repeatability, and much more time on actually doing things (perhaps
Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T
2017-02-01
In radiotherapy (RT) of Hodgkin's lymphoma (HL), field design has changed substantially from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to demonstrate the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process has been intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP of "modern" RT, the expert panel defined criteria for analysing current RT procedures. The consensus of a modified QAP in ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.
Foot and Ankle Fellowship Websites: An Assessment of Accessibility and Quality.
Hinds, Richard M; Danna, Natalie R; Capo, John T; Mroczek, Kenneth J
2017-08-01
The Internet has been reported to be the first informational resource for many fellowship applicants. The objective of this study was to assess the accessibility of orthopaedic foot and ankle fellowship websites and to evaluate the quality of information provided via program websites. The American Orthopaedic Foot and Ankle Society (AOFAS) and the Fellowship and Residency Electronic Interactive Database (FREIDA) fellowship databases were accessed to generate a comprehensive list of orthopaedic foot and ankle fellowship programs. The databases were reviewed for links to fellowship program websites and compared with program websites accessed from a Google search. Accessible fellowship websites were then analyzed for the quality of recruitment and educational content pertinent to fellowship applicants. Forty-seven orthopaedic foot and ankle fellowship programs were identified. The AOFAS database featured direct links to 7 (15%) fellowship websites with the independent Google search yielding direct links to 29 (62%) websites. No direct website links were provided in the FREIDA database. Thirty-six accessible websites were analyzed for content. Program websites featured a mean 44% (range = 5% to 75%) of the total assessed content. The most commonly presented recruitment and educational content was a program description (94%) and description of fellow operative experience (83%), respectively. There is substantial variability in the accessibility and quality of orthopaedic foot and ankle fellowship websites. Recognition of deficits in accessibility and content quality may assist foot and ankle fellowships in improving program information online. Level IV.
MINC 2.0: A Flexible Format for Multi-Modal Images.
Vincent, Robert D; Neelin, Peter; Khalili-Mahani, Najmeh; Janke, Andrew L; Fonov, Vladimir S; Robbins, Steven M; Baghdadi, Leila; Lerch, Jason; Sled, John G; Adalat, Reza; MacDonald, David; Zijdenbos, Alex P; Collins, D Louis; Evans, Alan C
2016-01-01
It is often useful that an imaging data format can afford rich metadata, be flexible, scale to very large file sizes, support multi-modal data, and have strong inbuilt mechanisms for data provenance. Beginning in 1992, MINC was developed as a system for flexible, self-documenting representation of neuroscientific imaging data with arbitrary orientation and dimensionality. The MINC system incorporates three broad components: a file format specification, a programming library, and a growing set of tools. In the early 2000's the MINC developers created MINC 2.0, which added support for 64-bit file sizes, internal compression, and a number of other modern features. Because of its extensible design, it has been easy to incorporate details of provenance in the header metadata, including an explicit processing history, unique identifiers, and vendor-specific scanner settings. This makes MINC ideal for use in large scale imaging studies and databases. It also makes it easy to adapt to new scanning sequences and modalities.
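Because MINC 2.0 stores its volumes in an HDF5 container, a generic HDF5 reader can already walk the self-documenting header metadata; the MINC library and its tools remain the canonical interface. The sketch below uses h5py to dump every attribute in a file, with a hypothetical file name.

```python
import h5py

# Quick inspection sketch only: print every attribute found anywhere in the
# HDF5 container of a MINC 2.0 volume (groups, datasets, provenance entries).
def dump_minc_metadata(path):
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            for key, value in obj.attrs.items():
                print(f"{name}: {key} = {value!r}")
        f.visititems(visit)

# dump_minc_metadata("subject01_t1.mnc")   # hypothetical file name
```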
Web3DMol: interactive protein structure visualization based on WebGL.
Shi, Maoxiang; Gao, Juntao; Zhang, Michael Q
2017-07-03
A growing number of web-based databases and tools for protein research are being developed. There is now a widespread need for visualization tools to present the three-dimensional (3D) structure of proteins in web browsers. Here, we introduce our 3D modeling program, Web3DMol, a web application focusing on protein structure visualization in modern web browsers. Users submit a PDB identification code or select a PDB archive from their local disk, and Web3DMol will display and allow interactive manipulation of the 3D structure. Featured functions, such as sequence plot, fragment segmentation, measure tool and meta-information display, are offered for users to gain a better understanding of protein structure. Easy-to-use APIs are available for developers to reuse and extend Web3DMol. Web3DMol can be freely accessed at http://web3dmol.duapp.com/, and the source code is distributed under the MIT license. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Computational Methods to Work as First-Pass Filter in Deleterious SNP Analysis of Alkaptonuria
Magesh, R.; George Priya Doss, C.
2012-01-01
A major challenge in the analysis of human genetic variation is to distinguish functional from nonfunctional SNPs. Discovering these functional SNPs is one of the main goals of modern genetics and genomics studies. There is a need to effectively and efficiently identify functionally important nsSNPs which may be deleterious or disease causing and to identify their molecular effects. The prediction of the phenotype of nsSNPs by computational analysis may provide a good way to explore the function of nsSNPs and their relationship with susceptibility to disease. In this context, we surveyed and compared variation databases along with in silico prediction programs to assess the effects of deleterious functional variants on protein functions. We also applied these methods as a first-pass filter to identify deleterious substitutions worth pursuing in further experimental research. In this analysis, we used existing computational methods to explore the mutation-structure-function relationship in the HGD gene causing alkaptonuria. PMID:22606059
Conflation and integration of archived geologic maps and associated uncertainties
Shoberg, Thomas G.
2016-01-01
Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.
Nuclear weapons modernizations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristensen, Hans M.
This article reviews the nuclear weapons modernization programs underway in the world's nine nuclear weapons states. It concludes that despite significant reductions in overall weapons inventories since the end of the Cold War, the pace of reductions is slowing - four of the nuclear weapons states are even increasing their arsenals, and all the nuclear weapons states are busy modernizing their remaining arsenals in what appears to be a dynamic and counterproductive nuclear competition. The author questions whether perpetual modernization combined with no specific plan for the elimination of nuclear weapons is consistent with the nuclear Non-Proliferation Treaty and concludes that new limits on nuclear modernizations are needed.
1987-06-15
[Report front matter and table-of-contents fragments: General Dynamics Fort Worth Division Industrial Technology Modernization Program, Phase 2 Final Project Report (Honeywell, June 15, 1987); listed sections include system/equipment/machining specifications, vendor/industry analysis findings, MIS requirements/improvements, cost-benefit analysis, and implementation.]
New master program in management in biophotonics and biotechnologies
NASA Astrophysics Data System (ADS)
Meglinski, I. V.; Tuchin, V. V.
2006-08-01
We develop new graduate educational highly interdisciplinary program that will be useful for addressing problems in worldwide biotechnologies and related biomedical industries. This Master program called Management in Biophotonics and Biotechnologies provides students with the necessary training, education and problem-solving skills to produce managers who are better equipped to handle the challenges of modern business in modern biotechnologies. Administered jointly by Cranfield University (UK) and Saratov State University, Russia) graduates possess a blend of engineering, biotechnologies, business and interpersonal skills necessary for success in industry. The Master courses combine a regular year program in biophotonics & biotechnologies disciplines with the core requirements of a Master degree. A major advantage of the program is that it will provide skills not currently available to graduates in any other program, and it will give the graduates an extra competitive edge for getting a job then.
The Marshall Islands Data Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoker, A.C.; Conrado, C.L.
1995-09-01
This report is a resource document of the methods and procedures used currently in the Data Management Program of the Marshall Islands Dose Assessment and Radioecology Project. Since 1973, over 60,000 environmental samples have been collected. Our program includes relational database design, programming and maintenance; sample and information management; sample tracking; quality control; and data entry, evaluation and reduction. The usefulness of scientific databases involves careful planning in order to fulfill the requirements of any large research program. Compilation of scientific results requires consolidation of information from several databases, and incorporation of new information as it is generated. The success in combining and organizing all radionuclide analysis, sample information, and statistical results into a readily accessible form is critical to our project.
Benchmarking distributed data warehouse solutions for storing genomic variant information
Wiewiórka, Marek S.; Wysakowicz, Dawid P.; Okoniewski, Michał J.
2017-01-01
Genomic-based personalized medicine encompasses storing, analysing and interpreting genomic variants as its central issues. At a time when thousands of patients' sequenced exomes and genomes are becoming available, there is a growing need for efficient database storage and querying. The answer could be the application of modern distributed storage systems and query engines. However, the application of large genomic variant databases to this problem has not been sufficiently explored so far in the literature. To investigate the effectiveness of modern columnar storage [column-oriented Database Management System (DBMS)] and query engines, we have developed a prototypic genomic variant data warehouse, populated with large generated content of genomic variants and phenotypic data. Next, we have benchmarked the performance of a number of combinations of distributed storages and query engines on a set of SQL queries that address biological questions essential for both research and medical applications. In addition, a non-distributed, analytical database (MonetDB) has been used as a baseline. Comparison of query execution times confirms that distributed data warehousing solutions outperform classic relational DBMSs. Moreover, pre-aggregation and further denormalization of data, which reduce the number of distributed join operations, significantly improve query performance by several orders of magnitude. Most of the distributed back-ends offer good performance for complex analytical queries, while the Optimized Row Columnar (ORC) format paired with Presto and Parquet with Spark 2 query engines provide, on average, the lowest execution times. Apache Kudu, on the other hand, is the only solution that guarantees sub-second performance for simple genome range queries returning a small subset of data, where low-latency response is expected, while still offering decent performance for running analytical queries. In summary, research and clinical applications that require the storage and analysis of variants from thousands of samples can benefit from the scalability and performance of distributed data warehouse solutions. Database URL: https://github.com/ZSI-Bio/variantsdwh PMID:29220442
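The two query shapes contrasted in the benchmark, a low-latency genome-range lookup and a heavier analytical aggregation, can be sketched with a Parquet-backed table and Spark SQL as below. The table path, column names, and coordinates are assumptions for illustration, not the benchmark's actual schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("variants-dwh-sketch").getOrCreate()

# Illustrative schema: one row per (sample, variant) with chromosome,
# position, alternate-allele count and a phenotype label (assumed names).
variants = spark.read.parquet("/data/variants.parquet")

# 1) Simple genome-range query: variants in a small window for one sample.
window = (variants
          .filter((F.col("chrom") == "17") &
                  F.col("pos").between(41196312, 41277500) &
                  (F.col("sample_id") == "S000123")))
window.show()

# 2) Analytical query: allele frequency of one variant by phenotype group.
freq = (variants
        .filter((F.col("chrom") == "17") & (F.col("pos") == 41245466))
        .groupBy("phenotype")
        .agg(F.avg(F.col("alt_allele_count") / 2.0).alias("allele_frequency")))
freq.show()
```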
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health Submission for OMB Review; 30-day Comment Request: The Clinical Trials Reporting Program (CTRP) Database (NCI) SUMMARY: Under... Program (CTRP) Database, 0925-0600, Expiration Date 3/31/2013--REINSTATEMENT WITH CHANGE, National Cancer...
NASA Astrophysics Data System (ADS)
Wallace, K.; Leonard, G.; Stewart, C.; Wilson, T. M.; Randall, M.; Stovall, W. K.
2015-12-01
The internationally collaborative volcanic ash website (http://volcanoes.usgs.gov/ash/) has been an important global information resource for ashfall preparedness and impact guidance since 2004. Recent volcanic ashfalls with significant local, regional, and global impacts highlighted the need to improve the website to make it more accessible and pertinent to users worldwide. Recently, the Volcanic Ash Impacts Working Group (Cities and Volcanoes Commission of IAVCEI) redesigned and modernized the website. Improvements include 1) a database-driven back end, 2) reorganized menu navigation, 3) language translation, 4) increased downloadable content, 5) addition of ash-impact case studies, 6) expanded and updated references, 7) an image database, and 8) inclusion of cooperating organizations' logos. The database-driven platform makes the website more dynamic and efficient to operate and update. New menus provide information about specific impact topics (buildings, transportation, power, health, agriculture, water and waste water, equipment and communications, clean up) and updated content has been added throughout all topics. A new "for scientists" menu includes information on ash collection and analysis. Website translation using Google Translate will significantly increase the user base. Printable resources (e.g. checklists, pamphlets, posters) provide information to people without Internet access. Ash impact studies are used to improve mitigation measures during future eruptions, and links to case studies will assist communities' preparation and response plans. The Case Studies menu is intended to be a living topic area, growing as new case studies are published. A database of all images from the website allows users to access larger-resolution images and additional descriptive details. Logos clarify linkages among key contributors and assure users that the site is authoritative and science-based.
A Summary of the Naval Postgraduate School Research Program
1989-08-30
[Table-of-contents fragments: research topics include fundamental theory for automatically combining changes to software systems; a database-system approach to software engineering environments (SEEs); multilevel database security; temporal database management and real-time database computers; and the multi-lingual, multi-model, multi-backend database.]
NASA Astrophysics Data System (ADS)
Bono, Andrea
2007-01-01
The recovery and preservation of the patrimony of instrumental recordings of historical earthquakes is without doubt a subject of great interest. This interest, besides being purely historical, must necessarily also be scientific. In fact, the availability of a large amount of parametric information on the seismic activity in a given area is an undoubted help to the seismological researcher's activities. In this article, the new database project of the Sismos group of the National Institute of Geophysics and Volcanology of Rome is presented. The structure of the new scheme summarizes the experience matured over five years of activity. We consider it useful for those who are approaching "recovery and reprocessing" computer-based facilities. In past years, several attempts on Italian seismicity have followed one another, but they have almost never been real databases. Some of them succeeded because they were well conceived and organized; others were limited to supplying lists of events with their hypocentral parameters. What makes this project more interesting than previous work is the completeness and generality of the managed information. For example, it will be possible to view the hypocentral information for a given historical earthquake; to search for the seismograms in raster, digital, or digitized format; and to retrieve the arrival times of the phases at the various stations, the instrument characteristics, and so on. The modern relational logic on which the archive is based allows all these operations to be carried out with little effort. The database described below will completely replace Sismos' current data bank. Some of the organizational principles of this work are similar to those that inspire the databases used for real-time monitoring of seismicity in the principal international research centres. A modern planning logic is thus introduced into a distinctly historical context. Descriptions of the various planning phases follow, from the conceptual level to the physical implementation of the scheme. At each step, the guiding instructions, rules, and technical-scientific considerations that lead to the final result are highlighted: a vanguard relational scheme for historical data.
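A minimal sketch of the kind of relational scheme described above, linking historical events to stations, phase arrival times, and scanned or digitized seismograms, is given below in SQLite. Table and column names (and the example event identifier) are illustrative, not the actual Sismos schema.

```python
import sqlite3

# Illustrative schema only: historical earthquakes, recording stations,
# phase arrivals and seismogram holdings linked by foreign keys.
schema = """
CREATE TABLE earthquake (
    event_id     INTEGER PRIMARY KEY,
    origin_time  TEXT,     -- ISO 8601
    latitude     REAL,
    longitude    REAL,
    depth_km     REAL,
    magnitude    REAL
);
CREATE TABLE station (
    station_code TEXT PRIMARY KEY,
    name         TEXT,
    latitude     REAL,
    longitude    REAL,
    instrument   TEXT      -- instrument characteristics
);
CREATE TABLE phase_arrival (
    event_id     INTEGER REFERENCES earthquake(event_id),
    station_code TEXT REFERENCES station(station_code),
    phase        TEXT,     -- e.g. P, S
    arrival_time TEXT
);
CREATE TABLE seismogram (
    event_id     INTEGER REFERENCES earthquake(event_id),
    station_code TEXT REFERENCES station(station_code),
    format       TEXT,     -- raster, digital or digitized
    file_path    TEXT
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(schema)

# Example query: all seismograms held for one (hypothetical) historical event.
rows = conn.execute("""
    SELECT s.station_code, s.format, s.file_path
    FROM seismogram AS s WHERE s.event_id = ?""", (1908001,)).fetchall()
print(rows)
```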
ERIC Educational Resources Information Center
Cooper, David E.
This Congressional testimony focuses on the challenges faced by the District of Columbia in modernizing its public schools. Specifically, it addresses: (1) increases in the cost of modernizing the schools; (2) delays in completing the schools; (3) quality inspection problems; and (4) concerns about managing asbestos hazards. The testimony…
Modernizing the Federal Government: Paying for Performance
2007-01-01
works (Barr, 2007d). Employees are rated on performance measures such as "fair and equitable treatment of taxpayers" and "customer satisfaction"... Performance Act of 2007, Senate bill 1046, Washington, D.C., 2007b. Vroom, Victor H...
Effects of Budget Reductions on Army Acquisition Support of Equipping and Modernization Goals
2015-04-16
overall Army budgets are significantly reduced (34% since 2008), maintaining the entire equipment portfolio reduces the funding available to meet...the Mission Command portfolio, examine their impact on equipping and modernization, and make recommendations on how to divest the equipment no longer... portfolio of equipment being managed and the link to the new Defense guidance and Army equipping guidance and modernization plans. Any systems or programs
NASA Technical Reports Server (NTRS)
Steck, Daniel
2009-01-01
This report documents the generation of an outbound Earth to Moon transfer preliminary database consisting of four cases calculated twice a day for a 19 year period. The database was desired as the first step in order for NASA to rapidly generate Earth to Moon trajectories for the Constellation Program using the Mission Assessment Post Processor. The completed database was created running a flight trajectory and optimization program, called Copernicus, in batch mode with the use of newly created Matlab functions. The database is accurate and has high data resolution. The techniques and scripts developed to generate the trajectory information will also be directly used in generating a comprehensive database.
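The batch-generation pattern itself, two epochs per day over a 19-year span with four cases per epoch, can be sketched as below. The date span, output columns, and the run_transfer_case placeholder are assumptions; the actual invocation of Copernicus and the Matlab functions used by the report are not reproduced here.

```python
from datetime import datetime, timedelta
import csv

def run_transfer_case(epoch, case_id):
    """Placeholder for one optimizer run (e.g. invoking Copernicus in batch
    mode); the real invocation and its outputs are not reproduced here."""
    return {"epoch": epoch.isoformat(), "case": case_id,
            "tli_dv_kms": None, "loi_dv_kms": None, "flight_time_days": None}

start, end = datetime(2015, 1, 1), datetime(2034, 1, 1)  # 19-year span (dates illustrative)
step = timedelta(hours=12)                                # two epochs per day
cases = range(1, 5)                                       # four cases per epoch

with open("transfer_database.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["epoch", "case", "tli_dv_kms",
                                             "loi_dv_kms", "flight_time_days"])
    writer.writeheader()
    epoch = start
    while epoch < end:
        for case_id in cases:
            writer.writerow(run_transfer_case(epoch, case_id))
        epoch += step
```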
Using Geocoded Databases in Teaching Urban Historical Geography.
ERIC Educational Resources Information Center
Miller, Roger P.
1986-01-01
Provides information regarding hardware and software requirements for using geocoded databases in urban historical geography. Reviews 11 IBM and Apple Macintosh database programs and describes the pen plotter and digitizing table interface used with the databases. (JDH)
A World Wide Web (WWW) server database engine for an organelle database, MitoDat.
Lemkin, P F; Chipperfield, M; Merril, C; Zullo, S
1996-03-01
We describe a simple database search engine, "dbEngine", which may be used to quickly create a searchable database on a World Wide Web (WWW) server. Data may be prepared from spreadsheet programs (such as Excel, etc.) or from tables exported from relational database systems. This Common Gateway Interface (CGI-BIN) program is used with a WWW server such as those available commercially or from the National Center for Supercomputing Applications (NCSA) or CERN. Its capabilities include: (i) searching records by combinations of terms connected with ANDs or ORs; (ii) returning search results as hypertext links to other WWW database servers; (iii) mapping lists of literature reference identifiers to the full references; (iv) creating bidirectional hypertext links between pictures and the database. DbEngine has been used to support the MitoDat database (Mendelian and non-Mendelian inheritance associated with the mitochondrion) on the WWW.
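A minimal sketch of the kind of term search described above, assuming tab-delimited records exported from a spreadsheet, is shown below; it is illustrative only and not the actual dbEngine code, and the file name in the usage comment is hypothetical.

    # Illustrative sketch: search tab-delimited records with terms combined by AND or OR.
    import csv

    def search_records(path, terms, mode="AND"):
        """Return rows whose fields contain all (AND) or any (OR) of the terms."""
        matches = []
        with open(path, newline="") as handle:
            for row in csv.reader(handle, delimiter="\t"):
                text = " ".join(row).lower()
                hits = [term.lower() in text for term in terms]
                if (mode == "AND" and all(hits)) or (mode == "OR" and any(hits)):
                    matches.append(row)
        return matches

    # Example: rows mentioning both a gene symbol and the organelle.
    # hits = search_records("mitodat.tsv", ["ATP6", "mitochondrion"], mode="AND")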
Al-Nasheri, Ahmed; Muhammad, Ghulam; Alsulaiman, Mansour; Ali, Zulfiqar; Mesallam, Tamer A; Farahat, Mohamed; Malki, Khalid H; Bencherif, Mohamed A
2017-01-01
Automatic voice-pathology detection and classification systems may help clinicians to detect the existence of any voice pathologies and the type of pathology from which patients suffer in the early stages. The main aim of this paper is to investigate Multidimensional Voice Program (MDVP) parameters to automatically detect and classify the voice pathologies in multiple databases, and then to find out which parameters performed well in these two processes. Samples of the sustained vowel /a/ of normal and pathological voices were extracted from three different databases, which have three voice pathologies in common. The selected databases in this study represent three distinct languages: (1) the Arabic voice pathology database; (2) the Massachusetts Eye and Ear Infirmary database (English database); and (3) the Saarbruecken Voice Database (German database). A computerized speech lab program was used to extract MDVP parameters as features, and an acoustical analysis was performed. The Fisher discrimination ratio was applied to rank the parameters. A t test was performed to highlight any significant differences in the means of the normal and pathological samples. The experimental results demonstrate a clear difference in the performance of the MDVP parameters using these databases. The highly ranked parameters also differed from one database to another. The best accuracies were obtained by using the three highest ranked MDVP parameters arranged according to the Fisher discrimination ratio: these accuracies were 99.68%, 88.21%, and 72.53% for the Saarbruecken Voice Database, the Massachusetts Eye and Ear Infirmary database, and the Arabic voice pathology database, respectively. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
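The ranking step lends itself to a short worked example: the two-class Fisher discrimination ratio, (mu1 - mu2)^2 / (var1 + var2), computed per parameter and sorted in descending order. The sketch below is illustrative only; the data arrays and names are hypothetical, not the study's code.

    # Rank acoustic parameters by the two-class Fisher discrimination ratio.
    import numpy as np

    def fisher_ratio(normal, pathological):
        """Fisher discrimination ratio for one parameter across two classes."""
        m1, m2 = normal.mean(), pathological.mean()
        v1, v2 = normal.var(ddof=1), pathological.var(ddof=1)
        return (m1 - m2) ** 2 / (v1 + v2)

    def rank_parameters(normal_matrix, pathological_matrix, names):
        """Rank parameters (columns) from most to least discriminative."""
        scores = [fisher_ratio(normal_matrix[:, j], pathological_matrix[:, j])
                  for j in range(len(names))]
        return sorted(zip(names, scores), key=lambda pair: pair[1], reverse=True)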
Evolution of the use of relational and NoSQL databases in the ATLAS experiment
NASA Astrophysics Data System (ADS)
Barberis, D.
2016-09-01
The ATLAS experiment used for many years a large database infrastructure based on Oracle to store several different types of non-event data: time-dependent detector configuration and conditions data, calibrations and alignments, configurations of Grid sites, catalogues for data management tools, job records for distributed workload management tools, run and event metadata. The rapid development of "NoSQL" databases (structured storage services) in the last five years allowed an extended and complementary usage of traditional relational databases and new structured storage tools in order to improve the performance of existing applications and to extend their functionalities using the possibilities offered by the modern storage systems. The trend is towards using the best tool for each kind of data, separating for example the intrinsically relational metadata from payload storage, and records that are frequently updated and benefit from transactions from archived information. Access to all components has to be orchestrated by specialised services that run on front-end machines and shield the user from the complexity of data storage infrastructure. This paper describes this technology evolution in the ATLAS database infrastructure and presents a few examples of large database applications that benefit from it.
DB Dehydrogenase: an online integrated structural database on enzyme dehydrogenase.
Nandy, Suman Kumar; Bhuyan, Rajabrata; Seal, Alpana
2012-01-01
Dehydrogenase enzymes are almost indispensable for metabolic processes. Shortage or malfunctioning of dehydrogenases often leads to several acute diseases such as cancers, retinal diseases, diabetes mellitus, Alzheimer's disease, hepatitis B and C, etc. With the advancement of modern-day research, huge amounts of sequence, structural and functional data are generated every day, widening the gap between structural attributes and their functional understanding. DB Dehydrogenase is an effort to relate the functionality of dehydrogenases to their structures. It is a completely web-based structural database, covering almost all dehydrogenases [~150 enzyme classes, ~1200 entries from ~160 organisms] whose structures are known. It was created by extracting and integrating various online resources to provide reliable data and is implemented as a MySQL relational database with user-friendly web interfaces written in CGI Perl. Flexible search options are provided for data extraction and exploration. To summarize, with the sequence, structure and function of all dehydrogenases in one place, along with the necessary cross-referencing, this database will be useful for researchers carrying out further work in this field. The database is available for free at http://www.bifku.in/DBD/
Authentication Based on Pole-zero Models of Signature Velocity
Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad
2013-01-01
With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for a given person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed and features are then extracted from the strokes. Using linear, Parzen-window and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University signature (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91% on skilled forgeries, respectively. PMID:24696797
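As a loose illustration of the feature-extraction idea above (not the authors' pole-zero modeling code), the sketch below derives a tangential velocity signal from sampled pen coordinates and keeps its leading discrete cosine transform coefficients as a fixed-length stroke descriptor; all names and the coefficient count are hypothetical choices.

    # Velocity signal from pen samples, summarized by leading DCT coefficients.
    import numpy as np
    from scipy.fft import dct

    def velocity(x, y, t):
        """Tangential pen velocity from sampled coordinates and timestamps."""
        dt = np.diff(t)
        return np.hypot(np.diff(x), np.diff(y)) / dt

    def dct_features(signal, n_coeffs=16):
        """Leading DCT-II coefficients as a compact description of the signal."""
        coeffs = dct(signal, type=2, norm="ortho")
        return coeffs[:n_coeffs]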
The Results of Development of the Project ZOOINT and its Future Perspectives
NASA Astrophysics Data System (ADS)
Smirnov, I. S.; Lobanov, A. L.; Alimov, A. F.; Medvedev, S. G.; Golikov, A. A.
Work on computerizing the main processes of accumulating and analysing collection, expert and literature data on the systematics and faunistics of various animal taxa (a basis for the study of biological diversity) started at the Zoological Institute in 1987. In 1991 the idea of creating a software package, the ZOOlogical INTegrated system (ZOOINT), was born. ZOOINT was to support the loading of collection data and, at the same time, allow the accumulated data to be analysed with the help of various queries. During its execution the ZOOINT project was transformed slightly and produced results somewhat different from those planned earlier, but even more valuable. A web site about the information retrieval system (IRS) ZOOINT was also built. Remote access to the taxonomic information, with the possibility of working with the databases (DB) of the IRS ZOOINT in on-line mode, was planned. This required not only renewal of the developers' and users' computer hardware, but also mastering of new software: the HTML language, the Windows NT operating system, and Active Server Pages (ASP) technology. One of the serious problems in creating zoological databases and information retrieval systems is the representation of hierarchical classifications. This problem was solved by building classifiers, specialized standard taxonomic databases, which have been named ZOOCOD. The recently increased number of attempts to create taxonomic electronic lists, tables and databases has required the development of some primary rules for the unification of zoological systematic databases. These rules are intended for application in institutes of a biological profile, in which computerization is proceeding very slowly and database building is still in a rudimentary state. These positions and standards for constructing biological (taxonomic) databases should facilitate communication among biologists, the application in the near future of the most advanced database development technologies (for example, use of the XML platform) and, eventually, the building of modern information systems. The work on the project is carried out with the support of RFBR grant N 02-07-90217, the program "The Information System on the Biodiversity of Russia", and Project N 15 "Antarctic Regions".
34 CFR 657.5 - What definitions apply?
Code of Federal Regulations, 2011 CFR
2011-07-01
..., DEPARTMENT OF EDUCATION FOREIGN LANGUAGE AND AREA STUDIES FELLOWSHIPS PROGRAM General § 657.5 What... activities, including training in modern foreign languages and various academic disciplines, in its subject... activities in modern foreign language training and related studies. (Authority: 20 U.S.C. 1122) ...
ERIC Educational Resources Information Center
Myint-U, Athi; O'Donnell, Lydia; Osher, David; Petrosino, Anthony; Stueve, Ann
2008-01-01
Despite evidence that some dropout prevention programs have positive effects, whether districts in the region are using such evidence-based programs has not been documented. To generate and share knowledge on dropout programs and policies, this report details a project to create a searchable database with information on target audiences,…
Surface Transportation Security Priority Assessment
2010-03-01
intercity buses), and pipelines, and related infrastructure (including roads and highways), that are within the territory of the United States ... Modernizing the information technology infrastructure used to vet the identity of travelers and transportation workers ... Using terrorist databases to ... examination of persons travelling, surface transportation modes tend to operate in a much more open environment, making it difficult to screen workers
Human and biophysical factors influencing modern fire disturbance in northern Wisconsin
Brian R. Sturtevant; David T. Cleland
2007-01-01
Humans cause most wildfires in northern Wisconsin, but interactions between human and biophysical variables affecting fire starts and size are not well understood. We applied classification tree analyses to a 16-year fire database from northern Wisconsin to evaluate the relative importance of human v. biophysical variables affecting fire occurrence within (1) all cover...
1985-05-24
Tracor Industrial Technology Modernization Program, Phase 3 Proposal, Category 1 Project: Countermeasures Assembly. The remainder of the scanned document is an assembly and packaging checklist (put package back in bin, put part in plastic bag, seal plastic bag with stapler, mark paperwork, peel preprinted tag from sheet, put preprinted tag on plastic bag).
MPD3: a useful medicinal plants database for drug designing.
Mumtaz, Arooj; Ashfaq, Usman Ali; Ul Qamar, Muhammad Tahir; Anwar, Farooq; Gulzar, Faisal; Ali, Muhammad Amjad; Saari, Nazamid; Pervez, Muhammad Tariq
2017-06-01
Medicinal plants are the main natural pools for the discovery and development of new drugs. In the modern era of computer-aided drug designing (CADD), there is need of prompt efforts to design and construct useful database management system that allows proper data storage, retrieval and management with user-friendly interface. An inclusive database having information about classification, activity and ready-to-dock library of medicinal plant's phytochemicals is therefore required to assist the researchers in the field of CADD. The present work was designed to merge activities of phytochemicals from medicinal plants, their targets and literature references into a single comprehensive database named as Medicinal Plants Database for Drug Designing (MPD3). The newly designed online and downloadable MPD3 contains information about more than 5000 phytochemicals from around 1000 medicinal plants with 80 different activities, more than 900 literature references and 200 plus targets. The designed database is deemed to be very useful for the researchers who are engaged in medicinal plants research, CADD and drug discovery/development with ease of operation and increased efficiency. The designed MPD3 is a comprehensive database which provides most of the information related to the medicinal plants at a single platform. MPD3 is freely available at: http://bioinform.info .
Resources | Office of Cancer Genomics
OCG provides a variety of scientific and educational resources for both cancer researchers and members of the general public. These resources are divided into the following types: OCG-Supported Resources: Tools, databases, and reagents generated by initiated and completed OCG programs for researchers, educators, and students. (Note: Databases for current OCG programs are available through program-specific data matrices)
ERIC Educational Resources Information Center
Pfeiffer, Jay J.
Florida's Education and Training Placement Information Program (FETPIP) is a statewide system linking the administrative databases of certain state and federal agencies to collect follow-up data on former students or program participants. The databases that are collected include those of the Florida Department of Corrections; Florida Department of…
The NSO FTS database program and archive (FTSDBM)
NASA Technical Reports Server (NTRS)
Lytle, D. M.
1992-01-01
Data from the NSO Fourier transform spectrometer is being re-archived from half inch tape onto write-once compact disk. In the process, information about each spectrum and a low resolution copy of each spectrum is being saved into an on-line database. FTSDBM is a simple database management program in the NSO external package for IRAF. A command language allows the FTSDBM user to add entries to the database, delete entries, select subsets from the database based on keyword values including ranges of values, create new database files based on these subsets, make keyword lists, examine low resolution spectra graphically, and make disk number/file number lists. Once the archive is complete, FTSDBM will allow the database to be efficiently searched for data of interest to the user and the compact disk format will allow random access to that data.
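A minimal sketch of the subset-selection idea (selecting spectra by keyword values and ranges), written against SQLite rather than the IRAF external package, is shown below; the table and column names are hypothetical, not those of FTSDBM.

    # Select archive entries whose keyword values fall within requested ranges.
    import sqlite3

    def select_spectra(db_path, wave_min, wave_max, observer=None):
        """Return (disk number, file number, object) for spectra matching the criteria."""
        query = ("SELECT disk_no, file_no, object FROM spectra "
                 "WHERE wave_center BETWEEN ? AND ?")
        args = [wave_min, wave_max]
        if observer is not None:
            query += " AND observer = ?"
            args.append(observer)
        with sqlite3.connect(db_path) as conn:
            return conn.execute(query, args).fetchall()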
The GCP molecular marker toolkit, an instrument for use in breeding food security crops.
Van Damme, Veerle; Gómez-Paniagua, Humberto; de Vicente, M Carmen
2011-12-01
Crop genetic resources carry variation useful for overcoming the challenges of modern agriculture. Molecular markers can facilitate the selection of agronomically important traits. The pervasiveness of genomics research has led to an overwhelming number of publications and databases, which are, nevertheless, scattered and hence often difficult for plant breeders to access, particularly those in developing countries. This situation separates them from developed countries, which have better endowed programs for developing varieties. To close this growing knowledge gap, we conducted an intensive literature review and consulted with more than 150 crop experts on the use of molecular markers in the breeding program of 19 food security crops. The result was a list of effectively used and highly reproducible sequence tagged site (STS), simple sequence repeat (SSR), single nucleotide polymorphism (SNP), and sequence characterized amplified region (SCAR) markers. However, only 12 food crops had molecular markers suitable for improvement. That is, marker-assisted selection is not yet used for Musa spp., coconut, lentils, millets, pigeonpea, sweet potato, and yam. For the other 12 crops, 214 molecular markers were found to be effectively used in association with 74 different traits. Results were compiled as the GCP Molecular Marker Toolkit, a free online tool that aims to promote the adoption of molecular approaches in breeding activities.
PSOVina: The hybrid particle swarm optimization algorithm for protein-ligand docking.
Ng, Marcus C K; Fong, Simon; Siu, Shirley W I
2015-06-01
Protein-ligand docking is an essential step in the modern drug discovery process. The challenge here is to accurately predict and efficiently optimize the position and orientation of ligands in the binding pocket of a target protein. In this paper, we present a new method called PSOVina which combines the particle swarm optimization (PSO) algorithm with the efficient Broyden-Fletcher-Goldfarb-Shanno (BFGS) local search method adopted in AutoDock Vina to tackle the conformational search problem in docking. Using a diverse data set of 201 protein-ligand complexes from the PDBbind database and a full set of ligands and decoys for four representative targets from the directory of useful decoys (DUD) virtual screening data set, we assessed the docking performance of PSOVina in comparison to the original Vina program. Our results showed that PSOVina achieves a remarkable execution time reduction of 51-60% without compromising the prediction accuracies in the docking and virtual screening experiments. This improvement in time efficiency makes PSOVina a better choice of a docking tool in large-scale protein-ligand docking applications. Our work lays the foundation for the future development of swarm-based algorithms in molecular docking programs. PSOVina is freely available to non-commercial users at http://cbbio.cis.umac.mo .
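The particle swarm component can be illustrated with a minimal, generic PSO loop over an arbitrary scoring function; this is only a sketch of the algorithm named above, not the PSOVina implementation, which couples PSO with BFGS inside AutoDock Vina's scoring machinery, and all parameter values are illustrative defaults.

    # Generic particle swarm optimization over a scoring function to be minimized.
    import numpy as np

    def pso(score, bounds, n_particles=30, n_iters=100, w=0.7, c1=1.5, c2=1.5):
        lo, hi = bounds                      # arrays giving per-dimension limits
        dim = lo.size
        rng = np.random.default_rng(0)
        pos = rng.uniform(lo, hi, size=(n_particles, dim))
        vel = np.zeros_like(pos)
        pbest = pos.copy()                   # per-particle best positions
        pbest_val = np.array([score(p) for p in pos])
        gbest = pbest[pbest_val.argmin()].copy()
        for _ in range(n_iters):
            r1, r2 = rng.random((2, n_particles, dim))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([score(p) for p in pos])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[pbest_val.argmin()].copy()
        return gbest, pbest_val.min()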
The purpose of this SOP is to define the procedures involved in appending cleaned individual data batches to the master databases. This procedure applies to the Arizona NHEXAS project and the Border study. Keywords: data; database.
The U.S.-Mexico Border Program is sponsored b...
A TEX86 surface sediment database and extended Bayesian calibration
NASA Astrophysics Data System (ADS)
Tierney, Jessica E.; Tingley, Martin P.
2015-06-01
Quantitative estimates of past temperature changes are a cornerstone of paleoclimatology. For a number of marine sediment-based proxies, the accuracy and precision of past temperature reconstructions depends on a spatial calibration of modern surface sediment measurements to overlying water temperatures. Here, we present a database of 1095 surface sediment measurements of TEX86, a temperature proxy based on the relative cyclization of marine archaeal glycerol dialkyl glycerol tetraether (GDGT) lipids. The dataset is archived in a machine-readable format with geospatial information, fractional abundances of lipids (if available), and metadata. We use this new database to update surface and subsurface temperature calibration models for TEX86 and demonstrate the applicability of the TEX86 proxy to past temperature prediction. The TEX86 database confirms that surface sediment GDGT distribution has a strong relationship to temperature, which accounts for over 70% of the variance in the data. Future efforts, made possible by the data presented here, will seek to identify variables with secondary relationships to GDGT distributions, such as archaeal community composition.
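As a much-simplified illustration of core-top calibration (the study itself uses a spatially varying Bayesian regression), the sketch below fits an ordinary least-squares line of sea surface temperature against TEX86 values; the input arrays and function names are hypothetical placeholders.

    # Ordinary least-squares calibration of SST against TEX86 core-top values.
    import numpy as np

    def calibrate(tex86, sst):
        """Fit SST = a + b * TEX86 and return the coefficients."""
        design = np.column_stack([np.ones_like(tex86), tex86])
        (a, b), *_ = np.linalg.lstsq(design, sst, rcond=None)
        return a, b

    def predict_sst(tex86_new, a, b):
        """Apply the fitted calibration to new TEX86 measurements."""
        return a + b * np.asarray(tex86_new)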
Providing R-Tree Support for Mongodb
NASA Astrophysics Data System (ADS)
Xiang, Longgang; Shao, Xiaotian; Wang, Dehao
2016-06-01
Supporting large amounts of spatial data is a significant characteristic of modern databases. However, unlike some mature relational databases, such as Oracle and PostgreSQL, most of current burgeoning NoSQL databases are not well designed for storing geospatial data, which is becoming increasingly important in various fields. In this paper, we propose a novel method to provide R-tree index, as well as corresponding spatial range query and nearest neighbour query functions, for MongoDB, one of the most prevalent NoSQL databases. First, after in-depth analysis of MongoDB's features, we devise an efficient tabular document structure which flattens R-tree index into MongoDB collections. Further, relevant mechanisms of R-tree operations are issued, and then we discuss in detail how to integrate R-tree into MongoDB. Finally, we present the experimental results which show that our proposed method out-performs the built-in spatial index of MongoDB. Our research will greatly facilitate big data management issues with MongoDB in a variety of geospatial information applications.
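A conceptual sketch of the flattening idea follows: R-tree nodes are stored as MongoDB documents and a range query walks the tree by following child references. The collection and field names are hypothetical, and this is only an illustration of the approach, not the authors' implementation.

    # Store R-tree nodes as MongoDB documents and answer a bounding-box range query.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017/")
    nodes = client["spatial"]["rtree_nodes"]

    def intersects(a, b):
        """Axis-aligned bounding boxes given as (xmin, ymin, xmax, ymax)."""
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    def range_query(node_id, box, results):
        """Recursively collect object ids whose bounding boxes intersect the query box."""
        node = nodes.find_one({"_id": node_id})
        for entry in node["entries"]:
            if intersects(tuple(entry["mbr"]), box):
                if node["is_leaf"]:
                    results.append(entry["object_id"])
                else:
                    range_query(entry["child"], box, results)

    # Example: found = []; range_query(root_id, (114.0, 30.0, 115.0, 31.0), found)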
Relax with CouchDB - Into the non-relational DBMS era of Bioinformatics
Manyam, Ganiraju; Payton, Michelle A.; Roth, Jack A.; Abruzzo, Lynne V.; Coombes, Kevin R.
2012-01-01
With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. PMID:22609849
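Since all three resources are also reachable programmatically, a minimal sketch of querying a CouchDB view over its plain HTTP API is shown below; the database, design document, and view names are hypothetical rather than the actual geneSmash/drugBase/HapMap-CN endpoints.

    # Query a CouchDB map/reduce view over HTTP and return the row values.
    import json
    import requests

    def query_view(base_url, db, ddoc, view, key):
        """Fetch rows from a CouchDB view for a single key."""
        url = f"{base_url}/{db}/_design/{ddoc}/_view/{view}"
        resp = requests.get(url, params={"key": json.dumps(key)})
        resp.raise_for_status()
        return [row["value"] for row in resp.json()["rows"]]

    # Example (hypothetical view): query_view("http://localhost:5984",
    #                                         "genesmash", "genes", "by_symbol", "TP53")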
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudoknots. It can be reached at http://wwwbio.leidenuniv.nl/~Batenburg/PKB.html. This page will direct the user to a retrieval page from where a particular pseudoknot can be chosen, or to a submission page which enables the user to add pseudoknot information to the database or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains this pseudoknot and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process was done by applying the evolutionary methodology for software development rather than by applying the methodology of the classical waterfall model or the more modern spiral model.
Laboratory Information Systems.
Henricks, Walter H
2015-06-01
Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Lawrence
Calibrations and conditions databases can be accessed from within the JANA Event Processing framework through the API defined in its JCalibration base class. The API is designed to support everything from databases, to web services, to flat files for the backend. A Web Service backend using the gSOAP toolkit has been implemented, which is particularly interesting since it addresses many modern cybersecurity issues including support for SSL. The API allows constants to be retrieved through a single line of C++ code, with most of the context, including the transport mechanism, being implied by the run currently being analyzed and the environment, relieving developers from implementing such details.
Williams, Robin; Johnson, Paul
2005-01-01
This paper examines the increasing police use of DNA profiling and databasing as a developing instrumentality of modern state surveillance. It briefly notes previously published work on a variety of surveillance technologies and their role in the governance of social action and social order. It then argues that there are important differences amongst the ways in which several such technologies construct and use identificatory artefacts, their orientations to human subjectivity, and their role in the governmentality of citizens and others. The paper then describes the novel and powerful form of bio-surveillance offered by DNA profiling and illustrates this by reference to an ongoing empirical study of the police uses of the UK National DNA Database for the investigation of crime. It is argued that DNA profiling and databasing enable the construction of a ‘closed circuit’ of surveillance of a defined population. PMID:16467920
Lubricant Formulations to Enhance Engine Efficiency in Modern Internal Combustion Engines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Wai; Wong, Victor; Plumley, Michael
2017-04-19
The research program presented aimed to investigate, develop, and demonstrate low-friction, environmentally-friendly and commercially-feasible lubricant formulations that would significantly improve the mechanical efficiency of modern engines without incurring increased wear, emissions or deterioration of the emission-aftertreatment system.
Practice-Oriented Teachers' Training: Innovative Approach
ERIC Educational Resources Information Center
Shukshina, Tatjana I.; Gorshenina, Svetlana N.; Buyanova, Irina B.; Neyasova, Irina A.
2016-01-01
Modernization of Russian education meets the global trend of professionalization of teachers' training which assumes strengthening the practical orientation of educational programs as a significant factor in increasing the competitiveness of the teacher in the modern educational environment. The purpose of the article is to identify and…
Kentucky geotechnical database.
DOT National Transportation Integrated Search
2005-03-01
Development of a comprehensive dynamic, geotechnical database is described. Computer software selected to program the client/server application in windows environment, components and structure of the geotechnical database, and primary factors cons...
Alternatives for Modernizing U.S. Fighter Forces
2009-05-01
Plans for Modernizing Fighter Forces ... The Possible Role of the F-22 Program in Mitigating the Air Force's Projected Inventory Shortfall ... Northrop Grumman won that competition with its proposal for an aircraft based on the RQ-4 Global Hawk. ... flyable demonstrator for exploring technologies that could lead to an operational UCAV-N, was rolled out by Northrop Grumman in December 2008. Its
ERIC Educational Resources Information Center
Joint Economic Committee, Washington, DC.
The policies and performance of the post-Mao Chinese government (1976 to the present) in the four modernization areas of industry, agriculture, science and technology, and the military are examined. Realizing that the program to modernize the economy of the People's Republic of China, which was initiated by Mao's successors in 1977, was much too…
Fielding a Division Staff in the Modern Day
2016-06-10
A thesis presented to the Faculty of the U.S. Army Command and General Staff College, Fort Leavenworth, KS 66027-2301.
Manned Certification Tests of the Modernized MK 16 MOD 1
2013-11-01
Authors: D. E. Warkander, Ph.D.; D. J. Doolette, Ph.D.; Paul C. Algra. ... electronics must be turned on when calibrating the secondary display. References: R. P. Layton, Unmanned Evaluation of the Modernized MK 16 MOD ...
State variations in women's socioeconomic status and use of modern contraceptives in Nigeria.
Lamidi, Esther O
2015-01-01
According to the 2014 World Population Data Sheet, Nigeria has one of the highest fertility and lowest contraceptive prevalence rates around the world. However, research suggests that national contraceptive prevalence rate overshadows enormous spatial variations in reproductive behavior in the country. I examined the variations in women's socioeconomic status and modern contraceptive use across states in Nigeria. Using the 2013 Nigeria Demographic and Health Survey data (n = 18,910), I estimated the odds of modern contraceptive use among sexually active married and cohabiting women in a series of multilevel logistic regression models. The share of sexually active, married and cohabiting women using modern contraceptives widely varied, from less than one percent in Kano, Yobe, and Jigawa states, to 40 percent in Osun state. Most of the states with low contraceptive prevalence rates also ranked low on women's socioeconomic attributes. Results of multilevel logistic regression analyses showed that women residing in states with greater shares of women with secondary or higher education, higher female labor force participation rates, and more women with health care decision-making power, had significantly higher odds of using modern contraceptives. Differences in women's participation in health care decisions across states remained significantly associated with modern contraceptive use, net of individual-level socioeconomic status and other covariates of modern contraceptive use. Understanding of state variations in contraceptive use is crucial to the design and implementation of family planning programs. The findings reinforce the need for state-specific family planning programs in Nigeria.
GLOBE Program's Data and Information System
NASA Astrophysics Data System (ADS)
Memarsadeghi, N.; Overoye, D.; Lewis, C.; Butler, D. M.; Ramapriyan, H.
2016-12-01
"The Global Learning and Observations to Benefit the Environment (GLOBE) Program is an international science and education program that provides students and the public worldwide with the opportunity to participate in data collection and the scientific process, and contribute meaningfully to our understanding of the Earth system and global environment" (www.globe.gov ). GLOBE Program has a rich community of students, teachers, scientists, trainers, country coordinators, and alumni across the world, technologically spanning both high- and low-end users. There are 117 GLOBE participating countries from around the world. GLOBE's Science data protocols and educational material span atmosphere, biosphere, hydrosphere, soil (pedosphere), and Earth as a System scientific areas (http://www.globe.gov/do-globe/globe-teachers-guide). GLOBE's Data and Information System (DIS), when first introduced in 1995, was a cutting edge system that was well-received and innovative for its time. However, internet-based technologies have changed dramatically since then. Projects to modernize and evolve the GLOBE DIS started in 2010, resulting in today's GLOBE DIS. The current GLOBE DIS is now built upon the latest information technologies and is engaging and supporting the user community with advanced tools and services to further the goals of the GLOBE Program. GLOBE DIS consists of over 20 years of observation and training data, a rich set of software systems and applications for data entry, visualization, and analysis, as well as tools for training users in various science data protocols and enabling collaborations among members of the international user community. We present the existing GLOBE DIS, application technologies, and lessons learned for their operations, development, sustaining engineering, and data management practices. Examples of GLOBE DIS technologies include Liferay System for integrated user and content management, a Postgress/PostGIS database, Ruby on Rails for Data Entry systems, and OpenGeo for Visualization system.
The 3D Elevation Program—Flood risk management
Carswell, William J.; Lukas, Vicki
2018-01-25
Flood-damage reduction in the United States has been a longstanding but elusive societal goal. The national strategy for reducing flood damage has shifted over recent decades from a focus on construction of flood-control dams and levee systems to a three-pronged strategy to (1) improve the design and operation of such structures, (2) provide more accurate and accessible flood forecasting, and (3) shift the Federal Emergency Management Agency (FEMA) National Flood Insurance Program to a more balanced, less costly flood-insurance paradigm. Expanding the availability and use of high-quality, three-dimensional (3D) elevation information derived from modern light detection and ranging (lidar) technologies to provide essential terrain data poses a singular opportunity to dramatically enhance the effectiveness of all three components of this strategy. Additionally, FEMA, the National Weather Service, and the U.S. Geological Survey (USGS) have developed tools and joint program activities to support the national strategy. The USGS 3D Elevation Program (3DEP) has the programmatic infrastructure to produce and provide essential terrain data. This infrastructure includes (1) data acquisition partnerships that leverage funding and reduce duplicative efforts, (2) contracts with experienced private mapping firms that ensure acquisition of consistent, low-cost 3D elevation data, and (3) the technical expertise, standards, and specifications required for consistent, edge-to-edge utility across multiple collection platforms and public access unfettered by individual database designs and limitations. High-quality elevation data, like that collected through 3DEP, are invaluable for assessing and documenting flood risk and communicating detailed information to both responders and planners alike. Multiple flood-mapping programs make use of USGS streamflow and 3DEP data. Flood insurance rate maps, flood documentation studies, and flood-inundation map libraries are products of these programs.
An integrative approach to cultural competence in the psychiatric curriculum.
Fung, Kenneth; Andermann, Lisa; Zaretsky, Ari; Lo, Hung-Tat
2008-01-01
As it is increasingly recognized that cultural competence is an essential quality for any practicing psychiatrist, postgraduate psychiatry training programs need to incorporate cultural competence training into their curricula. This article documents the unique approach to resident cultural competence training being developed in the Department of Psychiatry at the University of Toronto, which has the largest residency training program in North America and is situated in an ethnically diverse city and country. The authors conducted a systematic review of cultural competence by searching databases including PubMed, PsycINFO, PsycArticles, CINAHL, Social Science Abstracts, and Sociological Abstracts; by searching government and professional association publications; and through on-site visits to local cross-cultural training programs. Based on the results of the review, a resident survey, and a staff retreat, the authors developed a deliberate "integrative" approach with a mindful, balanced emphasis on both generic and specific cultural competencies. Learning objectives were derived from integrating the seven core competencies of a physician as defined by the Canadian Medical Education Directions for Specialists (CanMEDS) roles framework with the tripartite model of attitudes, knowledge, and skills. The learning objectives and teaching program were further integrated across different psychiatric subspecialties and across the successive years of residency. Another unique strategy used to foster curricular and institutional change was the program's emphasis on evaluation, making use of insights from modern educational theories such as formative feedback and blueprinting. Course evaluations of the core curriculum from the first group of residents were positive. The authors propose that these changes to the curriculum may lead to enhanced cultural competence and clinical effectiveness in health care.
Citric Acid Alternative to Nitric Acid Passivation
NASA Technical Reports Server (NTRS)
Lewis, Pattie L. (Compiler)
2013-01-01
The Ground Systems Development and Operations (GSDO) Program at NASA John F. Kennedy Space Center (KSC) has the primary objective of modernizing and transforming the launch and range complex at KSC to benefit current and future NASA programs along with other emerging users. Described as the launch support and infrastructure modernization program in the NASA Authorization Act of 2010, the GSDO Program will develop and implement shared infrastructure and process improvements to provide more flexible, affordable, and responsive capabilities to a multi-user community. In support of the GSDO Program, the purpose of this project is to demonstrate/validate citric acid as a passivation agent for stainless steel. Successful completion of this project will result in citric acid being qualified for use as an environmentally preferable alternative to nitric acid for passivation of stainless steel alloys in NASA and DoD applications.
Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C
2002-04-01
During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.
Mina, Cheraghi Niroumand; Farzaei, Mohammad Hosein; Gholamreza, Amin
2015-02-01
To review the pharmacological activities of Peganum harmala L. (P. harmala, Nitrariaceae) in traditional Iranian medicine (TIM) and modern phytotherapy. Opinions of TIM and modern phytotherapy about the safety and acceptable dosage of this plant are discussed. Various medicinal properties of P. harmala were collected from important TIM references and added to scientific reports derived from modern medical databases such as PubMed, Scirus, ScienceDirect and Scopus. The main medicinal part of the plant is the seed. In TIM resources, this plant possesses various pharmacological activities such as carminative, galactagogue, diuretic, emmenagogue, antithrombotic and analgesic. In modern phytotherapy, P. harmala has demonstrated numerous medicinal effects including cardiovascular, neurologic, antimicrobial, insecticidal, antineoplastic, antiproliferative, gastrointestinal and antidiabetic effects. Adverse events such as neuro-sensorial symptoms, visual hallucination, bradycardia, hypotension, agitation, tremors, ataxia, abortion and vomiting mean that this plant should be used cautiously. P. harmala is contraindicated during pregnancy because of its abortive and mutagenic activities. Because it increases the expression of CYP1A2, 2C19, and 3A4 and inhibits monoamine oxidase, the pharmacokinetic parameters of drugs that are mainly metabolized by these enzymes may be affected by P. harmala. The medicinal properties attributed to this plant in TIM are compared with those shown in modern phytotherapy. Some of the TIM properties have been confirmed in modern phytotherapy, such as emetic and analgesic activities, and some have not yet been evaluated, such as its therapeutic effects on paralysis, epilepsy and numbness. Finally, the current review provides evidence for other researchers to use the TIM properties of P. harmala as an efficacious natural drug. Further preclinical and clinical studies to adequately evaluate safety and therapeutic efficacy are recommended.
The First Modern Human Dispersals across Africa
Rito, Teresa; Richards, Martin B.; Fernandes, Verónica; Alshamali, Farida; Cerny, Viktor
2013-01-01
The emergence of more refined chronologies for climate change and archaeology in prehistoric Africa, and for the evolution of human mitochondrial DNA (mtDNA), now makes it feasible to test more sophisticated models of early modern human dispersals suggested by mtDNA distributions. Here we have generated 42 novel whole-mtDNA genomes belonging to haplogroup L0, the most divergent clade in the maternal line of descent, and analysed them alongside the growing database of African lineages belonging to L0’s sister clade, L1’6. We propose that the last common ancestor of modern human mtDNAs (carried by “mitochondrial Eve”) possibly arose in central Africa ~180 ka, at a time of low population size. By ~130 ka two distinct groups of anatomically modern humans co-existed in Africa: broadly, the ancestors of many modern-day Khoe and San populations in the south and a second central/eastern African group that includes the ancestors of most extant worldwide populations. Early modern human dispersals correlate with climate changes, particularly the tropical African “megadroughts” of MIS 5 (marine isotope stage 5, 135–75 ka) which paradoxically may have facilitated expansions in central and eastern Africa, ultimately triggering the dispersal out of Africa of people carrying haplogroup L3 ~60 ka. Two south to east migrations are discernible within haplogroup L0. One, between 120 and 75 ka, represents the first unambiguous long-range modern human dispersal detected by mtDNA and might have allowed the dispersal of several markers of modernity. A second one, within the last 20 ka signalled by L0d, may have been responsible for the spread of southern click-consonant languages to eastern Africa, contrary to the view that these eastern examples constitute relicts of an ancient, much wider distribution. PMID:24236171
Amick, G D
1999-01-01
A database containing names of mass spectral data files generated in a forensic toxicology laboratory, and two Microsoft Visual Basic programs to maintain and search this database, are described. The data files (approximately 0.5 KB each) were collected from six mass spectrometers during routine casework. Data files were archived on 650 MB (74 min) recordable CD-ROMs. Each recordable CD-ROM was given a unique name, and its list of data file names was placed into the database. The present manuscript describes the use of the search and maintenance programs for searching and routine upkeep of the database and creation of CD-ROMs for archiving of data files.
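A minimal sketch of the archive-index idea, written in Python rather than the Visual Basic used by the laboratory, might pair each data-file name with the CD-ROM it was written to and search by substring; the table and column names are hypothetical.

    # Index data-file names against the disc they were archived to, then search by pattern.
    import sqlite3

    def add_disc(conn, disc_name, file_names):
        """Record all data-file names archived onto one named CD-ROM."""
        conn.executemany("INSERT INTO archive(disc, filename) VALUES (?, ?)",
                         [(disc_name, name) for name in file_names])
        conn.commit()

    def find_files(conn, pattern):
        """Return (disc, filename) pairs whose file name contains the pattern."""
        return conn.execute(
            "SELECT disc, filename FROM archive WHERE filename LIKE ?",
            (f"%{pattern}%",)).fetchall()

    conn = sqlite3.connect("msdata_index.db")
    conn.execute("CREATE TABLE IF NOT EXISTS archive(disc TEXT, filename TEXT)")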
Configuration management program plan for Hanford site systems engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kellie, C.L.
This plan establishes the integrated management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford Site Technical Baseline.
ERIC Educational Resources Information Center
Lloyd-Strovas, Jenny D.; Arsuffi, Thomas L.
2016-01-01
We examined the diversity of environmental education (EE) in Texas, USA, by developing a framework to assess EE organizations and programs at a large scale: the Environmental Education Database of Organizations and Programs (EEDOP). This framework consisted of the following characteristics: organization/visitor demographics, pedagogy/curriculum,…
NASA Astrophysics Data System (ADS)
Lebedeva, Liudmila; Semenova, Olga
2013-04-01
One widely acknowledged problem in modern hydrological modelling is the lack of information available to investigate hydrological processes and improve their representation in models. In spite of this, one can hardly claim that the existing "traditional" data sources have already been fully analysed and exploited. In the USSR there was a network of research watersheds, called water-balance stations, where comprehensive and extensive hydrometeorological measurements were conducted according to a more or less uniform program over the last 40-60 years. The program (where it has not ceased) includes observations of discharge in several, often nested and homogeneous, small watersheds, meteorological elements, evaporation, soil temperature and moisture, snow depth, etc. The network covered different climatic and landscape zones and was established in the middle of the last century with the aim of investigating runoff formation under different conditions. Until recently the long-term observational data, accompanied by descriptions and maps, existed only in hard copy. This partly explains why these datasets are still underexploited and have very rarely, or even never, been used for hydrological modelling, although they appear much more promising than the introduction of completely new measuring techniques, without detracting from the importance of the latter. The goal of the presented work is the development of a database of observational data and supporting materials from small research watersheds across the territory of the former Soviet Union. The first version of the database will include the following information for 12 water-balance stations across Russia, Ukraine, Kazakhstan and Turkmenistan: daily values of discharge (one or several watersheds), air temperature, humidity, precipitation (one or several gauges), soil and snow state variables, and soil and snow evaporation. The stations cover desert and semi-desert, steppe and forest-steppe, forest, permafrost and mountainous zones. Supporting material will include maps of watershed boundaries and locations of observational sites. Text descriptions of the data, measuring techniques and hydrometeorological conditions for each water-balance station will accompany the datasets. The database is expected to be expanded over time, both in the number of stations (by 20) and in the data series available for each of them, and will be uploaded to the internet with open access for everyone interested. Such a database allows hydrological models and individual modules to be tested for their adequacy and applicability under different conditions and can serve as a basis for model comparison and evaluation. The database will be of particular value for models that rely not on calibration but on adequate process representation and the use of observable parameters. One such model, the process-based Hydrograph model, will be tested against the data from every watershed in the developed database. The aim of applying the Hydrograph model to as many data-rich research watersheds in different climatic zones as possible is both to improve the algorithms and to create and adjust model parameters that allow the model to be used across the geographic spectrum.
Developing a High Level Data Base to Teach Reproductive Endocrinology Using the HyperCard Program.
ERIC Educational Resources Information Center
Friedler, Yael; Shabo, Amnon
1990-01-01
Describes a database courseware using the HyperCard program on the subject of human reproductive endocrinology and feedback mechanisms. Discusses some issues concerning database courseware development. Presents several examples of the courseware display. (Author/YP)
University of Maryland MRSEC - Education: College
...(we call this type of surface a vicinal surface). Modern scanned-probe microscopes, such as the STM...
Opportune Landing Site CBR and Low-Density Laboratory Database
2008-05-01
Larry S. Danyluk, Sally A. Shoop, Rosa T. Affleck, and Wendy L. Wieder. Opportune Landing Site Program, ERDC/CRREL TR-08-9, May 2008. ... reproduce in-situ density, moisture, and CBR values and therefore do not accurately represent the complete range of these values measured in the field
How Programming Fits with Technology Education Curriculum
ERIC Educational Resources Information Center
Wright, Geoffrey A.; Rich, Peter; Leatham, Keith R.
2012-01-01
Programming is a fundamental component of modern society. Programming and its applications influence much of how people work and interact. Because of people's reliance on programming in one or many of its applications, there is a need to teach students to be programming literate. Because the purpose of the International Technology and Engineering…
Okigbo, Chinelo C; Speizer, Ilene S; Corroon, Meghan; Gueye, Abdou
2015-07-22
Family planning (FP) researchers and policy makers have often overlooked the importance of involving men in couples' fertility choices and contraception, despite the fact that male involvement is a vital factor in sexual and reproductive health programming. This study aimed to assess whether men's exposure to FP demand-generation activities is associated with their reported use of modern contraceptive methods. We used evaluation data from the Measurement, Learning & Evaluation project for the Urban Reproductive Health Initiative (URHI) in select cities of three African countries (Kenya, Nigeria, and Senegal) collected in 2012/2013. A two-stage cluster sampling design was used to select a representative sample of men in the study sites. The sample for this study includes men aged 15-59 years who had no missing data on any of the key variables: 696 men in Kenya, 2311 in Nigeria, and 1613 in Senegal. We conducted descriptive analyses and multivariate logistic regression analyses to assess the associations of interest. All analyses were weighted to account for the study design and non-response rates using Stata version 13. The proportion of men who reported use of modern contraceptive methods was 58 % in Kenya, 43 % in Nigeria, and 27 % in Senegal. About 80 % were exposed to at least one URHI demand-generation activity in each country. Certain URHI demand-generation activities were significantly associated with men's reported use of modern contraception. In Kenya, those who participated in URHI-led community events had four times higher odds of reporting use of modern methods (aOR: 3.70; p < 0.05) while in Senegal, exposure to URHI-television programs (aOR: 1.40; p < 0.05) and having heard a religious leader speak favorably about FP (aOR: 1.72; p < 0.05) were associated with modern contraceptive method use. No such associations were observed in Nigeria. Study findings are important for informing future FP program activities that seek to engage men. Program activities should be tailored by geographic context as results from this study indicate city and country-level variations. These types of gender-comprehensive and context-specific programs are likely to be the most successful at reducing unmet need for FP.
Status report for the 3D Elevation Program, 2013-2014
Lukas, Vicki; Eldridge, Diane F.; Jason, Allyson L.; Saghy, David L.; Steigerwald, Pamela R.; Stoker, Jason M.; Sugarbaker, Larry J.; Thunen, Diana R.
2015-09-25
The 3D Elevation Program (3DEP) goal is to acquire, manage, and distribute enhanced three-dimensional elevation data for the Nation and U.S. territories by 2023. This status report covers implementation activities during 2013–2014 to include meeting funding objectives, developing a management structure, modernizing systems, and collecting and producing initial 3DEP data and products. The Nation will not have complete coverage of 3DEP quality data until 2023 assuming that sufficient funding is available. In spite of the overall condition of government budgets, the 3DEP initiative has gained widespread support and had incremental budget success to include supplemental funding resulting from natural disasters. The 3DEP Executive Forum and a wide range of professional organizations are actively working to maintain support for the program. The systems that have been developed to support increasing acquisition and processing levels are largely in place. The first 3DEP quality datasets were released to the public in late 2014. In addition, light detection and ranging (lidar), interferometric synthetic aperture radar (ifsar), and digital elevation models (DEMs) acquired before 2014 are all supported within the new infrastructure and available for download. Research is ongoing to expand the suite of products and services, and to increase overall throughput and data management efficiency. Emerging technologies may result in lower acquisition costs in the future. Elevation data acquired by 3DEP partnerships will be available through The National Map representing one of the largest and most comprehensive databases publicly available for the United States.
Socrates Lives: Dialogue as a Means of Teaching and Learning
ERIC Educational Resources Information Center
Moberg, Eric M.
2008-01-01
The purpose of this paper is to argue for the ongoing use of dialogue as a modern pedagogical and andragogical method. The author reviewed 18 scholarly sources from three education databases in this literature review. The use of dialogue as a mode of instruction dates from the Socratic Method of 399 B.C.E. to present-day uses. The literature reveals…
ERIC Educational Resources Information Center
Hlavaty, Greg; Townsend, Murphy
2010-01-01
Modern composition instructors often use and teach research methods for Internet search engines and electronic databases. It is not their intent to turn back the clock. However, if they can help students connect the world of Internet searches and the university library, they can promote information literacy in its broadest sense by developing…
Delshad, Elahe; Yousefi, Mahdi; Sasannezhad, Payam; Rakhshandeh, Hasan
2018-01-01
Background Carthamus tinctorius L., known as Kafesheh (Persian) and safflower (English) is vastly utilized in Traditional Medicine for various medical conditions, namely dysmenorrhea, amenorrhea, postpartum abdominal pain and mass, trauma and pain of joints. It is largely used for flavoring and coloring purposes among the local population. Recent reviews have addressed the uses of the plant in various ethnomedical systems. Objective This review was an update to provide a summary on the botanical features, uses in Iranian folklore and modern medical applications of safflower. Methods A main database containing important early published texts written in Persian, together with electronic papers was established on ethnopharmacology and modern pharmacology of C. tinctorius. Literature review was performed on the years from 1937 to 2016 in Web of Science, PubMed, Scientific Information Database, Google Scholar, and Scopus for the terms “Kafesheh”, “safflower”, “Carthamus tinctorius”, and so forth. Results Safflower is an indispensable element of Iranian folklore medicine, with a variety of applications due to laxative effects. Also, it was recommended as treatment for rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, melancholy humor, and the like. According to the modern pharmacological and clinical examinations, safflower provides promising opportunities for the amelioration of myocardial ischemia, coagulation, thrombosis, inflammation, toxicity, cancer, and so forth. However, there have been some reports on its undesirable effects on male and female fertility. Most of these beneficial therapeutic effects were correlated to hydroxysafflor yellow A. Conclusion More attention should be drawn to the lack of a thorough phytochemical investigation. The potential implications of safflower based on Persian traditional medicine, such as the treatment of rheumatism and paralysis, vitiligo and black spots, psoriasis, mouth ulcers, phlegm humor, poisoning, numb limbs, and melancholy humor warrant further consideration.
Cybersecurity in healthcare: A systematic review of modern threats and trends.
Kruse, Clemens Scott; Frederick, Benjamin; Jacobson, Taylor; Monticone, D Kyle
2017-01-01
The adoption of healthcare technology is arduous and requires planning and implementation time. Healthcare organizations are vulnerable to modern trends and threats because the industry has not kept up with them. The objective of this systematic review is to identify cybersecurity trends, including ransomware, and to identify possible solutions by querying the academic literature. The reviewers conducted three separate searches through the CINAHL, PubMed (MEDLINE), and Nursing and Allied Health Source (via ProQuest) databases. Using keywords with Boolean operators, database filters, and hand screening, we identified 31 articles that met the objective of the review. The analysis of the 31 articles showed that the healthcare industry lags behind in security. Like other industries, healthcare should clearly define cybersecurity duties, establish clear procedures for upgrading software and handling a data breach, use VLANs, deauthentication, and cloud-based computing, and train its users not to open suspicious code. The healthcare industry is a prime target for medical information theft because it lags behind other leading industries in securing vital data. It is imperative that time and funding be invested in maintaining and ensuring the protection of healthcare technology and the confidentiality of patient information from unauthorized access.
NASA Astrophysics Data System (ADS)
Brunet, V.; Molton, P.; Bézard, H.; Deck, S.; Jacquin, L.
2012-01-01
This paper describes the results obtained during the European Union JEDI (JEt Development Investigations) project carried out in cooperation between ONERA and Airbus. The aim of these studies was first to acquire a complete database of a modern-type engine jet installation set under a wall-to-wall swept wing in various transonic flow conditions. Interactions between the engine jet, the pylon, and the wing were studied thanks to advanced measurement techniques. In parallel, accurate Reynolds-averaged Navier-Stokes (RANS) simulations were carried out, from simple ones with the Spalart-Allmaras model to more complex ones like the DRSM-SSG (Differential Reynolds Stress Model of Speziale, Sarkar and Gatski) turbulence model. In the end, Zonal Detached Eddy Simulations (Z-DES) were also performed to compare different simulation techniques. All numerical results are accurately validated thanks to the experimental database acquired in parallel. This complete and complex study of a modern civil aircraft engine installation allowed many upgrades in understanding and simulation methods to be obtained. Furthermore, a setup for engine jet installation studies has been validated for possible future work in the S3Ch transonic research wind tunnel. The main conclusions are summed up in this paper.
77 FR 66880 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... the database that stores information for the Lost and Stolen Securities Program. We estimate that 26... Lost and Stolen Securities Program database will be kept confidential. The Commission may not conduct... SECURITIES AND EXCHANGE COMMISSION Submission for OMB Review; Comment Request Upon Written Request...
DOT National Transportation Integrated Search
2006-05-01
Specific objectives of the Peer Exchange were: : Discuss and exchange information about databases and other software : used to support the program-cycles managed by state transportation : research offices. Elements of the program cycle include: :...
Code of Federal Regulations, 2011 CFR
2011-04-01
... purpose of funding physical and management improvements. Modernization program. A PHA's program for... substantially the same kind does qualify, but reconstruction, substantial improvement in the quality or kind of... resident participation in each of the required program components. PHMAP. The Public Housing Management...
34 CFR 662.3 - Who is eligible to receive a fellowship under this program?
Code of Federal Regulations, 2010 CFR
2010-07-01
... RESEARCH ABROAD FELLOWSHIP PROGRAM General § 662.3 Who is eligible to receive a fellowship under this..., is admitted to candidacy in a doctoral degree program in modern foreign languages and area studies at...
34 CFR 662.3 - Who is eligible to receive a fellowship under this program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... RESEARCH ABROAD FELLOWSHIP PROGRAM General § 662.3 Who is eligible to receive a fellowship under this..., is admitted to candidacy in a doctoral degree program in modern foreign languages and area studies at...
34 CFR 662.3 - Who is eligible to receive a fellowship under this program?
Code of Federal Regulations, 2013 CFR
2013-07-01
... RESEARCH ABROAD FELLOWSHIP PROGRAM General § 662.3 Who is eligible to receive a fellowship under this..., is admitted to candidacy in a doctoral degree program in modern foreign languages and area studies at...
34 CFR 662.3 - Who is eligible to receive a fellowship under this program?
Code of Federal Regulations, 2014 CFR
2014-07-01
... RESEARCH ABROAD FELLOWSHIP PROGRAM General § 662.3 Who is eligible to receive a fellowship under this..., is admitted to candidacy in a doctoral degree program in modern foreign languages and area studies at...
34 CFR 662.3 - Who is eligible to receive a fellowship under this program?
Code of Federal Regulations, 2012 CFR
2012-07-01
... RESEARCH ABROAD FELLOWSHIP PROGRAM General § 662.3 Who is eligible to receive a fellowship under this..., is admitted to candidacy in a doctoral degree program in modern foreign languages and area studies at...
Laboratory Resources Management in Manufacturing Systems Programs
ERIC Educational Resources Information Center
Obi, Samuel C.
2004-01-01
Most, if not all, industrial technology (IT) programs have laboratories or workshops. Often equipped with modern equipment, tools, materials, and measurement and test instruments, these facilities constitute a major investment for IT programs. Improper use or overuse of program facilities may result in dirty lab equipment, lost or damaged tools,…
Utilizing Modern Technology in Adult and Continuing Education Programs.
ERIC Educational Resources Information Center
New York State Education Dept., Albany. Bureau of Curriculum Development.
This publication, designed as a supplement to the manual entitled "Managing Programs for Adults" (1983), provides guidelines for establishing or expanding the use of video and computers by administration and staff of adult education programs. The first section presents the use of video technology for program promotion, instruction, and staff…
77 FR 43084 - Multiple Award Schedule (MAS) Program Continuous Open Season-Operational Change
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-23
... Award Schedule (MAS) Program Continuous Open Season- Operational Change AGENCY: Federal Acquisition... proposing this operational change to enhance the performance of and modernize the MAS program in three key program areas: Small business viability, operational efficiency, and cost control. The DBM will realign...
Gueye, Abdou; Speizer, Ilene S; Corroon, Meghan; Okigbo, Chinelo C
2015-12-01
Negative myths and misconceptions about family planning are a barrier to modern contraceptive use. Most research on the subject has focused on individual beliefs about contraception; however, given that myths spread easily within communities, it is also important to examine how the prevalence of negative myths in a community affects the aggregate level of method use. Baseline data collected in 2010-2011 by the Measurement, Learning & Evaluation project on women aged 15-49 living in selected cities in Kenya, Nigeria and Senegal were used. Multivariate analyses examined associations between modern contraceptive use and belief in negative myths for individuals and communities. In each country, the family planning myths most prevalent at the individual and community levels were that "people who use contraceptives end up with health problems," "contraceptives are dangerous to women's health" and "contraceptives can harm your womb." On average, women in Nigeria and Kenya believed 2.7 and 4.6 out of eight selected myths, respectively, and women in Senegal believed 2.6 out of seven. Women's individual-level belief in myths was negatively associated with their modern contraceptive use in all three countries (odds ratios, 0.2-0.7). In Nigeria, the women's community-level myth variable was positively associated with modern contraceptive use (1.6), whereas the men's community-level myth variable was negatively associated with use (0.6); neither community-level variable was associated with modern contraceptive use in Kenya or Senegal. Education programs are needed to dispel common myths and misconceptions about modern contraceptives. In Nigeria, programs that encourage community-level discussions may be effective at reducing myths and increasing modern contraceptive use.
Facilitating the Progression of Modern Apprentices into Undergraduate Business Education.
ERIC Educational Resources Information Center
Chadwick, Simon
1999-01-01
A case study of a program to give apprentices access to undergraduate business education at a British university in cooperation with a local chamber of commerce identified these success factors: recognition that modern apprentices are unlike traditional college students and focus on technology, outcome-based learning, personal development, and…
Modern Psychometrics for Assessing Achievement Goal Orientation: A Rasch Analysis
ERIC Educational Resources Information Center
Muis, Krista R.; Winne, Philip H.; Edwards, Ordene V.
2009-01-01
Background: A program of research is needed that assesses the psychometric properties of instruments designed to quantify students' achievement goal orientations to clarify inconsistencies across previous studies and to provide a stronger basis for future research. Aim: We conducted traditional psychometric and modern Rasch-model analyses of the…
Synopses for Modern Secondary School Mathematics.
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France). Directorate for Scientific Affairs.
The 1959 Royaumont seminar "New Thinking in School Mathematics," having agreed on the need for modernization, recommended that a second group of experts work out detailed synopses of the entire subject matter of secondary school mathematics. This book is the report of the second seminar and contains the Dubrovnik Program which stimulated…
Feasibility of modern airships - Preliminary assessment
NASA Technical Reports Server (NTRS)
Ardema, M. D.
1977-01-01
Attention is given to the NASA program, Feasibility Study of Modern Airships, initiated to investigate potential research and technology programs associated with airship development. A historical survey of the program is presented, including the development of past airship concepts, aerodynamical and design improvements, structure and material concepts, and research in controls, avionics, instrumentation, flight operations, and ground handling. A mission analysis was carried out which considered passenger and cargo transportation, heavy-lift, short-haul applications, surveillance missions, and the transportation of natural gas. A vehicle parametric analysis examined the entire range of airship concepts, discussing both conventional airships and hybrids. Various design options were evaluated, such as choice of structural materials, use of boundary-layer control, and choice of lifting gas.
Configuration management program plan for Hanford site systems engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, A.G.
This plan establishes the integrated configuration management program for the evolving technical baseline developed through the systems engineering process. This configuration management program aligns with the criteria identified in the DOE Standard, DOE-STD-1073-93. Included are specific requirements for control of the systems engineering RDD-100 database, and electronic data incorporated in the database that establishes the Hanford site technical baseline.
ERIC Educational Resources Information Center
Darancik, Yasemin
2016-01-01
It has been observed that data-based translation programs are often used uncritically both in and outside the classroom, and thus many problems occur in foreign language learning and teaching. To draw attention to this problem, this study reveals whether the program produces satisfactory results by making translations from…
National Rehabilitation Information Center
Search the NARIC website or one of its databases: projects conducting research and/or development (NIDILRR Program Database), and organizations, agencies, and online resources that support people ...
The Reach Address Database (RAD)
The Reach Address Database (RAD) stores reach address information for each Water Program feature that has been linked to the underlying surface water features (streams, lakes, etc.) in the National Hydrography Dataset (NHD) Plus dataset.
ACHP | Federal Agency Historic Preservation Programs and Officers
Working with Section 106; Federal, State, & Tribal Programs; Training & Education; Publications. ... foster conditions under which modern society and prehistoric and historic resources can exist in ...
[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].
Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin
2015-02-01
To realize batch processing of univariate Cox regression analysis for large databases using a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program could complete the batch processing of univariate Cox regression analysis and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
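The SAS macro itself is not reproduced in the record above; as a rough illustration of the same batch idea, the following Python sketch (using the lifelines package) fits one univariate Cox model per covariate and collects hazard ratios and P values. Column names such as os_months, os_event, and gene_cols are hypothetical.

# Minimal sketch of batch univariate Cox regression, analogous in spirit to the
# SAS macro described above (illustrative only; column names are hypothetical).
import pandas as pd
from lifelines import CoxPHFitter

def batch_univariate_cox(df, time_col, event_col, covariates):
    """Fit one univariate Cox model per covariate and collect hazard ratios and p-values."""
    rows = []
    for cov in covariates:
        cph = CoxPHFitter()
        # Each model uses only survival time, event indicator, and one covariate.
        cph.fit(df[[time_col, event_col, cov]], duration_col=time_col, event_col=event_col)
        summ = cph.summary.loc[cov]
        rows.append({"covariate": cov, "HR": summ["exp(coef)"], "p": summ["p"]})
    return pd.DataFrame(rows).sort_values("p")

# Example usage with a hypothetical expression matrix:
# results = batch_univariate_cox(surv_df, "os_months", "os_event", gene_cols)
# results.to_excel("univariate_cox_pvalues.xlsx", index=False)  # export step, as in the macro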
Concentrations of indoor pollutants (CIP) database user's manual (Version 4. 0)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apte, M.G.; Brown, S.R.; Corradi, C.A.
1990-10-01
This is the latest release of the database and the user manual. The user manual is a tutorial and reference for utilizing the CIP Database system. An installation guide is included to cover various hardware configurations. Numerous examples and explanations of the dialogue between the user and the database program are provided. It is hoped that this resource will, along with on-line help and the menu-driven software, make for a quick and easy learning curve. For the purposes of this manual, it is assumed that the user is acquainted with the goals of the CIP Database, which are: (1) to collect existing measurements of concentrations of indoor air pollutants in a user-oriented database and (2) to provide a repository of references citing measured field results openly accessible to a wide audience of researchers, policy makers, and others interested in the issues of indoor air quality. The database software, as distinct from the data, is contained in two files, CIP.EXE and PFIL.COM. CIP.EXE is made up of a number of programs written in dBase III command code and compiled using Clipper into a single, executable file. PFIL.COM is a program written in Turbo Pascal that handles the output of summary text files and is called from CIP.EXE. Version 4.0 of the CIP Database is current through March 1990.
LigandBox: A database for 3D structures of chemical compounds
Kawabata, Takeshi; Sugihara, Yusuke; Fukunishi, Yoshifumi; Nakamura, Haruki
2013-01-01
A database for the 3D structures of available compounds is essential for the virtual screening by molecular docking. We have developed the LigandBox database (http://ligandbox.protein.osaka-u.ac.jp/ligandbox/) containing four million available compounds, collected from the catalogues of 37 commercial suppliers, and approved drugs and biochemical compounds taken from KEGG_DRUG, KEGG_COMPOUND and PDB databases. Each chemical compound in the database has several 3D conformers with hydrogen atoms and atomic charges, which are ready to be docked into receptors using docking programs. The 3D conformations were generated using our molecular simulation program package, myPresto. Various physical properties, such as aqueous solubility (LogS) and carcinogenicity have also been calculated to characterize the ADME-Tox properties of the compounds. The Web database provides two services for compound searches: a property/chemical ID search and a chemical structure search. The chemical structure search is performed by a descriptor search and a maximum common substructure (MCS) search combination, using our program kcombu. By specifying a query chemical structure, users can find similar compounds among the millions of compounds in the database within a few minutes. Our database is expected to assist a wide range of researchers, in the fields of medical science, chemical biology, and biochemistry, who are seeking to discover active chemical compounds by the virtual screening. PMID:27493549
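LigandBox performs its structure searches with the kcombu program, which is not shown here; the following sketch only illustrates the general idea of a descriptor-based similarity search, using RDKit Morgan fingerprints and Tanimoto similarity over a small, invented catalogue of SMILES strings.

# Illustrative descriptor-based similarity search over a small compound catalogue.
# (LigandBox itself uses the kcombu program; this RDKit sketch only demonstrates the idea.)
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

catalogue = {          # hypothetical supplier catalogue: ID -> SMILES
    "CMP-0001": "CCO",
    "CMP-0002": "c1ccccc1O",
    "CMP-0003": "CC(=O)Oc1ccccc1C(=O)O",
}

def fingerprint(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

def search(query_smiles, min_similarity=0.5):
    """Return catalogue entries whose Tanimoto similarity to the query exceeds a threshold."""
    query_fp = fingerprint(query_smiles)
    hits = []
    for cid, smiles in catalogue.items():
        sim = DataStructs.TanimotoSimilarity(query_fp, fingerprint(smiles))
        if sim >= min_similarity:
            hits.append((cid, round(sim, 3)))
    return sorted(hits, key=lambda x: x[1], reverse=True)

print(search("c1ccccc1O"))   # phenol as a query structure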
Inferring Network Controls from Topology Using the Chomp Database
2015-12-03
AFRL-AFOSR-VA-TR-2016-0033. Inferring Network Controls from Topology Using the CHOMP Database. John Harer, Duke University. Final Report, 12/03/2015. Grant number FA9550-10-1-0436. ... area of Topological Data Analysis (TDA) and its application to dynamical systems. The role of this work in the Complex Networks program is based on ...
A Recommender System in the Cyber Defense Domain
2014-03-27
The host monitoring software is a Java-based program sending updates to the database on the sensor machine. The host monitoring program gathers information about ... 3.2.2 Database. A MySQL database located on the sensor machine acts as the storage for the sensors on the network. Snort, Nmap, vulnerability scores, and ... The machine with the IDS and the recommender is labeled "sensor". The recommender system code is written in Java and compiled using Java version 1.6.024.
1987-12-01
[Diagram residue: Figure 2, Intelligent Disk Controller (application programs, database management system, operating system, host); Figure 5, Processor-Per-Head architecture.] ... However, these additional properties have been proven in classical set and relation theory [75]. These additional properties are described here ...
Benigni, Romualdo; Battistelli, Chiara Laura; Bossa, Cecilia; Tcheremenskaia, Olga; Crettaz, Pierre
2013-07-01
Currently, the public has access to a variety of databases containing mutagenicity and carcinogenicity data. These resources are crucial for the toxicologists and regulators involved in the risk assessment of chemicals, which necessitates access to all the relevant literature and the capability to search across toxicity databases using both biological and chemical criteria. Towards the larger goal of screening chemicals for a wide range of toxicity end points of potential interest, publicly available resources across a large spectrum of biological and chemical data space must be effectively harnessed with current and evolving information technologies (i.e. systematised, integrated and mined), if long-term screening and prediction objectives are to be achieved. A key to rapid progress in the field of chemical toxicity databases is combining information technology with the chemical structure as the identifier of the molecules. This permits an enormous range of operations (e.g. retrieving chemicals or chemical classes, describing the content of databases, finding similar chemicals, crossing biological and chemical interrogations, etc.) that other, more classical databases cannot allow. This article describes the progress in the technology of toxicity databases, including the concepts of Chemical Relational Database and Toxicological Standardized Controlled Vocabularies (Ontology). It then describes the ISSTOX cluster of toxicological databases at the Istituto Superiore di Sanità. It consists of freely available databases characterised by the use of modern information technologies and by curation of the quality of the biological data. Finally, this article provides examples of analyses and results made possible by ISSTOX.
Environmentally-Preferable Launch Coatings
NASA Technical Reports Server (NTRS)
Kessel, Kurt R.
2015-01-01
The Ground Systems Development and Operations (GSDO) Program at NASA Kennedy Space Center (KSC), Florida, has the primary objective of modernizing and transforming the launch and range complex at KSC to benefit current and future NASA programs along with other emerging users. Described as the launch support and infrastructure modernization program in the NASA Authorization Act of 2010, the GSDO Program will develop and implement shared infrastructure and process improvements to provide more flexible, affordable, and responsive capabilities to a multi-user community. In support of NASA and the GSDO Program, the objective of this project is to determine the feasibility of environmentally friendly corrosion protecting coatings for launch facilities and ground support equipment (GSE). The focus of the project is corrosion resistance and survivability with the goal to reduce the amount of maintenance required to preserve the performance of launch facilities while reducing mission risk. The project compares coating performance of the selected alternatives to existing coating systems or standards.
Choosing the Right Database Management Program.
ERIC Educational Resources Information Center
Vockell, Edward L.; Kopenec, Donald
1989-01-01
Provides a comparison of four database management programs commonly used in schools: AppleWorks, the DOS 3.3 and ProDOS versions of PFS, and MECC's Data Handler. Topics discussed include information storage, spelling checkers, editing functions, search strategies, graphs, printout formats, library applications, and HyperCard. (LRW)
Exploiting a wheat EST database to assess genetic diversity.
Karakas, Ozge; Gurel, Filiz; Uncuoglu, Ahu Altinkut
2010-10-01
Expressed sequence tag (EST) markers have been used to assess variety and genetic diversity in wheat (Triticum aestivum). In this study, 1549 ESTs from wheat infested with yellow rust were used to examine the genetic diversity of six susceptible and resistant wheat cultivars. The aim of using these cultivars was to improve the competitiveness of public wheat breeding programs through the intensive use of modern, particularly marker-assisted, selection technologies. The F(2) individuals derived from cultivar crosses were screened for resistance to yellow rust at the seedling stage in greenhouses and adult stage in the field to identify DNA markers genetically linked to resistance. Five hundred and sixty ESTs were assembled into 136 contigs and 989 singletons. BlastX search results showed that 39 (29%) contigs and 96 (10%) singletons were homologous to wheat genes. The database-matched contigs and singletons were assigned to eight functional groups related to protein synthesis, photosynthesis, metabolism and energy, stress proteins, transporter proteins, protein breakdown and recycling, cell growth and division and reactive oxygen scavengers. PCR analyses with primers based on the contigs and singletons showed that the most polymorphic functional categories were photosynthesis (contigs) and metabolism and energy (singletons). EST analysis revealed considerable genetic variability among the Turkish wheat cultivars resistant and susceptible to yellow rust disease and allowed calculation of the mean genetic distance between cultivars, with the greatest similarity (0.725) being between Harmankaya99 and Sönmez2001, and the lowest (0.622) between Aytin98 and Izgi01.
13 CFR 120.800 - The purpose of the 504 program.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... 120.800 Section 120.800 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) § 120.800 The purpose of the 504 program. As authorized by Congress..., and stimulate growth, expansion, and modernization of small businesses. ...
13 CFR 120.800 - The purpose of the 504 program.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... 120.800 Section 120.800 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) § 120.800 The purpose of the 504 program. As authorized by Congress..., and stimulate growth, expansion, and modernization of small businesses. ...
13 CFR 120.800 - The purpose of the 504 program.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... 120.800 Section 120.800 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) § 120.800 The purpose of the 504 program. As authorized by Congress..., and stimulate growth, expansion, and modernization of small businesses. ...
13 CFR 120.800 - The purpose of the 504 program.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... 120.800 Section 120.800 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Development Company Loan Program (504) § 120.800 The purpose of the 504 program. As authorized by Congress..., and stimulate growth, expansion, and modernization of small businesses. ...
Accelerating Pathology Image Data Cross-Comparison on CPU-GPU Hybrid Systems
Wang, Kaibo; Huai, Yin; Lee, Rubao; Wang, Fusheng; Zhang, Xiaodong; Saltz, Joel H.
2012-01-01
As an important application of spatial databases in pathology imaging analysis, cross-comparing the spatial boundaries of a huge amount of segmented micro-anatomic objects demands extremely data- and compute-intensive operations, requiring high throughput at an affordable cost. However, the performance of spatial database systems has not been satisfactory since their implementations of spatial operations cannot fully utilize the power of modern parallel hardware. In this paper, we provide a customized software solution that exploits GPUs and multi-core CPUs to accelerate spatial cross-comparison in a cost-effective way. Our solution consists of an efficient GPU algorithm and a pipelined system framework with task migration support. Extensive experiments with real-world data sets demonstrate the effectiveness of our solution, which improves the performance of spatial cross-comparison by over 18 times compared with a parallelized spatial database approach. PMID:23355955
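The GPU algorithm and pipelined framework of the paper are not reproduced here; the sketch below only illustrates the core cross-comparison measure, an area-based Jaccard index between two segmented boundaries, using shapely and invented polygon coordinates.

# Minimal CPU-only sketch of the core cross-comparison step: overlap statistics
# between two segmented object boundaries (the paper accelerates this on GPUs/multi-core).
from shapely.geometry import Polygon

def jaccard(boundary_a, boundary_b):
    """Area-based Jaccard index between two polygonal boundaries."""
    a, b = Polygon(boundary_a), Polygon(boundary_b)
    inter = a.intersection(b).area
    union = a.union(b).area
    return inter / union if union > 0 else 0.0

# Hypothetical boundaries of the "same" nucleus segmented by two algorithms.
alg1 = [(0, 0), (10, 0), (10, 10), (0, 10)]
alg2 = [(2, 2), (12, 2), (12, 12), (2, 12)]
print(f"Jaccard similarity: {jaccard(alg1, alg2):.3f}")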
A COSTAR interface using WWW technology.
Rabbani, U.; Morgan, M.; Barnett, O.
1998-01-01
The concentration of industry on modern relational databases has left many nonrelational and proprietary databases without support for integration with new technologies. Emerging interface tools and data-access methodologies can be applied with difficulty to medical record systems which have proprietary data representation. Users of such medical record systems usually must access the clinical content of such record systems with keyboard-intensive and time-consuming interfaces. COSTAR is a legacy ambulatory medical record system developed over 25 years ago that is still popular and extensively used at the Massachusetts General Hospital. We define a model for using middle-layer services to extract and cache data from non-relational databases, and present an intuitive World-Wide Web interface to COSTAR. This model has been implemented and successfully piloted in the Internal Medicine Associates at Massachusetts General Hospital. PMID:9929310
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
Strategies for Introducing Databasing into Science.
ERIC Educational Resources Information Center
Anderson, Christopher L.
1990-01-01
Outlines techniques used in the context of a sixth grade science class to teach database structure and search strategies for science using the AppleWorks program. Provides templates and questions for class and element databases. (Author/YP)
Vacuum status-display and sector-conditioning programs
NASA Astrophysics Data System (ADS)
Skelly, J.; Yen, S.
1990-08-01
Two programs have been developed for observation and control of the AGS vacuum system, which include the following notable features: (1) they incorporate a graphical user interface and (2) they are driven by a relational database which describes the vacuum system. The vacuum system comprises some 440 devices organized into 28 vacuum sectors. The status-display program invites menu selection of a sector, interrogates the relational database for relevant vacuum devices, acquires live readbacks and posts a graphical display of their status. The sector-conditioning program likewise invites sector selection, produces the same status display and also implements process control logic on the sector devices to pump the sector down from atmospheric pressure to high vacuum over a period extending several hours. As additional devices are installed in the vacuum system, the devices are added to the relational database; these programs then automatically include the new devices.
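As a rough illustration of the database-driven pattern described above, the following sketch looks up a sector's devices in a relational database and prints their status; the table layout, column names, and readback routine are hypothetical, not the actual AGS implementation.

# Sketch of the database-driven pattern described above: the device list for a
# sector comes from a relational database, then live readbacks are posted.
# Table layout and the readback routine are hypothetical.
import sqlite3

def read_back(device_name):
    # Placeholder for the live readback of a gauge or pump; fixed values for illustration.
    return {"pressure_torr": 1.0e-7, "status": "OK"}

def show_sector_status(db_path, sector):
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT name, device_type FROM vacuum_devices WHERE sector = ? ORDER BY name",
            (sector,),
        ).fetchall()
    finally:
        conn.close()
    print(f"Vacuum sector {sector}: {len(rows)} devices")
    for name, dev_type in rows:
        rb = read_back(name)
        print(f"  {name:12s} {dev_type:10s} {rb['pressure_torr']:.1e} Torr  {rb['status']}")

# show_sector_status("ags_vacuum.db", 7)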
Why Save Your Course as a Relational Database?
ERIC Educational Resources Information Center
Hamilton, Gregory C.; Katz, David L.; Davis, James E.
2000-01-01
Describes a system that stores course materials for computer-based training programs in a relational database called Of Course! Outlines the basic structure of the databases; explains distinctions between Of Course! and other authoring languages; and describes how data is retrieved from the database and presented to the student. (Author/LRW)
First Database Course--Keeping It All Organized
ERIC Educational Resources Information Center
Baugh, Jeanne M.
2015-01-01
All Computer Information Systems programs require a database course for their majors. This paper describes an approach to such a course in which real world examples, both design projects and actual database application projects are incorporated throughout the semester. Students are expected to apply the traditional database concepts to actual…
75 FR 18255 - Passenger Facility Charge Database System for Air Carrier Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Facility Charge Database System for Air Carrier Reporting AGENCY: Federal Aviation Administration (FAA... the Passenger Facility Charge (PFC) database system to report PFC quarterly report information. In... developed a national PFC database system in order to more easily track the PFC program on a nationwide basis...
48 CFR 52.227-14 - Rights in Data-General.
Code of Federal Regulations, 2011 CFR
2011-10-01
... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...
48 CFR 52.227-14 - Rights in Data-General.
Code of Federal Regulations, 2014 CFR
2014-10-01
... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...
48 CFR 52.227-14 - Rights in Data-General.
Code of Federal Regulations, 2012 CFR
2012-10-01
... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...
48 CFR 52.227-14 - Rights in Data-General.
Code of Federal Regulations, 2013 CFR
2013-10-01
... database or database means a collection of recorded information in a form capable of, and for the purpose... enable the computer program to be produced, created, or compiled. (2) Does not include computer databases... databases and computer software documentation). This term does not include computer software or financial...
ChlamyCyc: an integrative systems biology database and web-portal for Chlamydomonas reinhardtii.
May, Patrick; Christian, Jan-Ole; Kempa, Stefan; Walther, Dirk
2009-05-04
The unicellular green alga Chlamydomonas reinhardtii is an important eukaryotic model organism for the study of photosynthesis and plant growth. In the era of modern high-throughput technologies there is an imperative need to integrate large-scale data sets from high-throughput experimental techniques using computational methods and database resources to provide comprehensive information about the molecular and cellular organization of a single organism. In the framework of the German Systems Biology initiative GoFORSYS, a pathway database and web-portal for Chlamydomonas (ChlamyCyc) was established, which currently features about 250 metabolic pathways with associated genes, enzymes, and compound information. ChlamyCyc was assembled using an integrative approach combining the recently published genome sequence, bioinformatics methods, and experimental data from metabolomics and proteomics experiments. We analyzed and integrated a combination of primary and secondary database resources, such as existing genome annotations from JGI, EST collections, orthology information, and MapMan classification. ChlamyCyc provides a curated and integrated systems biology repository that will enable and assist in systematic studies of fundamental cellular processes in Chlamydomonas. The ChlamyCyc database and web-portal is freely available under http://chlamycyc.mpimp-golm.mpg.de.
Two Student Self-Management Techniques Applied to Data-Based Program Modification.
ERIC Educational Resources Information Center
Wesson, Caren
Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…
"Hyperstat": an educational and working tool in epidemiology.
Nicolosi, A
1995-01-01
The work of a researcher in epidemiology is based on studying the literature, planning studies, gathering data, analyzing data, and writing up results. The researcher therefore needs to perform more or less simple calculations, to consult or quote the literature, to consult textbooks about certain issues or procedures, and to look up specific formulas. There are no programs conceived as a workstation to assist the different aspects of the researcher's work in an integrated fashion. A hypertextual system was developed which supports different stages of the epidemiologist's work. It combines database management, statistical analysis or planning, and literature searches. The software was developed on the Apple Macintosh using HyperCard 2.1 as a database and HyperTalk as a programming language. The program is structured in 7 "stacks" or files: Procedures; Statistical Tables; Graphs; References; Text; Formulas; Help. Each stack has its own management system with an automated table of contents. Stacks contain "cards" which make up the databases and carry executable programs. The programs are of four kinds: association; statistical procedure; formatting (input/output); and database management. The system performs general statistical procedures, procedures applicable to epidemiological studies only (follow-up and case-control), and procedures for clinical trials. All commands are given by clicking the mouse on self-explanatory "buttons". In order to perform calculations, the user only needs to enter the data into the appropriate cells and then click on the selected procedure's button. The system has a hypertextual structure. The user can go from a procedure to other cards following the preferred order of succession and according to built-in associations. The user can access different levels of knowledge or information from any stack he is consulting or operating. From every card, the user can go to a selected procedure to perform statistical calculations, to the reference database management system, to the textbook in which all procedures and issues are discussed in detail, to the database of statistical formulas with an automated table of contents, to statistical tables with an automated table of contents, or to the help module. The program has a very user-friendly interface and leaves the user free to use the same format he would use on paper. The interface does not require special skills. It reflects the Macintosh philosophy of using windows, buttons, and the mouse. This allows the user to perform complicated calculations without losing the "feel" of the data, to weigh alternatives, and to run simulations. This program shares many features in common with hypertexts. It has an underlying network database where the nodes consist of text, graphics, executable procedures, and combinations of these; the nodes in the database correspond to windows on the screen; the links between the nodes in the database are visible as "active" text or icons in the windows; the text is read by following links and opening new windows. The program is especially useful as an educational tool directed to medical and epidemiology students. The combination of computing capabilities with a textbook and databases of formulas and literature references makes the program versatile and attractive as a learning tool. The program is also helpful in the work done at the desk, where the researcher examines results, consults literature, explores different analytic approaches, plans new studies, or writes grant proposals or scientific articles.
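The statistical procedures built into Hyperstat are not listed in detail above; as a flavor of the simple case-control calculations such a tool supports, the sketch below computes an odds ratio with a 95% confidence interval from a 2x2 table (counts are invented).

# Example of a simple case-control calculation of the kind Hyperstat-style tools
# support: odds ratio with a 95% confidence interval from a 2x2 table.
import math

def odds_ratio(a, b, c, d):
    """a, b = exposed/unexposed cases; c, d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf's method
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, (lo, hi)

or_, ci = odds_ratio(a=30, b=70, c=10, d=90)   # hypothetical counts
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")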
NASA Astrophysics Data System (ADS)
Al-Mishwat, Ali T.
2016-05-01
PHASS99 is a FORTRAN program designed to retrieve and decode radiometric and other physical age information for igneous rocks contained in the international database IGBADAT (Igneous Base Data File). In the database, ages are stored in a proprietary format using mnemonic representations. The program can handle up to 99 ages per igneous rock specimen and caters to forty radiometric age systems. The radiometric age alphanumeric strings assigned to each specimen description in the database consist of four components: the numeric age and its exponential modifier, a four-character mnemonic method identification, a two-character mnemonic name of the analysed material, and the reference number in the rock group bibliography vector. For each specimen, the program searches for radiometric age strings, extracts them, parses them, decodes the different age components, and converts them to high-level English equivalents. IGBADAT and similarly structured files are used for input. The output includes three files: a flat raw ASCII text file containing the retrieved radiometric age information, a generic spreadsheet-compatible file for data import to spreadsheets, and an error file. PHASS99 builds on the older TSTPHA (Test Physical Age) decoder program and greatly expands its capabilities. PHASS99 is simple, user friendly, fast, and efficient, and does not require users to have knowledge of programming.
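The IGBADAT age encoding is proprietary and only its four components are named above, so the sketch below merely illustrates the parse-and-decode step; the field layout, mnemonic tables, and example string are invented for the illustration.

# Illustration of the decode step PHASS99 performs. The real IGBADAT encoding is
# proprietary; the field layout and mnemonic tables below are invented for the example.
import re

METHODS = {"KARG": "K-Ar", "ARAR": "40Ar/39Ar", "RBSR": "Rb-Sr", "UPBZ": "U-Pb (zircon)"}
MATERIALS = {"WR": "whole rock", "BI": "biotite", "ZR": "zircon", "PL": "plagioclase"}

# Assumed layout: "<age><E exponent> <4-char method> <2-char material> <ref #>"
AGE_RE = re.compile(
    r"^(?P<age>[\d.]+)(?:E(?P<exp>-?\d+))?\s+(?P<meth>\w{4})\s+(?P<mat>\w{2})\s+(?P<ref>\d+)$"
)

def decode_age(record):
    m = AGE_RE.match(record)
    if not m:
        raise ValueError(f"unrecognised age string: {record!r}")
    age_ma = float(m["age"]) * 10 ** int(m["exp"] or 0)
    return {
        "age_Ma": age_ma,
        "method": METHODS.get(m["meth"], m["meth"]),
        "material": MATERIALS.get(m["mat"], m["mat"]),
        "reference": int(m["ref"]),
    }

print(decode_age("2.75E2 ARAR BI 14"))   # -> 275 Ma, 40Ar/39Ar on biotite, reference 14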
Therrell, Bradford L
2003-01-01
At birth, patient demographic and health information begin to accumulate in varied databases. There are often multiple sources of the same or similar data. New public health programs are often created without considering data linkages. Recently, newborn hearing screening (NHS) programs and immunization programs have virtually ignored the existence of newborn dried blood spot (DBS) newborn screening databases containing similar demographic data, creating data duplication in their 'new' systems. Some progressive public health departments are developing data warehouses of basic, recurrent patient information, and linking these databases to other health program databases where programs and services can benefit from such linkages. Demographic data warehousing saves time (and money) by eliminating duplicative data entry and reducing the chances of data errors. While newborn screening data are usually the first data available, they should not be the only data source considered for early data linkage or for populating a data warehouse. Birth certificate information should also be considered along with other data sources for infants that may not have received newborn screening or who may have been born outside of the jurisdiction and not have birth certificate information locally available. This newborn screening serial number provides a convenient identification number for use in the DBS program and for linking with other systems. As a minimum, data linkages should exist between newborn dried blood spot screening, newborn hearing screening, immunizations, birth certificates and birth defect registries.
NASA Astrophysics Data System (ADS)
Haddam, N. A.; Michel, E.; Siani, G.; Cortese, G.; Bostock, H. C.; Duprat, J. M.; Isguder, G.
2016-06-01
We present an improved database of planktonic foraminiferal census counts from the Southern Hemisphere oceans (SHO) from 15°S to 64°S. The SHO database combines three existing databases. Using this SHO database, we investigated dissolution biases that might affect faunal census counts. We suggest a depth/ΔCO32- threshold of ~3800 m/ΔCO32- = ~ -10 to -5 µmol/kg for the Pacific and Indian Oceans and ~4000 m/ΔCO32- = ~0 to 10 µmol/kg for the Atlantic Ocean, below which core-top assemblages can be affected by dissolution and are less reliable for paleo-sea surface temperature (SST) reconstructions. We removed all core tops beyond these thresholds from the SHO database. The resulting database has 598 core tops and is able to reconstruct past SST variations from 2° to 25.5°C, with a root mean square error of 1.00°C for annual temperatures. To inspect how dissolution affects the quality of SST reconstructions, we tested the database with two "leave-one-out" tests, with and without the deep core tops. We used this database to reconstruct summer SST (SSST) over the last 20 ka for the Southeast Pacific core MD07-3100, using the Modern Analog Technique. This was compared to the SSST reconstructed using the three databases from which the SHO database was compiled, showing that the reconstruction using the SHO database is more reliable, as its dissimilarity values are the lowest. This highlights the importance of a bias-free, geographically rich database. We leave this data set open-ended for future additions; new core tops must be carefully selected, with their chronological frameworks established and evidence of dissolution assessed.
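The authors' implementation of the Modern Analog Technique is not included in the record above; the following sketch illustrates the basic idea, ranking core tops by squared chord distance to a fossil assemblage and averaging the SSTs of the closest analogs, using invented arrays.

# Sketch of the Modern Analog Technique used for the SST reconstruction above.
# Assemblages are relative abundances (rows sum to 1); arrays here are hypothetical.
import numpy as np

def mat_sst(fossil, core_top_assemblages, core_top_sst, n_analogs=5):
    """Average the SSTs of the n core tops with the smallest squared chord distance."""
    d = np.sum((np.sqrt(core_top_assemblages) - np.sqrt(fossil)) ** 2, axis=1)
    best = np.argsort(d)[:n_analogs]
    weights = 1.0 / np.maximum(d[best], 1e-6)        # distance-weighted mean
    return float(np.average(core_top_sst[best], weights=weights)), d[best]

rng = np.random.default_rng(0)
core_tops = rng.dirichlet(np.ones(20), size=598)     # 598 core tops, 20 taxa
ssts = rng.uniform(2.0, 25.5, size=598)              # matching summer SSTs
fossil_sample = rng.dirichlet(np.ones(20))
sst, dists = mat_sst(fossil_sample, core_tops, ssts)
print(f"Reconstructed SSST: {sst:.1f} °C (best dissimilarity {dists.min():.3f})")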
Moving towards the goals of FP2020 - classifying contraceptives.
Festin, Mario Philip R; Kiarie, James; Solo, Julie; Spieler, Jeffrey; Malarcher, Shawn; Van Look, Paul F A; Temmerman, Marleen
2016-10-01
With the renewed focus on family planning, a clear and transparent understanding is needed for the consistent classification of contraceptives, especially in the commonly used modern/traditional system. The World Health Organization Department of Reproductive Health and Research and the United States Agency for International Development (USAID) therefore convened a technical consultation in January 2015 to address issues related to classifying contraceptives. The consultation defined modern contraceptive methods as having a sound basis in reproductive biology, a precise protocol for correct use, and evidence of efficacy under various conditions based on appropriately designed studies. Methods used in country programs, such as fertility-awareness-based methods [including the Standard Days Method (SDM) and TwoDay Method], the Lactational Amenorrhea Method (LAM), and emergency contraception, should be reported as modern. Herbs, charms, and vaginal douching are not counted as contraceptive methods, as they have no scientific basis in preventing pregnancy and are not included in country programs. More research is needed on defining and measuring the use of emergency contraceptive methods, to reflect their contribution to reducing unmet need. The ideal contraceptive classification system should be simple, easy to use, clear, and consistent, with greater parsimony. Measurement challenges remain but should not be the driving force in determining which methods are counted or reported as modern. Family planning programs should consider multiple attributes of contraceptive methods (e.g., level of effectiveness, need for program support, duration of labeled use, hormonal or nonhormonal) to ensure they provide a variety of methods to meet the needs of women and men. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kravchenko, Iulia; Luhmann, Thomas; Shults, Roman
2016-06-01
For the preparation of modern specialists in the acquisition and processing of three-dimensional data, a broad and detailed study of the relevant modern methods and technologies is necessary. One of the most progressive and effective methods for acquiring and analyzing spatial data is terrestrial laser scanning. The study of methods and technologies for terrestrial laser scanning is of great importance not only for GIS specialists, but also for surveying engineers who make decisions in traditional engineering tasks (monitoring, executive surveys, etc.). Forming the right approach to preparing new professionals requires the development of a modern and flexible educational program. This educational program must provide effective practical and laboratory work as well as student coursework. The knowledge gained from such study should form the basis for the practical or research work of young engineers. In 2014, the Institute of Applied Sciences (Jade University Oldenburg, Germany) and Kyiv National University of Construction and Architecture (Kiev, Ukraine) launched a joint educational project on the introduction of terrestrial laser scanning technology for the collection and processing of spatial data. As a result of this project, practical recommendations have been developed for the organization of educational processes in the use of terrestrial laser scanning. An advanced project-oriented educational program was developed, which is presented in this paper. In order to demonstrate the effectiveness of the program, a 3D model of the large and complex main campus of Kyiv National University of Construction and Architecture has been generated.
ERIC Educational Resources Information Center
Albright, C. E.; Smith, Kenneth
2006-01-01
This article discusses a collaborative program between schools with the purpose of training and providing advanced education in welding. Modern manufacturing is turning to automation to increase productivity, but it can be a great challenge to program robots and other computer-controlled welding and joining systems. Computer programming and…
ERIC Educational Resources Information Center
Ontario Dept. of Education, Toronto. School Planning and Building Research Section.
Modern educational programs stress individualized learning. Such programs place less emphasis on prescribed texts and more emphasis on learning from a variety of resources suited to the interests and learning abilities of the individual. Thus, these programs require an abundance of readily available learning materials. The school media center,…
University of Maryland MRSEC - Education: Professional Development for
"stepped" (we call this type of surface a vicinal surface). Modern scanned-probe microscopes International Educational Education Pre-College Programs Homeschool Programs Undergraduate & Graduate Facilities Logos MRSEC Templates Opportunities Search Home » Education » Teacher Programs Professional
University of Maryland MRSEC - Research: Publications
... (we call this type of surface a vicinal surface). Modern scanned-probe microscopes, such as the STM ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poliakov, Alexander; Couronne, Olivier
2002-11-04
Aligning large vertebrate genomes that are structurally complex poses a variety of problems not encountered on smaller scales. Such genomes are rich in repetitive elements and contain multiple segmental duplications, which increases the difficulty of identifying truly orthologous DNA segments in alignments. The sizes of the sequences make many alignment algorithms designed for comparing single proteins extremely inefficient when processing large genomic intervals. We integrated both local and global alignment tools and developed a suite of programs for automatically aligning large vertebrate genomes and identifying conserved non-coding regions in the alignments. Our method uses the BLAT local alignment program to find anchors on the base genome that identify regions of possible homology for a query sequence. These regions are postprocessed to find the best candidates, which are then globally aligned using the AVID global alignment program. In the last step, conserved non-coding segments are identified using VISTA. Our methods are fast, and the resulting alignments exhibit a high degree of sensitivity, covering more than 90% of known coding exons in the human genome. The GenomeVISTA software is a suite of Perl programs built on a MySQL database platform. The scheduler gets control data from the database, builds a queue of jobs, and dispatches them to a PC cluster for execution. The main program, running on each node of the cluster, processes individual sequences. A Perl library acts as an interface between the database and the above programs. The use of a separate library allows the programs to function independently of the database schema. The library also improves on the standard Perl MySQL database interface package by providing auto-reconnect functionality and improved error handling.
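The BLAT/AVID/VISTA pipeline itself is not reproduced here; the toy sketch below only illustrates the anchor-then-refine strategy, using shared k-mers to bound the candidate region of the base sequence that would then be handed to a global aligner. Sequences, k-mer size, and padding are invented.

# Toy illustration of the anchor-then-refine strategy described above: shared k-mers
# locate a candidate region on the base sequence, which would then be passed to a
# global aligner (BLAT/AVID in the real pipeline). Sequences and window size are invented.
from collections import defaultdict

def kmer_index(seq, k=8):
    index = defaultdict(list)
    for i in range(len(seq) - k + 1):
        index[seq[i:i + k]].append(i)
    return index

def candidate_region(base, query, k=8, pad=50):
    """Use k-mer anchor hits to bound the region of the base genome to align globally."""
    index = kmer_index(base, k)
    hits = [pos for i in range(len(query) - k + 1) for pos in index.get(query[i:i + k], [])]
    if not hits:
        return None
    start, end = max(min(hits) - pad, 0), min(max(hits) + k + pad, len(base))
    return start, end   # this slice would be handed to the global aligner

base = "ACGT" * 500 + "TTGACCATGGCTAAGCCGTA" + "ACGT" * 500
query = "TTGACCATGGCTAAGCCGTA"
print(candidate_region(base, query))   # -> (1950, 2070)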
Post-Modern Software Development
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2005-01-01
The history of software development includes elements of art, science, engineering, and fashion(though very little manufacturing). In all domains, old ideas give way or evolve to new ones: in the fine arts, the baroque gave way to rococo, romanticism, modernism, postmodernism, and so forth. What is the postmodern programming equivalent? That is, what comes after object orientation?
Modern Problems: Sociology Units. An Experimental Program for Grade 12.
ERIC Educational Resources Information Center
Carlson, Marshall; Fennig, Lois
GRADES OR AGES: Grade 12. SUBJECT MATTER: Sociology; modern problems. ORGANIZATION AND PHYSICAL APPEARANCE: The guide contains two units, one on the problems of minority groups and the other on social pathology. Sub-sections of unit 2 include crime and criminals, criminal investigation, gun control, U.S. criminal law, criminal procedure,…
Twentieth Century Modern Language Teaching: Sources and Readings.
ERIC Educational Resources Information Center
Newmark, Maxim, Ed.
One hundred and twenty-two readings from sources published between 1900 and 1947 cover aspects of language teaching in the United States. Chapters on the history of modern language teaching and on programs, projects, and activities are particularly lengthy. Other chapters discuss values of foreign language study, foreign language in the general…
1988-05-01
The project will not be implemented due to unfavorable changes in foreign exchange rates (which increase the equipment costs) and a reduction in the marketing forecast…
Move Over, Word Processors--Here Come the Databases.
ERIC Educational Resources Information Center
Olds, Henry F., Jr.; Dickenson, Anne
1985-01-01
Discusses the use of beginning, intermediate, and advanced databases for instructional purposes. A table listing seven databases with information on ease of use, smoothness of operation, data capacity, speed, source, and program features is included. (JN)
Ocean Drilling Program: Janus Web Database
ODP and IODP data are stored in the Janus Web Database as time permits (see Database Overview for available data). Data are available to everyone.
CANCER PREVENTION AND CONTROL (CP) DATABASE
This database focuses on breast, cervical, skin, and colorectal cancer emphasizing the application of early detection and control program activities and risk reduction efforts. The database provides bibliographic citations and abstracts of various types of materials including jou...
Definition and maintenance of a telemetry database dictionary
NASA Technical Reports Server (NTRS)
Knopf, William P. (Inventor)
2007-01-01
A telemetry dictionary database includes a component for receiving spreadsheet workbooks of telemetry data over a web-based interface from other computer devices. Another component routes the spreadsheet workbooks to a specified directory on the host processing device. A process then checks the received spreadsheet workbooks for errors, and if no errors are detected, the spreadsheet workbooks are routed to another directory to await initiation of a remote database loading process. The loading process first converts the spreadsheet workbooks to comma-separated value (CSV) files. Next, a network connection with the computer system that hosts the telemetry dictionary database is established and the CSV files are ported to the computer system that hosts the telemetry dictionary database. This is followed by remote initiation of a database loading program. Upon completion of loading, a flatfile generation program is manually initiated to generate a flatfile to be used in a mission operations environment by the core ground system.
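As a rough illustration of the workbook-to-CSV conversion step in the loading process above, the following sketch flattens one worksheet into a CSV file. It is not the patented ground-system software; the use of openpyxl, the single-sheet layout, and the file paths are assumptions.

```python
# Minimal sketch of converting a telemetry spreadsheet workbook to CSV before
# it is ported to the database host (assumes .xlsx input and a single sheet).
import csv
from openpyxl import load_workbook

def workbook_to_csv(xlsx_path: str, csv_path: str) -> None:
    """Flatten the first worksheet of a workbook into a comma-separated file."""
    ws = load_workbook(xlsx_path, read_only=True).active
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        for row in ws.iter_rows(values_only=True):
            writer.writerow(row)

# The resulting CSV files would then be transferred to the computer system
# hosting the telemetry dictionary database and loaded by the remote program.
```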
Knudsen, Keith L.; Sowers, Janet M.; Witter, Robert C.; Wentworth, Carl M.; Helley, Edward J.; Nicholson, Robert S.; Wright, Heather M.; Brown, Katherine H.
2000-01-01
This report presents a preliminary map and database of Quaternary deposits and liquefaction susceptibility for the nine-county San Francisco Bay region, together with a digital compendium of ground effects associated with past earthquakes in the region. The report consists of (1) a spatial database of five data layers (Quaternary deposits, quadrangle index, and three ground effects layers) and two text layers (a labels and leaders layer for Quaternary deposits and for ground effects), (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map, liquefaction interpretation, and the ground effects compendium, and (4) the database description pamphlet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a modern and regionally consistent treatment of Quaternary surficial deposits that builds on the pioneering mapping of Helley and Lajoie (Helley and others, 1979) and such intervening work as Atwater (1982), Helley and others (1994), and Helley and Graymer (1997a and b). Like these earlier studies, the current mapping uses geomorphic expression, pedogenic soils, and inferred depositional environments to define and distinguish the map units. In contrast to the twelve map units of Helley and Lajoie, however, this new map uses a complex stratigraphy of some forty units, which permits a more realistic portrayal of the Quaternary depositional system. The two colored maps provide a regional summary of the new mapping at a scale of 1:275,000, a scale that is sufficient to show the general distribution and relationships of the map units but cannot distinguish the more detailed elements that are present in the database. The report is the product of years of cooperative work by the USGS National Earthquake Hazards Reduction Program (NEHRP) and National Cooperative Geologic Mapping Program, William Lettis & Associates, Inc. (WLA) and, more recently, by the California Division of Mines and Geology as well. An earlier version was submitted to the Geological Survey by WLA as a final report for a NEHRP grant (Knudsen and others, 2000). The mapping has been carried out by WLA geologists under contract to the NEHRP Earthquake Program (Grants #14-08-0001-G2129, 1434-94-G-2499, 1434-HQ-97-GR-03121, and 99-HQ-GR-0095) and with other limited support from the County of Napa, and recently also by the California Division of Mines and Geology. The current map consists of this new mapping and revisions of previous USGS mapping.
DeAngelo, Jacob
1983-01-01
GEOTHERM is a comprehensive system of public databases and software used to store, locate, and evaluate information on the geology, geochemistry, and hydrology of geothermal systems. Three main databases address the general characteristics of geothermal wells and fields, and the chemical properties of geothermal fluids; the last database is currently the most active. System tasks are divided into four areas: (1) data acquisition and entry, involving data entry via word processors and magnetic tape; (2) quality assurance, including the criteria and standards handbook and front-end data-screening programs; (3) operation, involving database backups and information extraction; and (4) user assistance, preparation of such items as application programs, and a quarterly newsletter. The principal task of GEOTHERM is to provide information and research support for the conduct of national geothermal-resource assessments. The principal users of GEOTHERM are those involved with the Geothermal Research Program of the U.S. Geological Survey.
Use of a Relational Database to Support Clinical Research: Application in a Diabetes Program
Lomatch, Diane; Truax, Terry; Savage, Peter
1981-01-01
A database has been established to support conduct of clinical research and monitor delivery of medical care for 1200 diabetic patients as part of the Michigan Diabetes Research and Training Center (MDRTC). Use of an intelligent microcomputer to enter and retrieve the data and use of a relational database management system (DBMS) to store and manage data have provided a flexible, efficient method of achieving both support of small projects and monitoring overall activity of the Diabetes Center Unit (DCU). Simplicity of access to data, efficiency in providing data for unanticipated requests, ease of manipulations of relations, security and “logical data independence” were important factors in choosing a relational DBMS. The ability to interface with an interactive statistical program and a graphics program is a major advantage of this system. Our database currently provides support for the operation and analysis of several ongoing research projects.
New insights from DEM's into form, process and causality in Distributive Fluvial Systems
NASA Astrophysics Data System (ADS)
Scuderi, Louis; Weissmann, Gary; Hartley, Adrian; Kindilien, Peter
2014-05-01
Recent developments in platforms and sensors, as well as advances in our ability to access these rich data sources in near real time, present geoscientists with both opportunities and problems. We currently record raster and point cloud data about the physical world at unprecedented rates with extremely high spatial and spectral resolution. Yet the ability to extract scientifically useful knowledge from such immense data sets has lagged considerably. The interrelated fields of database creation, data mining and modern geostatistics all focus on such interdisciplinary data analysis problems. In recent years these fields have made great advances in analyzing complex real-world data such as that captured in Digital Elevation Models (DEM's) and satellite imagery and by LIDAR and other geospatially referenced data sets. However, even considering the vast increase in the use of these data sets in the past decade, these methods have enjoyed only a relatively modest penetration into the geosciences when compared to data analysis in other scientific disciplines. In part, a great deal of the current research weakness is due to the lack of a unifying conceptual approach and the failure to appreciate the value of highly structured and synthesized compilations of data, organized in user-friendly formats. We report on the application of these new technologies and database approaches to global scale parameterization of Distributive Fluvial Systems (DFS) within continental sedimentary basins and illustrate the value of well-constructed databases and tool-rich analysis environments for understanding form, process and causality in these systems. We analyzed the characteristics of aggradational fluvial systems in more than 700 modern continental sedimentary basins and the links between DFS within these systems and their contributing drainage basins. Our studies show that in sedimentary basins, distributive fluvial and alluvial systems dominate the depositional environment. Consequently, we have found that studies of modern tributary drainage systems in degradational settings are likely insufficient for understanding the geomorphology expressed within these basins and ultimately for understanding the basin-scale architecture of dominantly distributive fluvial deposits preserved in the rock record.
The modern library: lost and found.
Lindberg, D A
1996-01-01
The modern library, a term that was heard frequently in the mid-twentieth century, has fallen into disuse. The over-promotion of computers and all that their enthusiasts promised probably hastened its demise. Today, networking is transforming how libraries provide--and users seek--information. Although the Internet is the natural environment for the health sciences librarian, it is going through growing pains as we face issues of censorship and standards. Today's "modern librarian" must not only be adept at using the Internet but must become familiar with digital information in all its forms--images, full text, and factual data banks. Most important, to stay "modern," today's librarians must embark on a program of lifelong learning that will enable them to make optimum use of the advantages offered by modern technology. PMID:8938334
An expanded mammal mitogenome dataset from Southeast Asia.
Mohd Salleh, Faezah; Ramos-Madrigal, Jazmín; Peñaloza, Fernando; Liu, Shanlin; Mikkel-Holger, S Sinding; Riddhi, P Patel; Martins, Renata; Lenz, Dorina; Fickel, Jörns; Roos, Christian; Shamsir, Mohd Shahir; Azman, Mohammad Shahfiz; Burton, K Lim; Stephen, J Rossiter; Wilting, Andreas; Gilbert, M Thomas P
2017-08-01
Southeast (SE) Asia is 1 of the most biodiverse regions in the world, and it holds approximately 20% of all mammal species. Despite this, the majority of SE Asia's genetic diversity is still poorly characterized. The growing interest in using environmental DNA to assess and monitor SE Asian species, in particular threatened mammals, has created the urgent need to expand the available reference database of mitochondrial barcode and complete mitogenome sequences. We have partially addressed this need by generating 72 new mitogenome sequences reconstructed from DNA isolated from a range of historical and modern tissue samples. Approximately 55 gigabases of raw sequence were generated. From this data, we assembled 72 complete mitogenome sequences, with an average depth of coverage of ×102.9 and ×55.2 for modern samples and historical samples, respectively. This dataset represents 52 species, of which 30 species had no previous mitogenome data available. The mitogenomes were geotagged to their sampling location, where known, to display a detailed geographical distribution of the species. Our new database of 52 taxa will strongly enhance the utility of environmental DNA approaches for monitoring mammals in SE Asia as it greatly increases the likelihood that metabarcoding sequencing reads can be assigned to reference sequences. This magnifies the confidence in species detections and thus allows more robust surveys and monitoring programmes of SE Asia's threatened mammal biodiversity. The extensive collections of historical samples from SE Asia in western and SE Asian museums should serve as additional valuable material to further enrich this reference database. © The Author 2017. Published by Oxford University Press.
Activity computer program for calculating ion irradiation activation
NASA Astrophysics Data System (ADS)
Palmer, Ben; Connolly, Brian; Read, Mark
2017-07-01
A computer program, Activity, was developed to predict the activity and gamma lines of materials irradiated with an ion beam. It uses the TENDL (Koning and Rochman, 2012) [1] proton reaction cross section database, the Stopping and Range of Ions in Matter (SRIM) (Biersack et al., 2010) code, a Nuclear Data Services (NDS) radioactive decay database (Sonzogni, 2006) [2] and an ENDF gamma decay database (Herman and Chadwick, 2006) [3]. An extended version of Bateman's equation is used to calculate the activity at time t, and this equation is solved analytically, with the option to also solve by numeric inverse Laplace Transform as a failsafe. The program outputs the expected activity and gamma lines of the activated material.
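For reference, the classic Bateman solution for the n-th nuclide of a linear decay chain is reproduced below; the Activity code uses an extended form of this relation and, as noted above, can also evaluate it by numeric inverse Laplace transform as a failsafe.

```latex
% Classic Bateman solution for a chain N_1 -> N_2 -> ... -> N_n with only N_1
% present at t = 0 (requires amsmath for \substack); A_n is the activity.
\[
  N_n(t) = N_1(0)\left(\prod_{i=1}^{n-1}\lambda_i\right)
           \sum_{i=1}^{n}\frac{e^{-\lambda_i t}}
           {\prod_{\substack{j=1\\ j\neq i}}^{n}\left(\lambda_j-\lambda_i\right)},
  \qquad A_n(t) = \lambda_n N_n(t).
\]
```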
Cell death proteomics database: consolidating proteomics data on cell death.
Arntzen, Magnus Ø; Bull, Vibeke H; Thiede, Bernd
2013-05-03
Programmed cell death is a ubiquitous process of utmost importance for the development and maintenance of multicellular organisms. More than 10 different forms of programmed cell death have been discovered. Several proteomics analyses have been performed to gain insight into proteins involved in the different forms of programmed cell death. To consolidate these studies, we have developed the cell death proteomics (CDP) database, which comprises data from apoptosis, autophagy, cytotoxic granule-mediated cell death, excitotoxicity, mitotic catastrophe, paraptosis, pyroptosis, and Wallerian degeneration. The CDP database is available as a web-based database to compare protein identifications and quantitative information across different experimental setups. The proteomics data of 73 publications were integrated and unified with protein annotations from UniProt-KB and gene ontology (GO). Currently, more than 6,500 records of more than 3,700 proteins are included in the CDP. Comparing apoptosis and autophagy using overrepresentation analysis of GO terms, the majority of enriched processes were found in both, but some clear differences were also perceived. Furthermore, the analysis revealed differences and similarities of the proteome between autophagosomal and overall autophagy. The CDP database represents a useful tool to consolidate data from proteome analyses of programmed cell death and is available at http://celldeathproteomics.uio.no.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
..., Proposed Collection: IMLS Museum Web Database: MuseumsCount.gov AGENCY: Institute of Museum and Library... general public. Information such as name, address, phone, email, Web site, staff size, program details... Museum Web Database: MuseumsCount.gov collection. The 60-day notice for the IMLS Museum Web Database...
Gendered Obstacles Faced by Historical Women in Physics and Astronomy
NASA Astrophysics Data System (ADS)
Jones, Kristen M.
2007-12-01
A gender gap still exists in modern science; this is especially evident in the fields of physics and astronomy. The cause of such a gap is the center of debate. Is this discrepancy the result of inherent ability or socialization? Most studies have focused on modern issues and how women are socialized today. The role of historical gender perspectives and social opinions in creating the field of modern science and any discrepancies within it has not yet been explored in depth. This project investigates the obstacles faced by historical women in physics and astronomy that stem from the officialized gender biases that accompanied the establishment of modern science. Such obstacles are both formal and informal. Four women were chosen to span the three hundred year period between the standardization of the field and the modern day: Laura Bassi, Mary Somerville, Lise Meitner, and Jocelyn Bell Burnell. The investigation reveals that formal obstacles significantly decreased over the time period, while informal obstacles eroded more gradually. Obstacles also reflected historical events such as the World Wars and the Enlightenment. Trends in obstacles faced by four prominent women physicists indicate that education, finances, support networks, and social opinion played a large role in determining success in the field. The applicability to modern day physics issues and the gender gap is discussed. Many thanks to the Pathways Scholars Program and the Ronald E. McNair Post-Baccalaureate Achievement Program for funding for this project.
ERIC Educational Resources Information Center
Ali, Azad; Smith, David
2014-01-01
This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Of the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…
Expanding Omics Resources for Improvement of Soybean Seed Composition Traits
Chaudhary, Juhi; Patil, Gunvant B.; Sonah, Humira; Deshmukh, Rupesh K.; Vuong, Tri D.; Valliyodan, Babu; Nguyen, Henry T.
2015-01-01
Food resources of the modern world are strained due to the increasing population. There is an urgent need for innovative methods and approaches to augment food production. Legume seeds are major resources of human food and animal feed with their unique nutrient compositions including oil, protein, carbohydrates, and other beneficial nutrients. Recent advances in next-generation sequencing (NGS) together with “omics” technologies have considerably strengthened soybean research. The availability of well annotated soybean genome sequence along with hundreds of identified quantitative trait loci (QTL) associated with different seed traits can be used for gene discovery and molecular marker development for breeding applications. Despite the remarkable progress in these technologies, the analysis and mining of existing seed genomics data are still challenging due to the complexity of genetic inheritance, metabolic partitioning, and developmental regulations. Integration of “omics tools” is an effective strategy to discover key regulators of various seed traits. In this review, recent advances in “omics” approaches and their use in soybean seed trait investigations are presented along with the available databases and technological platforms and their applicability in the improvement of soybean. This article also highlights the use of modern breeding approaches, such as genome-wide association studies (GWAS), genomic selection (GS), and marker-assisted recurrent selection (MARS) for developing superior cultivars. A catalog of available important resources for major seed composition traits, such as seed oil, protein, carbohydrates, and yield traits are provided to improve the knowledge base and future utilization of this information in the soybean crop improvement programs. PMID:26635846
Contemporary machine learning: techniques for practitioners in the physical sciences
NASA Astrophysics Data System (ADS)
Spears, Brian
2017-10-01
Machine learning is the science of using computers to find relationships in data without explicitly knowing or programming those relationships in advance. Often without realizing it, we employ machine learning every day as we use our phones or drive our cars. Over the last few years, machine learning has found increasingly broad application in the physical sciences. This most often involves building a model relationship between a dependent, measurable output and an associated set of controllable, but complicated, independent inputs. The methods are applicable both to experimental observations and to databases of simulated output from large, detailed numerical simulations. In this tutorial, we will present an overview of current tools and techniques in machine learning - a jumping-off point for researchers interested in using machine learning to advance their work. We will discuss supervised learning techniques for modeling complicated functions, beginning with familiar regression schemes, then advancing to more sophisticated decision trees, modern neural networks, and deep learning methods. Next, we will cover unsupervised learning and techniques for reducing the dimensionality of input spaces and for clustering data. We'll show example applications from both magnetic and inertial confinement fusion. Along the way, we will describe methods for practitioners to help ensure that their models generalize from their training data to as-yet-unseen test data. We will finally point out some limitations to modern machine learning and speculate on some ways that practitioners from the physical sciences may be particularly suited to help. This work was performed by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
1993-03-25
Application of Object-Oriented Programming (OOP) and Human-Computer Interface (HCI) design principles. Knowledge gained from each topic has been incorporated and is applied to the design of a form-based interface for database data…
The Structural Ceramics Database: Technical Foundations
Munro, R. G.; Hwang, F. Y.; Hubbard, C. R.
1989-01-01
The development of a computerized database on advanced structural ceramics can play a critical role in fostering the widespread use of ceramics in industry and in advanced technologies. A computerized database may be the most effective means of accelerating technology development by enabling new materials to be incorporated into designs far more rapidly than would have been possible with traditional information transfer processes. Faster, more efficient access to critical data is the basis for creating this technological advantage. Further, a computerized database provides the means for a more consistent treatment of data, greater quality control and product reliability, and improved continuity of research and development programs. A preliminary system has been completed as phase one of an ongoing program to establish the Structural Ceramics Database system. The system is designed to be used on personal computers. Developed in a modular design, the preliminary system is focused on the thermal properties of monolithic ceramics. The initial modules consist of materials specification, thermal expansion, thermal conductivity, thermal diffusivity, specific heat, thermal shock resistance, and a bibliography of data references. Query and output programs also have been developed for use with these modules. The latter program elements, along with the database modules, will be subjected to several stages of testing and refinement in the second phase of this effort. The goal of the refinement process will be the establishment of this system as a user-friendly prototype. Three primary considerations provide the guidelines to the system’s development: (1) The user’s needs; (2) The nature of materials properties; and (3) The requirements of the programming language. The present report discusses the manner and rationale by which each of these considerations leads to specific features in the design of the system. PMID:28053397
Engel, Stacia R.; Cherry, J. Michael
2013-01-01
The first completed eukaryotic genome sequence was that of the yeast Saccharomyces cerevisiae, and the Saccharomyces Genome Database (SGD; http://www.yeastgenome.org/) is the original model organism database. SGD remains the authoritative community resource for the S. cerevisiae reference genome sequence and its annotation, and continues to provide comprehensive biological information correlated with S. cerevisiae genes and their products. A diverse set of yeast strains have been sequenced to explore commercial and laboratory applications, and a brief history of those strains is provided. The publication of these new genomes has motivated the creation of new tools, and SGD will annotate and provide comparative analyses of these sequences, correlating changes with variations in strain phenotypes and protein function. We are entering a new era at SGD, as we incorporate these new sequences and make them accessible to the scientific community, all in an effort to continue in our mission of educating researchers and facilitating discovery. Database URL: http://www.yeastgenome.org/ PMID:23487186
Hibbert, F.D.; Williams, F.H.; Fallon, S.J.; Rohling, E.J.
2018-01-01
The last deglacial was an interval of rapid climate and sea-level change, including the collapse of large continental ice sheets. This database collates carefully assessed sea-level data from peer-reviewed sources for the interval 0 to 25 thousand years ago (ka), from the Last Glacial Maximum to the present interglacial. In addition to facilitating site-specific reconstructions of past sea levels, the database provides a suite of data beyond the range of modern/instrumental variability that may help hone future sea-level projections. The database is global in scope, internally consistent, and contains U-series and radiocarbon dated indicators from both biological and geomorphological archives. We focus on far-field data (i.e., away from the sites of the former continental ice sheets), but some key intermediate (i.e., from the Caribbean) data are also included. All primary fields (i.e., sample location, elevation, age and context) possess quantified uncertainties, which—in conjunction with available metadata—allows the reconstructed sea levels to be interpreted within both their uncertainties and geological context. PMID:29809175
Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W
2014-12-01
Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
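The key idea above, coupling the decision tool to database storage so that learned parameters persist and the tool need not be recalibrated from scratch, can be sketched as follows. This is not the FDT code (which implements an optimized fuzzy ID3); the table layout, attribute names, and crisp interval rules are simplifying assumptions.

```python
# Sketch only: persist learned decision rules in a database and reuse them for
# new classifications instead of retraining. Table and attribute names assumed.
import sqlite3

def save_rules(conn, rules):
    """Store (attribute, low, high, label) rules produced by a prior training run."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS rules (attr TEXT, low REAL, high REAL, label TEXT)"
    )
    conn.executemany("INSERT INTO rules VALUES (?, ?, ?, ?)", rules)

def classify(conn, attr, value):
    """Apply the stored rules to a new observation."""
    for low, high, label in conn.execute(
        "SELECT low, high, label FROM rules WHERE attr = ?", (attr,)
    ):
        if low <= value <= high:
            return label
    return "unknown"

conn = sqlite3.connect(":memory:")
save_rules(conn, [("expression_level", 0.0, 0.5, "inactive"),
                  ("expression_level", 0.5, 1.0, "active")])
print(classify(conn, "expression_level", 0.7))  # -> "active"
```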
Relax with CouchDB--into the non-relational DBMS era of bioinformatics.
Manyam, Ganiraju; Payton, Michelle A; Roth, Jack A; Abruzzo, Lynne V; Coombes, Kevin R
2012-07-01
With the proliferation of high-throughput technologies, genome-level data analysis has become common in molecular biology. Bioinformaticians are developing extensive resources to annotate and mine biological features from high-throughput data. The underlying database management systems for most bioinformatics software are based on a relational model. Modern non-relational databases offer an alternative that has flexibility, scalability, and a non-rigid design schema. Moreover, with an accelerated development pace, non-relational databases like CouchDB can be ideal tools to construct bioinformatics utilities. We describe CouchDB by presenting three new bioinformatics resources: (a) geneSmash, which collates data from bioinformatics resources and provides automated gene-centric annotations, (b) drugBase, a database of drug-target interactions with a web interface powered by geneSmash, and (c) HapMap-CN, which provides a web interface to query copy number variations from three SNP-chip HapMap datasets. In addition to the web sites, all three systems can be accessed programmatically via web services. Copyright © 2012 Elsevier Inc. All rights reserved.
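The document-oriented access pattern described above, gene-centric records served as JSON over CouchDB's HTTP interface, can be sketched as follows. The host, database name, and document id are hypothetical placeholders rather than the published geneSmash, drugBase, or HapMap-CN endpoints.

```python
# Sketch of fetching a schema-free JSON document from a CouchDB-style HTTP API.
# Host, database name, and document id are assumptions for illustration only.
import json
from urllib.request import urlopen

def get_doc(host: str, db: str, doc_id: str) -> dict:
    """Fetch one document by id from a CouchDB database over HTTP."""
    with urlopen(f"http://{host}:5984/{db}/{doc_id}") as resp:
        return json.load(resp)

# e.g. get_doc("localhost", "genesmash", "TP53") would return the collated
# gene-centric annotations as a single JSON document.
```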
Listing of Education in Archaeological Programs: The LEAP Clearinghouse, 1989-1989 Summary Report.
ERIC Educational Resources Information Center
Knoll, Patricia C., Ed.
This catalog incorporates information gathered between 1987 and 1989 for inclusion into the National Park Service's Listing of Education in Archaeological Programs (LEAP) computerized database. This database is a listing of federal, state, local and private projects promoting positive public awareness of U.S. archaeology--prehistoric and historic,…
Unified Database Development Program. Final Report.
ERIC Educational Resources Information Center
Thomas, Everett L., Jr.; Deem, Robert N.
The objective of the unified database (UDB) program was to develop an automated information system that would be useful in the design, development, testing, and support of new Air Force aircraft weapon systems. Primary emphasis was on the development of: (1) a historical logistics data repository system to provide convenient and timely access to…
Vapor Compression Cycle Design Program (CYCLE_D)
National Institute of Standards and Technology Data Gateway
SRD 49 NIST Vapor Compression Cycle Design Program (CYCLE_D) (PC database for purchase) The CYCLE_D database package simulates vapor compression refrigeration cycles. It is fully compatible with REFPROP 9.0 and covers 62 single-compound refrigerants. Fluids can be used in mixtures comprising up to five components.
ERIC Educational Resources Information Center
Noell, George H.
2005-01-01
Analyses were conducted replicating pilot work examining the feasibility of using the Louisiana's educational assessment data in concert with the Louisiana Educational Assessment Data System (LEADS) database and other associated databases to assess teacher preparation programs. The degree of matching across years and the degree of matching between…
Extending the Online Public Access Catalog into the Microcomputer Environment.
ERIC Educational Resources Information Center
Sutton, Brett
1990-01-01
Describes PCBIS, a database program for MS-DOS microcomputers that features a utility for automatically converting online public access catalog search results stored as text files into structured database files that can be searched, sorted, edited, and printed. Topics covered include the general features of the program, record structure, record…
Alexander, William M; Ficarro, Scott B; Adelmant, Guillaume; Marto, Jarrod A
2017-08-01
The continued evolution of modern mass spectrometry instrumentation and associated methods represents a critical component in efforts to decipher the molecular mechanisms which underlie normal physiology and understand how dysregulation of biological pathways contributes to human disease. The increasing scale of these experiments combined with the technological diversity of mass spectrometers presents several challenges for community-wide data access, analysis, and distribution. Here we detail a redesigned version of multiplierz, our Python software library which leverages our common application programming interface (mzAPI) for analysis and distribution of proteomic data. New features include support for a wider range of native mass spectrometry file types, interfaces to additional database search engines, compatibility with new reporting formats, and high-level tools to perform post-search proteomic analyses. A GUI desktop environment, mzDesktop, provides access to multiplierz functionality through a user friendly interface. multiplierz is available for download from: https://github.com/BlaisProteomics/multiplierz; and mzDesktop is available for download from: https://sourceforge.net/projects/multiplierz/. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Availability Performance Analysis of Thermal Power Plants
NASA Astrophysics Data System (ADS)
Bhangu, Navneet Singh; Singh, Rupinder; Pahuja, G. L.
2018-03-01
This case study presents an availability evaluation method for thermal power plants, used to conduct performance analysis in the Indian environment. A generic availability model is proposed for a maintained system (thermal plants) using reliability block diagrams and fault tree analysis. The availability indices are evaluated under a realistic working environment using the inclusion-exclusion principle. A four-year failure database has been used to compute availability for different combinations of plant capacity, that is, full working state, reduced capacity, or failure state. Availability is found to be very low even at full rated capacity (440 MW), which is not acceptable, especially in the prevailing energy scenario. One probable reason for this is the difference in the age and health of existing thermal power plants, which requires attention to each unit on a case-by-case basis. The maintenance techniques in use are conventional (50 years old) and ill-suited to modern equipment, which further aggravates the problem of low availability. This study outlines a procedure for identifying critical plants, units, and subsystems and helps in deciding a preventive maintenance program.
FBI Fingerprint Image Capture System High-Speed-Front-End throughput modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rathke, P.M.
1993-09-01
The Federal Bureau of Investigation (FBI) has undertaken a major modernization effort called the Integrated Automated Fingerprint Identification System (IAFIS). This system will provide centralized identification services using automated fingerprint, subject descriptor, mugshot, and document processing. A high-speed Fingerprint Image Capture System (FICS) is under development as part of the IAFIS program. The FICS will capture digital and microfilm images of FBI fingerprint cards for input into a central database. One FICS design supports two front-end scanning subsystems, known as the High-Speed-Front-End (HSFE) and Low-Speed-Front-End, to supply image data to a common data processing subsystem. The production rate of the HSFE is critical to meeting the FBI's fingerprint card processing schedule. A model of the HSFE has been developed to help identify the issues driving the production rate, assist in the development of component specifications, and guide the evolution of an operations plan. A description of the model development is given, the assumptions are presented, and some HSFE throughput analysis is performed.
The Role of NOAA's National Data Centers in the Earth and Space Science Infrastructure
NASA Astrophysics Data System (ADS)
Fox, C. G.
2008-12-01
NOAA's National Data Centers (NNDC) provide access to long-term archives of environmental data from NOAA and other sources. The NNDCs face significant challenges in the volume and complexity of modern data sets. Data volume challenges are being addressed using more capable data archive systems such as the Comprehensive Large Array-Data Stewardship System (CLASS). Assuring data quality and stewardship is in many ways more challenging. In the past, scientists at the Data Centers could provide reasonable stewardship of data sets in their area of expertise. As staff levels have decreased and data complexity has increased, Data Centers depend on their data providers and user communities to provide high-quality metadata and feedback on data problems and improvements. This relationship requires strong partnerships between the NNDCs and academic, commercial, and international partners, as well as advanced data management and access tools that conform to established international standards when available. The NNDCs are looking to geospatial databases, interactive mapping, web services, and other Application Program Interface approaches to help preserve NNDC data and information and to make it easily available to the scientific community.
The research infrastructure of Chinese foundations, a database for Chinese civil society studies
Ma, Ji; Wang, Qun; Dong, Chao; Li, Huafang
2017-01-01
This paper provides technical details and user guidance on the Research Infrastructure of Chinese Foundations (RICF), a database of Chinese foundations, civil society, and social development in general. The structure of the RICF is deliberately designed and normalized according to the Three Normal Forms. The database schema consists of three major themes: foundations’ basic organizational profile (i.e., basic profile, board member, supervisor, staff, and related party tables), program information (i.e., program information, major program, program relationship, and major recipient tables), and financial information (i.e., financial position, financial activities, cash flow, activity overview, and large donation tables). The RICF’s data quality can be measured by four criteria: data source reputation and credibility, completeness, accuracy, and timeliness. Data records are properly versioned, allowing verification and replication for research purposes. PMID:28742065
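A compact sketch of what a layout normalized along the three themes described above might look like is given below; the table and column names are illustrative assumptions, not the actual RICF schema.

```python
# Illustrative (not actual) RICF-style schema: organizational profile, program,
# and financial themes kept in separate, related tables per the Three Normal Forms.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE foundation (
    foundation_id INTEGER PRIMARY KEY,
    name TEXT,
    province TEXT
);
CREATE TABLE program (
    program_id INTEGER PRIMARY KEY,
    foundation_id INTEGER REFERENCES foundation(foundation_id),
    title TEXT,
    year INTEGER
);
CREATE TABLE financial_position (
    foundation_id INTEGER REFERENCES foundation(foundation_id),
    year INTEGER,
    total_assets REAL,
    PRIMARY KEY (foundation_id, year)
);
""")
```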
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nancy Carlisle: NREL
This publication is one of a series of case studies of energy-efficient modern laboratories; it was prepared for "Laboratories for the 21st Century," a joint program of the Environmental Protection Agency and the U.S. DOE Federal Energy Management Program
Design and Test of Fan/Nacelle Models Quiet High-Speed Fan
NASA Technical Reports Server (NTRS)
Miller, Christopher J. (Technical Monitor); Weir, Donald
2003-01-01
The Quiet High-Speed Fan program is a cooperative effort between Honeywell Engines & Systems (formerly AlliedSignal Engines & Systems) and the NASA Glenn Research Center. Engines & Systems has designed an advanced high-speed fan that will be tested on the Ultra High Bypass Propulsion Simulator in the NASA Glenn 9 x 15 foot wind tunnel, currently scheduled for the second quarter of 2000. An Engines & Systems modern fan design will be used as a baseline. A nacelle model is provided that is characteristic of a typical, modern regional aircraft nacelle and meets all of the program test objectives.
Modernized Techniques for Dealing with Quality Data and Derived Products
NASA Astrophysics Data System (ADS)
Neiswender, C.; Miller, S. P.; Clark, D.
2008-12-01
"I just want a picture of the ocean floor in this area" is expressed all too often by researchers, educators, and students in the marine geosciences. As more sophisticated systems are developed to handle data collection and processing, the demand for quality data, and standardized products continues to grow. Data management is an invisible bridge between science and researchers/educators. The SIOExplorer digital library presents more than 50 years of ocean-going research. Prior to publication, all data is checked for quality using standardized criterion developed for each data stream. Despite the evolution of data formats and processing systems, SIOExplorer continues to present derived products in well- established formats. Standardized products are published for each cruise, and include a cruise report, MGD77 merged data, multi-beam flipbook, and underway profiles. Creation of these products is made possible by processing scripts, which continue to change with ever-evolving data formats. We continue to explore the potential of database-enabled creation of standardized products, such as the metadata-rich MGD77 header file. Database-enabled, automated processing produces standards-compliant metadata for each data and derived product. Metadata facilitates discovery and interpretation of published products. This descriptive information is stored both in an ASCII file, and a searchable digital library database. SIOExplorer's underlying technology allows focused search and retrieval of data and products. For example, users can initiate a search of only multi-beam data, which includes data-specific parameters. This customization is made possible with a synthesis of database, XML, and PHP technology. The combination of standardized products and digital library technology puts quality data and derived products in the hands of scientists. Interoperable systems enable distribution these published resources using technology such as web services. By developing modernized strategies to deal with data, Scripps Institution of Oceanography is able to produce and distribute well-formed, and quality-tested derived products, which aid research, understanding, and education.
The global record of local iron geochemical data from Proterozoic through Paleozoic basins
NASA Astrophysics Data System (ADS)
Sperling, E. A.; Wolock, C.; Johnston, D. T.; Knoll, A. H.
2013-12-01
Iron-based redox proxies represent one of the most mature tools available to sedimentary geochemists. These techniques, which benefit from decades of refinement, are based on the fact that rocks deposited under anoxic conditions tend to be enriched in highly-reactive iron. However, there are myriad local controls on the development of anoxia, and no local section is an exemplar for the global ocean. The global signal must thus be determined using techniques like those developed to solve an analogous problem in paleobiology: the inference of global diversity patterns through time from faunas seen in local stratigraphic sections. Here we analyze a dataset of over 4000 iron speciation measurements (including over 600 de novo analyses) to better understand redox changes from the Proterozoic through the Paleozoic Era. Preliminary database analyses yield interesting observations. We find that although anoxic water columns in the middle Proterozoic were dominantly ferruginous, there was a statistical tendency towards euxinia not seen in early Neoproterozoic or Ediacaran data. Also, we find that in the Neoproterozoic oceans, oxic depositional environments, the likely home for early animals, have exceptionally low pyrite contents, and by inference low levels of porewater sulfide. This runs contrary to notions of sulfide stress on early metazoans. Finally, the current database of iron speciation data does not support an Ediacaran or Cambrian oxygenation event. This conclusion is of course only as sharp as the ability of the Fe-proxy database to track dissolved oxygen and does not rule out the possibility of a small-magnitude change in oxygen. It does suggest, however, that if changing pO2 facilitated animal diversification it did so by a limited rise past critical ecological thresholds, such as is seen in the benthos of modern Oxygen Minimum Zones. Oxygen increase to modern levels thus becomes a Paleozoic problem, and one in need of better sampling if a database approach is to be employed.
A simple modern correctness condition for a space-based high-performance multiprocessor
NASA Technical Reports Server (NTRS)
Probst, David K.; Li, Hon F.
1992-01-01
A number of U.S. national programs, including space-based detection of ballistic missile launches, envisage putting significant computing power into space. Given sufficient progress in low-power VLSI, multichip-module packaging and liquid-cooling technologies, we will see design of high-performance multiprocessors for individual satellites. In very high speed implementations, performance depends critically on tolerating large latencies in interprocessor communication; without latency tolerance, performance is limited by the vastly differing time scales in processor and data-memory modules, including interconnect times. The modern approach to tolerating remote-communication cost in scalable, shared-memory multiprocessors is to use a multithreaded architecture, and alter the semantics of shared memory slightly, at the price of forcing the programmer either to reason about program correctness in a relaxed consistency model or to agree to program in a constrained style. The literature on multiprocessor correctness conditions has become increasingly complex, and sometimes confusing, which may hinder its practical application. We propose a simple modern correctness condition for a high-performance, shared-memory multiprocessor; the correctness condition is based on a simple interface between the multiprocessor architecture and the parallel programming system.
NASA Astrophysics Data System (ADS)
Stock, Michala K.; Stull, Kyra E.; Garvin, Heather M.; Klales, Alexandra R.
2016-10-01
Forensic anthropologists are routinely asked to estimate a biological profile (i.e., age, sex, ancestry and stature) from a set of unidentified remains. In contrast to the abundance of collections and techniques associated with adult skeletons, there is a paucity of modern, documented subadult skeletal material, which limits the creation and validation of appropriate forensic standards. Many are forced to use antiquated methods derived from small sample sizes, which given documented secular changes in the growth and development of children, are not appropriate for application in the medico-legal setting. Therefore, the aim of this project is to use multi-slice computed tomography (MSCT) data from a large, diverse sample of modern subadults to develop new methods to estimate subadult age and sex for practical forensic applications. The research sample will consist of over 1,500 full-body MSCT scans of modern subadult individuals (aged birth to 20 years) obtained from two U.S. medical examiner's offices. Statistical analysis of epiphyseal union scores, long bone osteometrics, and os coxae landmark data will be used to develop modern subadult age and sex estimation standards. This project will result in a database of information gathered from the MSCT scans, as well as the creation of modern, statistically rigorous standards for skeletal age and sex estimation in subadults. Furthermore, the research and methods developed in this project will be applicable to dry bone specimens, MSCT scans, and radiographic images, thus providing both tools and continued access to data for forensic practitioners in a variety of settings.
Wilf, Eitan
2012-01-01
In this article, I seek to complicate the distinction between imitation and creativity, which has played a dominant role in the modern imaginary and anthropological theory. I focus on a U.S. collegiate jazz music program, in which jazz educators use advanced sound technologies to reestablish immersive interaction with the sounds of past jazz masters against the backdrop of the disappearance of performance venues for jazz. I analyze a key pedagogical practice in the course of which students produce precise replications of the recorded improvisations of past jazz masters and then play them in synchrony with the recordings. Through such synchronous iconization, students inhabit and reenact the creativity epitomized by these recordings. I argue that such a practice, which I call a “ritual of creativity,” suggests a coconstitutive relationship between imitation and creativity, which has intensified under modernity because of the availability of new technologies of digital reproduction.
Charting a Path to Location Intelligence for STD Control.
Gerber, Todd M; Du, Ping; Armstrong-Brown, Janelle; McNutt, Louise-Anne; Coles, F Bruce
2009-01-01
This article describes the New York State Department of Health's GeoDatabase project, which developed new methods and techniques for designing and building a geocoding and mapping data repository for sexually transmitted disease (STD) control. The GeoDatabase development was supported through the Centers for Disease Control and Prevention's Outcome Assessment through Systems of Integrated Surveillance workgroup. The design and operation of the GeoDatabase relied upon commercial-off-the-shelf tools that other public health programs may also use for disease-control systems. This article provides a blueprint of the structure and software used to build the GeoDatabase and integrate location data from multiple data sources into the everyday activities of STD control programs.
Diet History Questionnaire: Database Utility Program
If you need to modify the standard nutrient database, a single nutrient value must be provided by gender and portion size. If you have modified the database to have fewer or greater demographic groups, nutrient values must be included for each group.
Ushijima, Masaru; Mashima, Tetsuo; Tomida, Akihiro; Dan, Shingo; Saito, Sakae; Furuno, Aki; Tsukahara, Satomi; Seimiya, Hiroyuki; Yamori, Takao; Matsuura, Masaaki
2013-03-01
Genome-wide transcriptional expression analysis is a powerful strategy for characterizing the biological activity of anticancer compounds. It is often instructive to identify gene sets involved in the activity of a given drug compound for comparison with different compounds. Currently, however, there is no comprehensive gene expression database and related application system that is; (i) specialized in anticancer agents; (ii) easy to use; and (iii) open to the public. To develop a public gene expression database of antitumor agents, we first examined gene expression profiles in human cancer cells after exposure to 35 compounds including 25 clinically used anticancer agents. Gene signatures were extracted that were classified as upregulated or downregulated after exposure to the drug. Hierarchical clustering showed that drugs with similar mechanisms of action, such as genotoxic drugs, were clustered. Connectivity map analysis further revealed that our gene signature data reflected modes of action of the respective agents. Together with the database, we developed analysis programs that calculate scores for ranking changes in gene expression and for searching statistically significant pathways from the Kyoto Encyclopedia of Genes and Genomes database in order to analyze the datasets more easily. Our database and the analysis programs are available online at our website (http://scads.jfcr.or.jp/db/cs/). Using these systems, we successfully showed that proteasome inhibitors are selectively classified as endoplasmic reticulum stress inducers and induce atypical endoplasmic reticulum stress. Thus, our public access database and related analysis programs constitute a set of efficient tools to evaluate the mode of action of novel compounds and identify promising anticancer lead compounds. © 2012 Japanese Cancer Association.
TEACHER TRAINING AND THE CLASSICS.
ERIC Educational Resources Information Center
REXINE, JOHN E.
WITH THE STUDY OF MODERN FOREIGN LANGUAGES FAR OUTSTRIPPING THE STUDY OF LATIN, IT BEHOOVES LATIN TEACHERS TO REVITALIZE THEIR TEACHER EDUCATION PROGRAMS. MORE SPECIFICALLY, NEW PROGRAMS SHOULD REFLECT THE IDEAS AND RECOMMENDATIONS OF JAMES B. CONANT, THE PLANS OF FIVE EXPLORATORY PROGRAMS IN TEACHER PREPARATION INITIATED BY THE N.Y. STATE…
Web Based Parallel Programming Workshop for Undergraduate Education.
ERIC Educational Resources Information Center
Marcus, Robert L.; Robertson, Douglass
Central State University (Ohio), under a contract with Nichols Research Corporation, has developed a World Wide Web-based workshop on high performance computing entitled "IBM SP2 Parallel Programming Workshop." The research is part of the DoD (Department of Defense) High Performance Computing Modernization Program. The research…
Military Applications of Curved Focal Plane Arrays Developed by the HARDI Program
2011-01-01
...considered one of the main founders of geometrical optics, modern photography, and cinematography. Among his inventions are the Petzval portrait lens... still be a problem. B. HARDI Program/Institute for Defense Analyses (IDA) Task: 1. HARDI Program: State-of-the-art cameras could be improved by...
Humanistic Teacher Education: An Experiment in Systematic Curriculum Innovation.
ERIC Educational Resources Information Center
Wass, Hannelore; And Others
The Florida Childhood Education Program (CEP) at the University of Florida is an innovative teacher education program whose theoretical base lies in 12 years of research in effective teaching combined with modern thinking from perceptual-humanistic psychology. This theory was given practical expression in an experimental program designed and…
The Way to Modernization: Language Ideologies and the Peace Corps English Education in Korea
ERIC Educational Resources Information Center
Lee, Chee Hye
2017-01-01
The language policies and practices embodied in the Peace Corps/Korea program (1966-1981) are the reflection and the implementation of language ideologies that interplay with the socio-historical, political, and economic contexts of Korea during the 1960s and 1970s. Concerned with a nation's modernization, Korea placed an emphasis on educational…
ERIC Educational Resources Information Center
O'Connell, William; Shupe, Margery
2007-01-01
Graduate counseling programs are proficient in training direct service providers but less able to teach the business of sustaining a community agency's services. Modern philanthropy emphasizes social advocacy by investing in change that benefits the local community and respects the diverse cultural experiences of potential clients and…
Programmed Course in Modern Literary Arabic Phonology and Script.
ERIC Educational Resources Information Center
McCarus, Ernest; Rammuny, Raji
Three sets of instructional materials for the teaching of Arabic phonology and script have been prepared on the basis of studies of (1) the phonologies of American English and Modern Literary Arabic (MLA), (2) the MLA writing system, and (3) the vocabularies of 11 Arabic textbooks used in the United States. The effectiveness of these materials was…
Update on GPS Modernization Efforts
2015-06-02
Bilateral Agreements • Adjacent Band Interference • International Committee on Global Navigation Satellite Systems (GNSS) • Department of... Receivers • Distribute PRNs for the World: 120 for US and 90 for GNSS • International Cooperation • 57 Authorized Allied Users, 25+ Years of... Cooperation • GNSS: Europe - Galileo; China - COMPASS; Russia - GLONASS; Japan - QZSS; India - IRNSS • GPS Modernization Program, Space and Missile...
Al Roy: The First Modern Strength Coach
ERIC Educational Resources Information Center
Todd, Terry
2008-01-01
This article presents a historical perspective through the story of Alvin Roy, the first modern strength coach. Roy went against the common belief in the 1950s that weight lifting made athletes slow and bulky. When the football coaches at Istrouma High School in Baton Rouge, Louisiana, allowed him to set up and supervise a weight-training program,…
NASA Astrophysics Data System (ADS)
Nakagawa, Y.; Kawahara, S.; Araki, F.; Matsuoka, D.; Ishikawa, Y.; Fujita, M.; Sugimoto, S.; Okada, Y.; Kawazoe, S.; Watanabe, S.; Ishii, M.; Mizuta, R.; Murata, A.; Kawase, H.
2017-12-01
Analyses of large ensemble data are quite useful for producing probabilistic projections of climate change effects. Ensemble data of "+2K future climate simulations" are currently produced by the Japanese national project "Social Implementation Program on Climate Change Adaptation Technology (SI-CAT)" as a part of a database for Policy Decision making for Future climate change (d4PDF; Mizuta et al. 2016) produced by the Program for Risk Information on Climate Change. Those data consist of global warming simulations and regional downscaling simulations. Because those data volumes are too large (a few petabytes) to download to a user's local computer, a user-friendly system is required to search and download the data that satisfy a user's request. We are developing "a database system for near-future climate change projections" under SI-CAT to provide functions for finding the data users need. The database system mainly consists of a relational database, a data download function, and a user interface. The relational database, using PostgreSQL, is the key component among them. Temporally and spatially compressed data are registered in the relational database. As a first step, we have developed the relational database for precipitation, temperature, and typhoon track data, as requested by SI-CAT members. The data download function, based on the Open-source Project for a Network Data Access Protocol (OPeNDAP), allows users to download temporally and spatially extracted data based on search results obtained from the relational database. We have also developed a web-based user interface for the relational database and the data download function. A prototype of the database system is currently in operational testing on our local server. The database system will be released on the Data Integration and Analysis System Program (DIAS) in fiscal year 2017. Techniques from the database system might also be useful for simulation and observational data in other research fields. We report the current status of development and some case studies of the database system for near-future climate change projections.
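As a rough sketch of the search step only, the snippet below queries a hypothetical PostgreSQL table of temporally and spatially compressed precipitation summaries; the table and column names are assumptions rather than the actual SI-CAT schema, and matching records would then be fetched at full resolution through a download service such as an OPeNDAP endpoint.

    # Minimal search against a relational database of compressed ensemble
    # statistics. Table and column names are hypothetical; the real SI-CAT
    # schema is not described in the abstract.
    import psycopg2

    def find_members(region, start_month, end_month, min_precip_mm):
        conn = psycopg2.connect(dbname="climate", user="reader", host="localhost")
        try:
            with conn.cursor() as cur:
                cur.execute(
                    """
                    SELECT ensemble_member, month, mean_precip_mm
                    FROM monthly_precip_summary
                    WHERE region = %s
                      AND month BETWEEN %s AND %s
                      AND mean_precip_mm >= %s
                    ORDER BY month
                    """,
                    (region, start_month, end_month, min_precip_mm),
                )
                return cur.fetchall()
        finally:
            conn.close()

    # Example: list ensemble members with wet months over a region of interest.
    for row in find_members("Kanto", "2030-01", "2030-12", 150.0):
        print(row)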
ERIC Educational Resources Information Center
Irwin, Gretchen; Wessel, Lark; Blackman, Harvey
2012-01-01
This case describes a database redesign project for the United States Department of Agriculture's National Animal Germplasm Program (NAGP). The case provides a valuable context for teaching and practicing database analysis, design, and implementation skills, and can be used as the basis for a semester-long team project. The case demonstrates the…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-28
... identifying individual criminal offenders and alleged offenders and consisting only of identifying data and... to 5 U.S.C. 552a(j)(2): DO .220--SIGTARP Hotline Database. DO .221--SIGTARP Correspondence Database. DO .222--SIGTARP Investigative MIS Database. DO .223--SIGTARP Investigative Files Database. DO .224...
The Design and Analysis of a Network Interface for the Multi-Lingual Database System.
1985-12-01
...Appendix: The KMS Program Specifications ... List of References ... List of Figures. Figure 1: The Multi-Lingual Database... backend Database System (MBDS). In this section, we provide an overview of both the MLDS and the MBDS to enhance the reader's understanding of the...
Technical implementation of an Internet address database with online maintenance module.
Mischke, K L; Bollmann, F; Ehmer, U
2002-01-01
The article describes the technical implementation and management of the Internet address database of the ZMK center (University of Münster Dental School), which is integrated into the "ZMK-Web" website. The editorially maintained system stays up to date primarily through an electronically organized division of labor, supported by an online maintenance module programmed in JavaScript/PHP, as well as a database-driven feedback function that offers website visitors configuration-independent direct mail windows, also programmed in JavaScript/PHP.
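Purely as an illustration of the feedback-and-review pattern the article describes (and not of the Münster JavaScript/PHP implementation), the sketch below queues a visitor's link correction in a small SQL table for later editorial action; all table and function names are hypothetical, and Python with SQLite stands in for the original stack.

    # Queue visitor feedback about a link for editorial review.
    # Names are hypothetical; this is a pattern sketch, not the ZMK-Web code.
    import sqlite3

    conn = sqlite3.connect("links.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS link_feedback (
               id INTEGER PRIMARY KEY,
               url TEXT NOT NULL,
               comment TEXT,
               status TEXT DEFAULT 'pending')"""
    )

    def submit_feedback(url, comment):
        # A visitor reports a broken or outdated address.
        conn.execute("INSERT INTO link_feedback (url, comment) VALUES (?, ?)",
                     (url, comment))
        conn.commit()

    def pending_feedback():
        # The editorial team works through the queue during maintenance.
        return conn.execute(
            "SELECT id, url, comment FROM link_feedback WHERE status = 'pending'"
        ).fetchall()

    submit_feedback("http://example.org/old-page", "Page has moved")
    print(pending_feedback())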
Ionizable side chains at catalytic active sites of enzymes.
Jimenez-Morales, David; Liang, Jie; Eisenberg, Bob
2012-05-01
Catalytic active sites of enzymes of known structure can be well defined by a modern program of computational geometry. The CASTp program was used to define and measure the volume of the catalytic active sites of 573 enzymes in the Catalytic Site Atlas database. The active sites are identified as catalytic because the amino acids they contain are known to participate in the chemical reaction catalyzed by the enzyme. Acid and base side chains are reliable markers of catalytic active sites. The catalytic active sites have 4 acid and 5 base side chains, in an average volume of 1,072 Å³. The number density of acid side chains is 8.3 M (in chemical units); the number density of basic side chains is 10.6 M. The catalytic active site of these enzymes is an unusual electrostatic and steric environment in which side chains and reactants are crowded together in a mixture more like an ionic liquid than an ideal infinitely dilute solution. The electrostatics and crowding of reactants and side chains seem likely to be important for catalytic function. In three types of analogous ion channels, simulation of crowded charges accounts for the main properties of selectivity measured in a wide range of solutions and concentrations. It seems wise to use mathematics designed to study interacting complex fluids when making models of the catalytic active sites of enzymes.
Ionizable Side Chains at Catalytic Active Sites of Enzymes
Jimenez-Morales, David; Liang, Jie
2012-01-01
Catalytic active sites of enzymes of known structure can be well defined by a modern program of computational geometry. The CASTp program was used to define and measure the volume of the catalytic active sites of 573 enzymes in the Catalytic Site Atlas database. The active sites are identified as catalytic because the amino acids they contain are known to participate in the chemical reaction catalyzed by the enzyme. Acid and base side chains are reliable markers of catalytic active sites. The catalytic active sites have 4 acid and 5 base side chains, in an average volume of 1072 Å³. The number density of acid side chains is 8.3 M (in chemical units); the number density of basic side chains is 10.6 M. The catalytic active site of these enzymes is an unusual electrostatic and steric environment in which side chains and reactants are crowded together in a mixture more like an ionic liquid than an ideal infinitely dilute solution. The electrostatics and crowding of reactants and side chains seem likely to be important for catalytic function. In three types of analogous ion channels, simulation of crowded charges accounts for the main properties of selectivity measured in a wide range of solutions and concentrations. It seems wise to use mathematics designed to study interacting complex fluids when making models of the catalytic active sites of enzymes. PMID:22484856
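For readers who want to see where number densities of this magnitude come from, the short calculation below converts a count of side chains in a pocket volume (in Å³) to molar units; note that the 8.3 M and 10.6 M figures in the abstract reflect averaging across all 573 enzymes, so this single back-of-envelope ratio reproduces only the order of magnitude.

    # Convert a count of side chains in a pocket volume (cubic angstroms) to a
    # molar number density: 1 liter = 1e27 cubic angstroms.
    N_A = 6.02214076e23  # Avogadro's number, per mole

    def molar_density(count, volume_A3):
        volume_liters = volume_A3 * 1e-27
        return count / (N_A * volume_liters)

    # Average site from the abstract: 4 acid and 5 base side chains in ~1072 A^3.
    # The paper's 8.3 M / 10.6 M come from per-enzyme averaging, so this simple
    # ratio gives the right order of magnitude rather than the exact values.
    print(round(molar_density(4, 1072.0), 1))  # acids, roughly 6 M
    print(round(molar_density(5, 1072.0), 1))  # bases, roughly 8 M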
Micronutrient Research, Programs, and Policy: From Meta-analyses to Metabolomics
Allen, Lindsay H.
2014-01-01
Micronutrient deficiencies are widespread among women and children in undernourished populations. Research has identified effective approaches to their prevention, including supplementation, fortification, and dietary and other public health interventions. These interventions have made tremendous improvements in the quality of life, health, and survival of populations around the world, yet the impact varies by nutrient, population, and the outcomes chosen that reflect nutritionally driven change. The WHO guides governments and agencies toward effective strategies to prevent micronutrient deficiencies in women and children, but these are often informed by imperfect studies with limited measures of impact and the inadequate program evaluations and survey databases produced by the nutrition community. The resulting knowledge gaps limit our ability to discern what interventions are effective, under what conditions, among whom, and perhaps most important, why. However, we are moving into an era of opportunity to apply the tools of modern nutrition science, including improved methods of assessing nutritional status, “omics,” bioarchival access, systems biology thinking, and interdisciplinary collaborations, that can deepen and broaden our understanding of how micronutrients affect health, how their deficiencies diminish human capacity, and how interventions can improve the well-being of those in need. Relevant training and greater cross-disciplinary efforts will be required to ensure a cell-to-society approach that can systematically address where, to whom, and how to provide micronutrients in the future. PMID:24829487
Porting and redesign of Geotool software system to Qt
NASA Astrophysics Data System (ADS)
Miljanovic Tamarit, V.; Carneiro, L.; Henson, I. H.; Tomuta, E.
2016-12-01
Geotool is a software system that allows a user to interactively display and process seismoacoustic data from International Monitoring System (IMS) stations. Geotool can be used to perform a number of analysis and review tasks, including data I/O, waveform filtering, quality control, component rotation, amplitude and arrival measurement and review, array beamforming, correlation, Fourier analysis, FK analysis, event review and location, particle motion visualization, polarization analysis, instrument response convolution/deconvolution, real-time display, signal to noise measurement, spectrogram, and travel time model display. The Geotool program was originally written in C using the X11/Xt/Motif libraries for graphics. It was later ported to C++. Now the program is being ported to the Qt graphics system to be more compatible with the other software in the International Data Centre (IDC). Along with this port, a redesign of the architecture is underway to achieve a separation between user interface, control, and data model elements, in line with design patterns such as Model-View-Controller. Qt is a cross-platform application framework that will allow Geotool to run easily on Linux, Mac, and Windows. The Qt environment includes modern libraries and user interfaces for standard utilities such as file and database access, printing, and inter-process communications. The Qt Widgets for Technical Applications library (QWT) provides tools for displaying standard data analysis graphics.
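The abstract's central design point is the Model-View-Controller separation between the data model, the user interface, and the control logic. The sketch below illustrates that separation in a generic way; it is not Geotool or Qt code, and every class name is hypothetical.

    # Generic Model-View-Controller separation: the model owns the data and
    # notifies observers, the view only renders, the controller translates
    # user actions into model updates. Not Geotool/Qt code; names are made up.
    class WaveformModel:
        def __init__(self, samples):
            self.samples = list(samples)
            self._listeners = []

        def subscribe(self, callback):
            self._listeners.append(callback)

        def scale(self, factor):
            self.samples = [s * factor for s in self.samples]
            for notify in self._listeners:
                notify(self.samples)

    class WaveformView:
        def render(self, samples):
            print("samples:", ", ".join(f"{s:.2f}" for s in samples))

    class WaveformController:
        def __init__(self, model, view):
            self.model = model
            model.subscribe(view.render)

        def on_gain_changed(self, factor):
            self.model.scale(factor)

    controller = WaveformController(WaveformModel([1.0, 2.0, 3.0]), WaveformView())
    controller.on_gain_changed(0.5)  # prints "samples: 0.50, 1.00, 1.50"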
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Kevin G; Yamamoto, Yukinori; Pint, Bruce A
2016-01-01
A large effort is underway under the leadership of the US DOE Fuel Cycle R&D program to develop advanced FeCrAl alloys as accident tolerant fuel (ATF) cladding to replace Zr-based alloys in light water reactors. The primary motivation is the excellent oxidation resistance of these alloys in high-temperature steam environments right up to their melting point (roughly three orders of magnitude slower oxidation kinetics than zirconium). A multifaceted effort is ongoing to rapidly advance FeCrAl alloys as a mature ATF concept. The activities span the broad spectrum of alloy development, environmental testing (high-temperature high-pressure water and elevated temperature steam), detailed mechanical characterization, material property database development, neutron irradiation, thin tube production, and multiple integral fuel test campaigns. Instead of off-the-shelf commercial alloys that might not prove optimal for the LWR fuel cladding application, a large amount of effort has been placed on alloy development to identify the optimal composition and microstructure for this application. The development program is targeting a cladding that offers performance comparable to or better than modern Zr-based alloys under normal operating and off-normal conditions. This paper provides a comprehensive overview of the systematic effort to advance nuclear-grade FeCrAl alloys as an ATF cladding in commercial LWRs.
Pan Air Geometry Management System (PAGMS): A data-base management system for PAN AIR geometry data
NASA Technical Reports Server (NTRS)
Hall, J. F.
1981-01-01
A data-base management system called PAGMS was developed to facilitate the data transfer in applications computer programs that create, modify, plot or otherwise manipulate PAN AIR type geometry data in preparation for input to the PAN AIR system of computer programs. PAGMS is composed of a series of FORTRAN callable subroutines which can be accessed directly from applications programs. Currently only a NOS version of PAGMS has been developed.
Nuclear Forensics: A Capability at Risk (Abbreviated Version)
DOE Office of Scientific and Technical Information (OSTI.GOV)
National Research Council of the National Academies
Nuclear forensics is important to our national security. Actions, including provision of appropriate funding, are needed now to sustain and improve the nation's nuclear forensics capabilities. The Department of Homeland Security (DHS), working with cooperating agencies and national laboratories, should plan and implement a sustainable, effective nuclear forensics program. Nuclear forensics is the examination and evaluation of discovered or seized nuclear materials and devices or, in cases of nuclear explosions or radiological dispersals, of detonation signals and post-detonation debris. Nuclear forensic evidence helps law enforcement and intelligence agencies work toward preventing, mitigating, and attributing a nuclear or radiological incident. This report, requested by DHS, the National Nuclear Security Administration, and the Department of Defense, makes recommendations on how to sustain and improve U.S. nuclear forensics capabilities. The United States has developed a nuclear forensics capability that has been demonstrated in real-world incidents of interdicted materials and in exercises of actions required after a nuclear detonation. The committee, however, has concerns about the program and finds that without strong leadership, careful planning, and additional funds, these capabilities will decline. Major areas of concern include: Organization. The responsibility for nuclear forensics is shared by several agencies without central authority and with no consensus on strategic requirements to guide the program. This organizational complexity hampers the program and could prove to be a major hindrance operationally. Sustainability. The nation's current nuclear forensics capabilities are available primarily because the system of laboratories, equipment, and personnel upon which they depend was developed and funded by the nuclear weapons program. However, the weapons program's funds are declining. Workforce and Infrastructure. Personnel skilled in nuclear forensics are too few and are spread too thinly. Some key facilities are in need of replacement because they are old, outdated, and not built to modern environmental, health, and safety standards. Procedures and Tools. Most nuclear forensics techniques were developed to carry out Cold War missions and to satisfy a different, less restrictive set of environmental, health, and safety standards. Some of the equipment also does not reflect today's technical capabilities. The Executive Office of the President established the National Technical Nuclear Forensics Center under the direction of the Secretary of Homeland Security, to coordinate nuclear forensics in the United States. DHS's responsibility can only be carried out with the cooperation and support of the other agencies involved. The committee recommends that DHS and the other cooperating agencies should: 1. Streamline the organizational structure, aligning authority and responsibility; and develop and issue appropriate requirements documents. 2. Issue a coordinated and integrated implementation plan for fulfilling the requirements and sustaining and improving the program's capabilities. This plan would form the basis for the agencies' multi-year program budget requests. 3. Implement a plan to build and maintain an appropriately sized and composed nuclear forensics workforce, ensuring sufficient staffing at the national laboratories and support for university research, training programs, and collaborative relationships among the national laboratories and other organizations.
4. Adapt nuclear forensics to the challenges of real emergency situations, including, for example, conducting more realistic exercises that are unannounced and that challenge regulations and procedures followed in the normal work environment, and implementing lessons learned. The national laboratories should: 5. Optimize procedures and equipment through R&D to meet program requirements. Modeling and simulation should play an increased role in research, development, and planning. The nuclear forensics community should: 6. Develop standards and procedures for nuclear forensics that are rooted in the same underlying principles that have been recommended to guide modern forensic science. DHS and the other cooperating agencies should: 7. Devise and implement a plan that enables access to relevant information in databases including classified and proprietary databases for nuclear forensics missions. The Executive Office of the President and the Department of State, working with the community of nuclear forensics experts, should: 8. Determine the classes of data and methods that are to be shared internationally and explore mechanisms to accomplish that sharing.
Journal of Chemical Education: Software.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1988
1988-01-01
Describes a chemistry software program that emulates a modern binary gradient HPLC system with reversed phase column behavior. Allows for solvent selection, adjustment of gradient program, column selection, detector selection, handling of computer sample data, and sample preparation. (MVL)
Holm, Sven; Russell, Greg; Nourrit, Vincent; McLoughlin, Niall
2017-01-01
A database of retinal fundus images, the DR HAGIS database, is presented. This database consists of 39 high-resolution color fundus images obtained from a diabetic retinopathy screening program in the UK. The NHS screening program uses service providers that employ different fundus and digital cameras. This results in a range of different image sizes and resolutions. Furthermore, patients enrolled in such programs often display other comorbidities in addition to diabetes. Therefore, in an effort to replicate the normal range of images examined by grading experts during screening, the DR HAGIS database consists of images of varying image sizes and resolutions and four comorbidity subgroups, collectively defined as the diabetic retinopathy, hypertension, age-related macular degeneration, and glaucoma image set (DR HAGIS). For each image, the vasculature has been manually segmented to provide a realistic set of images on which to test automatic vessel extraction algorithms. Modified versions of two previously published vessel extraction algorithms were applied to this database to provide some baseline measurements. A method based purely on the intensity of image pixels resulted in a mean segmentation accuracy of 95.83% ([Formula: see text]), whereas an algorithm based on Gabor filters generated an accuracy of 95.71% ([Formula: see text]).
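The baseline accuracies quoted above are pixel-wise agreement rates between an automatic binary vessel map and the manual segmentation. The sketch below shows one common way to compute that figure, optionally restricted to a field-of-view mask; whether the DR HAGIS baselines used exactly this masking is an assumption here.

    # Pixel-wise segmentation accuracy: fraction of pixels (optionally within a
    # field-of-view mask) where the automatic vessel map agrees with the manual
    # one. A standard definition; masking details are an assumption.
    import numpy as np

    def segmentation_accuracy(auto_map, manual_map, fov_mask=None):
        auto = np.asarray(auto_map, dtype=bool)
        manual = np.asarray(manual_map, dtype=bool)
        agree = auto == manual
        if fov_mask is not None:
            agree = agree[np.asarray(fov_mask, dtype=bool)]
        return float(agree.mean())

    # Toy example: 5 of 6 pixels agree, so the accuracy is about 0.83.
    print(segmentation_accuracy([[1, 0, 1], [0, 0, 1]],
                                [[1, 0, 1], [0, 1, 1]]))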
Yayac, Michael; Javandal, Mitra; Mulcahey, Mary K
2017-01-01
A substantial number of orthopaedic surgeons apply for sports medicine fellowships after residency completion. The Internet is one of the most important resources applicants use to obtain information about fellowship programs, with the program website serving as one of the most influential sources. The American Orthopaedic Society for Sports Medicine (AOSSM), San Francisco Match (SFM), and Arthroscopy Association of North America (AANA) maintain databases of orthopaedic sports medicine fellowship programs. A 2013 study evaluated the content and accessibility of the websites for accredited orthopaedic sports medicine fellowships. The purpose of this study was to reassess these websites based on the same parameters and compare the results with those of the 2013 study to determine whether any improvement has been made in fellowship website content or accessibility. This was a cross-sectional study. We reviewed all existing websites for the 95 accredited orthopaedic sports medicine fellowships included in the AOSSM, SFM, and AANA databases. Accessibility of the websites was determined by performing a Google search for each program. A total of 89 sports fellowship websites were evaluated for overall content. Websites for the remaining 6 programs could not be identified, so they were not included in content assessment. Of the 95 accredited sports medicine fellowships, 49 (52%) provided links in the AOSSM database, 89 (94%) in the SFM database, and 24 (25%) in the AANA database. Of the 89 websites, 89 (100%) provided a description of the program, 62 (70%) provided selection process information, and 40 (45%) provided a link to the SFM website. Two searches through Google were able to identify links to 88% and 92% of all accredited programs. The majority of accredited orthopaedic sports medicine fellowship programs fail to utilize the Internet to its full potential as a resource to provide applicants with detailed information about the program, which could help residents in the selection and ranking process. Orthopaedic sports medicine fellowship websites that are easily accessible through the AOSSM, SFM, AANA, or Google and that provide all relevant information for applicants would simplify the process of deciding where to apply, interview, and ultimately how to rank orthopaedic sports medicine fellowship programs for the Orthopaedic Sports Medicine Fellowship Match.
Ocean Drilling Program: Privacy Policy
The following is the privacy policy for the www-odp.tamu.edu web site. 1. Cookies are used in the Database portion of the web...
Design, Development, and Maintenance of the GLOBE Program Website and Database
NASA Technical Reports Server (NTRS)
Brummer, Renate; Matsumoto, Clifford
2004-01-01
This is a 1-year (FY 03) proposal to design and develop enhancements, implement improved efficiency and reliability, and provide responsive maintenance for the operational GLOBE (Global Learning and Observations to Benefit the Environment) Program website and database. This proposal is renewable, with a 5% annual inflation factor providing an approximate cost for the out years.
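The out-year costs in such a renewable proposal follow directly from compounding the 5% annual inflation factor; the base amount in the example below is hypothetical, since the proposal's actual figures are not given.

    # Out-year cost under a fixed annual inflation factor (5% per the proposal).
    # The base cost used in the example is hypothetical.
    def out_year_cost(base_cost, years_out, inflation=0.05):
        return base_cost * (1.0 + inflation) ** years_out

    # A notional $100,000 FY03 effort projected three years out:
    print(round(out_year_cost(100_000, 3), 2))  # 115762.5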
Listing of Education in Archaeological Programs: The LEAP Clearinghouse 1990-1991 Summary Report.
ERIC Educational Resources Information Center
Knoll, Patricia C., Ed.
This is the second catalog of the National Park Service's Listing of Education in Archaeological Programs (LEAP). It consists of the information incorporated into the LEAP computerized database between 1990 and 1991. The database is a listing of federal, state, local, and private projects promoting public awareness of U.S. archaeology including…
USDA-ARS?s Scientific Manuscript database
For nearly 20 years, the National Food and Nutrient Analysis Program (NFNAP) has expanded and improved the quantity and quality of data in US Department of Agriculture’s (USDA) food composition databases through the collection and analysis of nationally representative food samples. This manuscript d...
Database Application for a Youth Market Livestock Production Education Program
ERIC Educational Resources Information Center
Horney, Marc R.
2013-01-01
This article offers an example of a database designed to support teaching animal production and husbandry skills in county youth livestock programs. The system was used to manage production goals, animal growth and carcass data, photos and other imagery, and participant records. These were used to produce a variety of customized reports to help…
The Internet as a communication tool for orthopedic spine fellowships in the United States.
Silvestre, Jason; Guzman, Javier Z; Skovrlj, Branko; Overley, Samuel C; Cho, Samuel K; Qureshi, Sheeraz A; Hecht, Andrew C
2015-04-01
Orthopedic residents seeking additional training in spine surgery commonly use the Internet to manage their fellowship applications. Although studies have assessed the accessibility and content of Web sites in other medical specialties, none have looked at orthopedic spine fellowship Web sites (SFWs). The purpose of this study was to evaluate the accessibility of information from commonly used databases and assess the content of SFWs. This was a Web site accessibility and content evaluation study. A comprehensive list of available orthopedic spine fellowship programs was compiled by accessing program lists from the SF Match, North American Spine Society, Fellowship and Residency Electronic Interactive Database (FREIDA), and Orthopaedicsone.com (Ortho1). These databases were assessed for accessibility of information including viable links to SFWs and responsive program contacts. A Google search was used to identify SFWs not readily available on these national databases. SFWs were evaluated based on online education and recruitment content. Evaluators found 45 SFWs of 63 active programs (71%). Available SFWs were often not readily accessible from national program lists, and no program afforded a direct link to their SFW from SF Match. Approximately half of all programs responded via e-mail. Although many programs described surgical experience (91%) and research requirements (87%) during the fellowship, less than half mentioned didactic instruction (46%), journal clubs (41%), and national meetings or courses attended (28%). Evaluators found an average 45% of fellow recruitment content. Comparison of SFWs by program characteristics revealed three significant differences. Programs with greater than one fellowship position had greater online education content than programs with a single fellow (p=.022). Spine fellowships affiliated with an orthopedic residency program maintained greater education (p=.006) and recruitment (p=.046) content on their SFWs. Most orthopedic spine surgery programs underuse the Internet for fellow education and recruitment. The inaccessibility of information and paucity of content on SFWs allow for future opportunity to optimize these resources. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Saltsman, James F.
1992-01-01
This manual presents computer programs for characterizing and predicting fatigue and creep-fatigue resistance of metallic materials in the high-temperature, long-life regime for isothermal and nonisothermal fatigue. The programs use the total strain version of Strainrange Partitioning (TS-SRP). An extensive database has also been developed in a parallel effort. This database is probably the largest source of high-temperature creep-fatigue test data available in the public domain and can be used with other life prediction methods as well. This user's manual, software, and database are all in the public domain and are available through COSMIC (382 East Broad Street, Athens, GA 30602; (404) 542-3265, FAX (404) 542-4807). Two disks accompany this manual. The first disk contains the source code, executable files, and sample output from these programs. The second disk contains the creep-fatigue data in a format compatible with these programs.
Initial Results from the Variable Intensity Sonic Boom Database
NASA Technical Reports Server (NTRS)
Haering, Edward A., Jr.; Cliatt, Larry J., II; Gabrielson, Thomas; Sparrow, Victor W.; Locey, Lance L.; Bunce, Thomas J.
2008-01-01
Forty-three sonic booms were generated (a few were evanescent waves), with overpressures of 0.08 to 2.20 lbf/sq ft and rise times of about 0.7 to 50 ms. Objectives: (a) structural response of a house of modern construction; (b) sonic boom propagation code validation. Approach: (a) measure shockwave directionality; (b) determine the effect of height above ground on acoustic level; (c) generate atmospheric turbulence filter functions.