Learn about CHP technologies, including reciprocating engines, combustion turbines, steam turbines, microturbines, fuel cells, and waste heat to power. Access the Catalog of CHP Technologies and the Biomass CHP Catalog of Technologies.
Shared Technology Transfer Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, John M.; Haut, Richard C.
2008-03-07
The program established a collaborative process with domestic industries to share Navy-developed technology. It educated private sector businesses about the vast range of technologies available to them, with an initial focus on applications related to the Hydrogen, Fuel Cells and Infrastructure Technologies (Hydrogen) Program of the U.S. Department of Energy, and worked specifically to increase industry awareness of the technology resources that have been developed with taxpayer funding. NAVSEA-Carderock and the Houston Advanced Research Center teamed with Nicholls State University to catalog NAVSEA-Carderock unclassified technologies, rate their level of readiness, and establish a web-based catalog of the technologies. In particular, the catalog contains technology descriptions, including testing summaries and overviews of related presentations.
Federal Technology Catalog 1982: Summaries of practical technology
NASA Astrophysics Data System (ADS)
The catalog presents summaries of practical technology selected for commercial potential and/or promising applications to the fields of computer technology, electrotechnology, energy, engineering, life sciences, machinery and tools, manufacturing, materials, physical sciences, and testing and instrumentation. Each summary not only describes a technology, but gives a source for further information. This publication describes some 1,100 new processes, inventions, equipment, software, and techniques developed by and for dozens of Federal agencies during 1982. Included is coverage of NASA Tech Briefs, DOE Energygrams, and Army Manufacturing Notes.
NASA Technical Reports Server (NTRS)
1973-01-01
A catalog containing data pertaining to the imagery acquired by the Earth Resources Technology Satellite (ERTS) from its date of launch, July 23, 1972, through the first year of activity is presented. The catalog supersedes the previous catalog, which supplied data available through May 1973. Two listings of the imagery are included: (1) an observation identification listing, and (2) a coordinate listing of the imagery organized by geographic location.
Technology and the Online Catalog.
ERIC Educational Resources Information Center
Graham, Peter S.
1983-01-01
Discusses trends in computer technology and their use for library catalogs, noting the concept of bandwidth (describes quantity of information transmitted per given unit of time); computer hardware differences (micros, minis, maxis); distributed processing systems and databases; optical disk storage; networks; transmission media; and terminals.…
ERIC Educational Resources Information Center
Weisbrod, David L.
This booklet, one of a series of background papers for the White House Conference, explores the potential of new technologies to improve library services while reducing library costs. Separate subsections describe the application of technology to the following library functions: acquisitions, catalogs and cataloging, serials control, circulation…
The Computer Catalog: A Democratic or Authoritarian Technology?
ERIC Educational Resources Information Center
Adams, Judith A.
1988-01-01
Discussion of consequences of library automation argues that technology should be used to augment access to information. Online public access catalogs are considered in this context, along with several related issues such as system incompatibility, invasion of privacy, barriers to database access and manipulation, and user fees, which contribute…
Earth Resources Technology Satellite: US standard catalog No. U-12
NASA Technical Reports Server (NTRS)
1973-01-01
To provide dissemination of information regarding the availability of Earth Resources Technology Satellite (ERTS) imagery, a U.S. Standard Catalog is published on a monthly schedule. The catalogs identify imagery which has been processed and input to the data files during the preceding month. The U.S. Standard Catalog includes imagery covering the Continental United States, Alaska, and Hawaii. As a supplement to these catalogs, an inventory of ERTS imagery on 16 millimeter microfilm is available. The catalogs consist of four parts: (1) annotated maps which graphically depict the geographic areas covered by the imagery listed in the current catalog, (2) a computer-generated listing organized by observation identification number (ID) with pertinent information on each image, (3) a computer listing of observations organized by longitude and latitude, and (4) observations which have had changes made in their catalog information since the original entry in the data base.
Nuclear Fuel Cycle Options Catalog: FY16 Improvements and Additions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Laura L.; Barela, Amanda Crystal; Schetnan, Richard Reed
2016-08-31
The United States Department of Energy, Office of Nuclear Energy, Fuel Cycle Technology Program sponsors nuclear fuel cycle research and development. As part of its Fuel Cycle Options campaign, the DOE has established the Nuclear Fuel Cycle Options Catalog. The catalog is intended for use by the Fuel Cycle Technologies Program in planning its research and development activities and disseminating information regarding nuclear energy to interested parties. The purpose of this report is to document the improvements and additions that have been made to the Nuclear Fuel Cycle Options Catalog in the 2016 fiscal year.
Computer Output Microform Library Catalog: A Survey.
ERIC Educational Resources Information Center
Zink, Steven D.
This discussion of the use of computer output microform (COM) as a feasible alternative to the library card catalog includes a brief history of library catalogs and of microform technology since World War II. It is argued that COM catalogs are to be preferred to card catalogs, online catalogs accessed by terminals, and paper printouts. Advantages…
Cataloguing Costed and Restructured at Curtin University of Technology.
ERIC Educational Resources Information Center
Wade, Rona; Williamson, Vicki
1998-01-01
Outlines the results of a review of the cataloging operations at the library at Curtin University of Technology and the three other publicly funded universities in WAGUL (the Western Australian Group of University Librarians). Identifies cost savings to be achieved by re-engineering cataloging and related operations, and separates them from the…
ERIC Educational Resources Information Center
Solomon, Paul
1994-01-01
Examines elementary school students' use of an online public access catalog to investigate the interaction between children, technology, curriculum, instruction, and learning. Highlights include patterns of successes and breakdowns; search strategies; instructional approaches and childrens' interests; structure of interaction; search terms; and…
Enhancing Access to Information: Designing Catalogs for the 21st Century.
ERIC Educational Resources Information Center
Tyckoson, David A., Ed.
This book addresses the problem of when a library has limited catalog access, and explores various technological methods to expand the catalog beyond its traditional boundaries. Fourteen chapters describe catalog projects in individual libraries: (1) "Enhancing Access to Information: Building Catalogs for the Future" (David A. Tyckoson);…
Standards for Cataloging Nonprint Materials. Revised Edition.
ERIC Educational Resources Information Center
Quinly, William J.; And Others
Rules and procedures for cataloging non-print media are provided in this manual of the Association for Educational Communications and Technology. The first section on cataloging rules covers all elements which should appear on the catalog card. After some comments on entries, the arrangement of catalog elements, and style, the elements of the…
Non-conventional technologies for data collection in Brazilian dissertations and theses.
Salvador, Pétala Tuani Candido de Oliveira; Rodrigues, Cláudia Cristiane Filgueira Martins; de Lima, Kálya Yasmine Nunes; Alves, Kisna Yasmin Andrade; Santos, Viviane Euzébia Pereira
2015-01-01
To characterize non-conventional technologies used for data collection in dissertations and theses available in the Catalog of Theses and Dissertations (CEPEn) of the Brazilian Nursing Association (ABEn). This is a documentary research whose data were collected from the catalogs of theses and dissertations available on the ABEn website, Volumes XIX to XXI. The indicators collected were: academic level; educational institution; year; qualification of the author; setting; non-conventional technology used; type of technology; association with conventional techniques; methodological design; and methodological benefits and limitations. Of a total of 6346 studies, only 121 (1.91%) used non-conventional technologies for data collection, representing the final sample of the study. It is concluded that Brazilian nursing research still needs methodological innovations for data collection.
Technology: Trigger for Change in Reference Librarianship.
ERIC Educational Resources Information Center
Hallman, Clark N.
1990-01-01
Discussion of the influence of technological developments on social change focuses on the effects of information technology on academic reference librarianship. Highlights include reference skills; electronic resources; microcomputer technology; online catalogs; interaction and communication with users; the need to teach information skills; and…
The Online Catalog Revolution.
ERIC Educational Resources Information Center
Kilgour, Frederick G.
1984-01-01
A review of library technological development and card catalog innovations of the past century and a half precedes a discussion of online public access catalog development. Design requirements and purpose of the online catalog, access techniques and provisions, costs, and future integration are highlighted. Twenty-two references are listed. (EJS)
Google Books vs. BISON: Is the BISON Catalog Going the Way of Its Namesake?
ERIC Educational Resources Information Center
Ludwig, Mark J.; Wells, Margaret R.
2008-01-01
Just as the Internet is likely to be one of the most disruptive overall technologies of a man's lifetime, Google Books may become one of the most disruptive technologies for academic libraries. The immediate challenge is that Google Books' deeper indexing and more advanced relevancy ranking usually work better than those of local catalogs--and it…
Earth Resources Technology Satellite: Non-US standard catalog No. N-13
NASA Technical Reports Server (NTRS)
1973-01-01
To provide dissemination of information regarding the availability of Earth Resources Technology Satellite (ERTS) imagery, a Non-U.S. Standard Catalog is published on a monthly schedule. The catalogs identify imagery which has been processed and input to the data files during the preceding month. The Non-U.S. Standard Catalog includes imagery covering all areas except that of the United States, Hawaii, and Alaska. Imagery adjacent to the Continental U.S. and Alaska borders will normally appear in the U.S. Standard Catalog. As a supplement to these catalogs, an inventory of ERTS imagery on 16 millimeter microfilm is available. The catalogs consist of four parts: (1) annotated maps which graphically depict the geographic areas covered by the imagery listed in the current catalog, (2) a computer-generated listing organized by observation identification number (ID) with pertinent information for each image, (3) a computer listing of observations organized by longitude and latitude, and (4) observations which have had changes made in their catalog information since the original entry in the data base.
Information Technology: A Bibliography.
ERIC Educational Resources Information Center
Wright, William F.; Hawkins, Donald T.
1981-01-01
This selective annotated bibliography lists 86 references on the following topics: future technology for libraries, library automation, paperless information systems; computer conferencing and electronic mail, videotext systems, videodiscs, communications technology, networks, information retrieval, cataloging, microcomputers, and minicomputers.…
NASA SBIR product catalog, 1991
NASA Technical Reports Server (NTRS)
1991-01-01
This catalog is a partial list of products of NASA SBIR (Small Business Innovation Research) projects that have advanced to some degree into Phase 3. While most of the products evolved from work conducted during SBIR Phase 1 and 2, a few advanced to commercial status solely from Phase 1 activities. The catalog presents information provided to NASA by SBIR contractors who wished to have their products exhibited at Technology 2001, a NASA-sponsored technology transfer conference held in San Jose, California, on December 4, 5, and 6, 1991. The catalog presents the product information in the following technology areas: computer and communication systems; information processing and AI; robotics and automation; signal and image processing; microelectronics; electronic devices and equipment; microwave electronic devices; optical devices and lasers; advanced materials; materials processing; materials testing and NDE; materials instrumentation; aerodynamics and aircraft; fluid mechanics and measurement; heat transfer devices; refrigeration and cryogenics; energy conversion devices; oceanographic instruments; atmosphere monitoring devices; water management; life science instruments; and spacecraft electromechanical systems.
ERIC Educational Resources Information Center
Chiang, Ching-hsin
This thesis reports on the designer's plans and experiences in carrying out the design, development, implementation, and evaluation of a project, the purpose of which was to develop a training program that would enable foreign students at the New York Institute of Technology (NYIT) to use the Computer Output Microform Catalog (COMCAT) and to…
Biomass CHP Catalog of Technologies
This report reviews the technical and economic characterization of biomass resources, biomass preparation, energy conversion technologies, power production systems, and complete integrated CHP systems.
ERIC Educational Resources Information Center
International Federation of Library Associations, The Hague (Netherlands).
Papers on cataloging and national bibliography presented at the 1984 general conference of IFLA include: (1) "Pratiques et Problemes de Catalogage au Senegal" (Cataloging Practices and Problems in Senegal) (Marietou Diop Diongue, Senegal); (2) "The Consequences of New Technologies in Classification and Subject Cataloging in Third…
78 FR 31535 - Assistive Technology Alternative Financing Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-24
... DEPARTMENT OF EDUCATION Assistive Technology Alternative Financing Program AGENCY: Office of Special Education and Rehabilitative Services, Department of Education. ACTION: Notice. Catalog of Federal... developed for the Assistive Technology (AT) Alternative Financing Program (AFP) in fiscal year (FY) 2012 to...
Technology in the Public Library: Results from the 1992 PLDS Survey of Technology.
ERIC Educational Resources Information Center
Fidler, Linda M.; Johnson, Debra Wilcox
1994-01-01
Discusses and compares the incorporation of technology by larger public libraries in Canada and the United States. Technology mentioned includes online public access catalogs; remote and local online database searching; microcomputers and software for public use; and fax, voice mail, and Telecommunication Devices for the Deaf and Teletype writer…
16mm Film and Videotape Lectures and Demonstrations. 1976/1977 Catalog.
ERIC Educational Resources Information Center
Massachusetts Inst. of Tech., Cambridge. Center for Advanced Engineering Study.
The Massachusetts Institute of Technology provides a catalog of 16mm filmed and videotaped lectures and demonstrations. Each listing includes title, short description, length of presentation, catalog number, purchase and rental prices, and indications as to whether the item is film or videotape and black-and-white or color. The catalog is divided…
Information Technology and Disabilities, 1995.
ERIC Educational Resources Information Center
McNulty, Tom, Ed.
1995-01-01
Four issues of this newsletter on information technology and disabilities (ITD) contain the following articles: "Developing an Accessible Online Public Access Catalog at the Washington Talking Book and Braille Library" (Charles Hamilton); "Assistive Technology in the Science Laboratory: A Talking Laboratory Work Station for Visually Impaired…
Book Catalogs versus Card Catalogs
Pizer, Irwin H.
1965-01-01
The development of the library catalog in book form and its abandonment in favor of the card catalog are briefly traced. The advantages and disadvantages of both types of catalogs are enumerated, and several solutions which tried to combine the best features of both are discussed. The present trend back to the book catalog, made possible by recent advances in computer technology, is analyzed, advantages and disadvantages are compared, current examples are illustrated, and finally the computerized catalog is weighed against both the book and card catalog as to main features and practicality. PMID:14271116
First-Year Composition Teachers' Uses of New Media Technologies in the Composition Class
ERIC Educational Resources Information Center
Mina, Lilian W.
2014-01-01
As new media technologies emerge and evolve rapidly, the need to make informed decisions about using these technologies in teaching writing increases. This dissertation research study aimed at achieving multiple purposes. The first purpose was to catalog the new media technologies writing teachers use in teaching first-year composition classes.…
UCLA Working Group on Public Catalogs. Final Report.
ERIC Educational Resources Information Center
Aroeste, Jean; And Others
Though recent technological developments have promised solutions to the costly problems of expanding card catalogs and changing cataloging practices, at the University of California libraries the problems have been compounded by a desire to stay compatible with the changing conventions of the Library of Congress and by an increasingly stringent…
OLAP Cube Visualization of Hydrologic Data Catalogs
NASA Astrophysics Data System (ADS)
Zaslavsky, I.; Rodriguez, M.; Beran, B.; Valentine, D.; van Ingen, C.; Wallis, J. C.
2007-12-01
As part of the CUAHSI Hydrologic Information System project, we assemble comprehensive observations data catalogs that support CUAHSI data discovery services (WaterOneFlow services) and online mapping interfaces (e.g. the Data Access System for Hydrology, DASH). These catalogs describe several nation-wide data repositories that are important for hydrologists, including the USGS NWIS and EPA STORET data collections. The catalogs contain a wealth of information reflecting the entire history and geography of hydrologic observations in the US. Managing such catalogs requires high performance analysis and visualization technologies. OLAP (Online Analytical Processing) cubes, often called data cubes, are an approach to organizing and querying large multi-dimensional data collections. We have applied OLAP techniques, as implemented in Microsoft SQL Server 2005, to the analysis of the catalogs from several agencies. In this initial report, we focus on the OLAP technology as applied to catalogs, and preliminary results of the analysis. Specifically, we describe the challenges of generating OLAP cube dimensions, and of defining aggregations and views for data catalogs as opposed to the observations data themselves. The initial results concern hydrologic data availability from the observations data catalogs. The results reflect the geography and history of available data totals from the USGS NWIS and EPA STORET repositories, and the spatial and temporal dynamics of available measurements for several key nutrient-related parameters.
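The roll-up idea behind such a cube can be illustrated with a minimal sketch. The records, dimension names, and counts below are hypothetical stand-ins, not the actual CUAHSI catalog schema or NWIS/STORET totals; a real cube engine such as SQL Server Analysis Services pre-computes these aggregates far more efficiently.

```python
from collections import defaultdict
from itertools import product

# Hypothetical catalog records: (agency, parameter, year, site_count).
records = [
    ("USGS NWIS", "nitrate", 2005, 120),
    ("USGS NWIS", "nitrate", 2006, 135),
    ("USGS NWIS", "phosphorus", 2005, 80),
    ("EPA STORET", "nitrate", 2005, 60),
    ("EPA STORET", "phosphorus", 2006, 40),
]

def roll_up(records):
    """Pre-aggregate site counts over every subset of the three
    dimensions, as an OLAP cube does. '*' marks a collapsed axis."""
    cube = defaultdict(int)
    for agency, param, year, n in records:
        # Each record contributes to 2**3 = 8 aggregation cells.
        for a, p, y in product((agency, "*"), (param, "*"), (year, "*")):
            cube[(a, p, y)] += n
    return cube

cube = roll_up(records)
# Total nitrate sites across all agencies and years:
print(cube[("*", "nitrate", "*")])    # 315
# All USGS NWIS records in 2005, across parameters:
print(cube[("USGS NWIS", "*", 2005)])  # 200
```

Once the cube is materialized, any slice (fix one dimension) or roll-up (collapse one to "*") is a single dictionary lookup rather than a scan over the catalog.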
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-01
... Public Internet Site; Notice Inviting Applications for New Awards for Fiscal Year (FY) 2010 Catalog of... assistive technology public Internet site to improve awareness of and access to assistive technology (AT...: National Assistive Technology Public Internet Site. Under this priority, the Department will support an...
ERIC Educational Resources Information Center
Lynch, Clifford A.
1997-01-01
Union catalogs and distributed search systems are two ways users can locate materials in print and electronic formats. This article examines the advantages and limitations of both approaches and argues that they should be considered complementary rather than competitive. Discusses technologies creating linkage between catalogs and databases and…
ERIC Educational Resources Information Center
Tillin, Alma M.; Quinly, William J.
Standards established by the Association for Educational Communications and Technology (AECT) set forth basic cataloging rules that apply to all types of nonprint materials. Included are all elements needed to identify, describe, and retrieve an article. Cataloging rules are applied to 18 specific media formats including audiorecording, films,…
NGS Catalog: A Database of Next Generation Sequencing Studies in Humans
Xia, Junfeng; Wang, Qingguo; Jia, Peilin; Wang, Bing; Pao, William; Zhao, Zhongming
2015-01-01
Next generation sequencing (NGS) technologies have been rapidly applied in biomedical and biological research since their advent only a few years ago, and they are expected to advance at an unprecedented pace in the following years. To provide the research community with a comprehensive NGS resource, we have developed the database Next Generation Sequencing Catalog (NGS Catalog, http://bioinfo.mc.vanderbilt.edu/NGS/index.html), a continually updated database that collects, curates and manages available human NGS data obtained from published literature. NGS Catalog deposits publication information of NGS studies and their mutation characteristics (SNVs, small insertions/deletions, copy number variations, and structural variants), as well as mutated genes and gene fusions detected by NGS. Other functions include user data upload, NGS general analysis pipelines, and NGS software. NGS Catalog is particularly useful for investigators who are new to NGS but would like to take advantage of these powerful technologies for their own research. Finally, based on the data deposited in NGS Catalog, we summarize features and findings from whole exome sequencing, whole genome sequencing, and transcriptome sequencing studies for human diseases or traits. PMID:22517761
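The kind of per-study record the abstract describes can be sketched as follows. The field names, PMIDs, and values are illustrative assumptions, not the database's actual schema or contents:

```python
# Hypothetical in-memory view of NGS Catalog entries.
studies = [
    {"pmid": "000001", "platform": "whole exome",
     "disease": "lung cancer", "variants": ["SNV", "indel"]},
    {"pmid": "000002", "platform": "whole genome",
     "disease": "autism", "variants": ["SNV", "CNV", "SV"]},
    {"pmid": "000003", "platform": "transcriptome",
     "disease": "lung cancer", "variants": ["gene fusion"]},
]

def studies_reporting(variant_type, studies):
    """Return PMIDs of studies whose curated records include the
    given mutation class (SNV, indel, CNV, SV, gene fusion)."""
    return [s["pmid"] for s in studies if variant_type in s["variants"]]

print(studies_reporting("SNV", studies))  # ['000001', '000002']
```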
DOE New Technology: Sharing New Frontiers, April 1, 1993--September 30, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tamura, A.T.; Henline, D.M.
The purpose of DOE New Technology is to provide information on how to access specific technologies developed through research sponsored by DOE and performed by DOE laboratories or by DOE-contracted researchers. This document describes technologies identified as having potential for commercial applications in addition to a catalog of current patent applications and patents available for licensing from DOE and DOE contractors.
ERIC Educational Resources Information Center
Chen, Ching-chih
1986-01-01
This discussion of information technology and its impact on library operations and services emphasizes the development of microcomputer and laser optical disc technologies. Libraries' earlier responses to bibliographic utilities, online databases, and online public access catalogs are described, and future directions for library services are…
Business Unusual: How "Event-Awareness" May Breathe Life into the Catalog?
ERIC Educational Resources Information Center
Lagoze, Carl
This paper proposes changes in the use of the catalog and the model upon which it rests. The first section describes why these changes are necessary if the library is to transition effectively into the digital age, including: the disruptive context caused by technological change; the costs associated with the catalog; the changing nature of…
A Catalog of Products and Services to Enhance the Independence of the Elderly.
ERIC Educational Resources Information Center
Drexel Univ. Philadelphia, PA. The Inst. on Aging.
This catalog, for specialists in the Area Agencies on Aging system, outlines specialized products and services, currently available in the private marketplace, which may enhance the independence of the elderly. The body of the catalog is divided into nine sections according to technological areas; a brief description of each area and the types of…
ERIC Educational Resources Information Center
Kumaran, Maha; Geary, Joe
2011-01-01
Technology has transformed libraries. There are digital libraries, electronic collections, online databases and catalogs, ebooks, downloadable books, and much more. With free technology such as social websites, newspaper collections, downloadable online calendars, clocks and sticky notes, online scheduling, online document sharing, and online…
ERIC Educational Resources Information Center
Kilpatrick, Thomas L., Ed.
1998-01-01
Provides annotations of 29 journal articles and six book reviews on a variety of topics related to technology in libraries, including collection development, computer-assisted instruction, databases, distance education, ergonomics, hardware, information technology, interlibrary loan and document supply, Internet, online catalogs, preservation,…
ERIC Educational Resources Information Center
Henthorne, Eileen
1995-01-01
Describes a project at the Princeton University libraries that converted the pre-1981 public card catalog, using digital imaging and optical character recognition technology, to fully tagged and indexed records of text in MARC format that are available on an online database and will be added to the online catalog. (LRW)
ERIC Educational Resources Information Center
Lynch, Clifford A.
1989-01-01
Reviews the history of the network that supports the MELVYL online union catalog, describes current technological and policy issues, and discusses the role the network plays in integrating local automation, the union catalog, access to resource databases, and other initiatives. Sidebars by Mark Needleman discuss the TCP/IP protocol suite, internet…
ERIC Educational Resources Information Center
Lipetz, Ben-Ami; Paulson, Peter J.
To study the users and use of the subject searching capability offered by the new online public catalog at the New York State Library (NYSL), a study was begun before the catalog change in mid-1983 and continued for a considerable time after the change. Data were collected during three 1-week periods; one preceded the catalog change and two…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michelle R. Blacker
The Idaho National Laboratory (INL) is a Department of Energy (DOE) multi-program national laboratory that conducts research and development in all DOE mission areas. Like all other federal laboratories, INL has a statutory, technology transfer mission to make its capabilities and technologies available to all federal agencies, to state and local governments, and to universities and industry. To fulfill this mission, INL encourages its scientific, engineering, and technical staff to disclose new inventions and creations to ensure the resulting intellectual property is captured, protected, and made available to others who might benefit from it. As part of the mission, intellectual property is licensed to industrial partners for commercialization, creating jobs and delivering the benefits of federally funded technology to consumers. In other cases, unique capabilities are made available to other federal agencies or to regional small businesses to solve specific technical challenges. In other interactions, INL employees work cooperatively with researchers and other technical staff of our partners to further develop emerging technologies. This report is a catalog of selected INL technology transfer and commercialization transactions during this past year. The size and diversity of INL technical resources, coupled with the large number of relationships with other organizations, virtually ensures that a report of this nature will fail to capture all interactions. Recognizing this limitation, this report focuses on transactions that are specifically authorized by technology transfer legislation (and corresponding contractual provisions) or involve the transfer of legal rights to technology to other parties. This report was compiled from primary records, which were readily available to the INL's Office of Technology Transfer & Commercialization.
The accomplishments cataloged in the report, however, reflect the achievements and creativity of the highly skilled researchers, technicians, support staff, and operators of the INL workforce. Their achievements and recognized capabilities are what make the accomplishments cataloged here possible. Without them, none of these transactions would occur.
NASA Central Operation of Resources for Educators (CORE): Educational Materials Catalog
NASA Technical Reports Server (NTRS)
1999-01-01
This catalog contains order information for video cassettes with topics such as: aeronautics, earth science, weather, space exploration/satellites, life sciences, energy, living in space, manned spaceflight, social sciences, space art, space sciences, technology education and utilization, and mathematics/physics.
ERIC Educational Resources Information Center
Cleveland, Gary
The development of information technologies such as public access catalogs and online databases has greatly enhanced access to information. The lack of automation in the area of document delivery, however, has created a large disparity between the speed with which citations are found and the provision of primary documents. This imbalance can…
NASA Technical Reports Server (NTRS)
Djorgovski, S. George
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology in order to classify the detected objects objectively and uniformly, and to facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complete database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful, and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects, including STATPROG, a package for multivariate statistical analysis of small and moderate-size data sets. The package was tested extensively on a number of real scientific applications, and has produced real, published results.
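The classification step can be pictured with a toy decision tree. The features (concentration, ellipticity, area) and thresholds below are hypothetical; GID3* induces such trees automatically from labeled training objects rather than using hand-set splits like these.

```python
# Toy star/galaxy classifier in the spirit of SKICAT's decision-tree
# approach: walk a fixed tree of threshold tests on image features.

def classify(obj):
    """Two-level decision tree over hypothetical detection features."""
    if obj["concentration"] > 0.6:   # light strongly peaked at the core
        return "star"
    if obj["ellipticity"] > 0.3:     # extended, elongated source
        return "galaxy"
    # Diffuse, round sources: fall back on apparent size in pixels.
    return "galaxy" if obj["area"] > 50 else "star"

detections = [
    {"concentration": 0.8, "ellipticity": 0.1, "area": 12},
    {"concentration": 0.2, "ellipticity": 0.5, "area": 90},
    {"concentration": 0.4, "ellipticity": 0.1, "area": 30},
]
print([classify(d) for d in detections])  # ['star', 'galaxy', 'star']
```

Induced trees of this shape are cheap to evaluate per detection, which is what makes uniform classification of millions of survey objects tractable.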
On-line catalogs of solar energetic protons at SRTI-BAS
NASA Astrophysics Data System (ADS)
Miteva, R.; Danov, D.
2017-08-01
We outline the status of the on-line catalogs of solar energetic particles maintained by the Space Climate group at the Space Research and Technology Institute, Bulgarian Academy of Sciences (SRTI-BAS). In addition to the already compiled proton catalog from the Wind/EPACT instrument, in the current report we present preliminary results on the identification of high-energy SOHO/ERNE proton enhancements, as well as a comparative analysis with two other proton lists. Future plans for the on-line catalogs are briefly summarized.
My Favorite Things Electronically Speaking.
ERIC Educational Resources Information Center
Glantz, Shelley
1997-01-01
Presents the results of an informal user survey on favorite information technologies, including their best features. Discusses library online catalogs, electronic encyclopedias, CD-ROMs, laser discs, electronic magazine indexes, online services, the Internet, word processing programs, magazines as major sources of technology information,…
The Human Response to Library Automation.
ERIC Educational Resources Information Center
Kirkland, Janice, Ed.
1989-01-01
Eleven articles discuss the response of library users and personnel to automation. Topics covered include computerized reference services, online public access catalogs, paraprofessional staff perceptions of technology, organizational and managerial impacts of automation, workshops on new technology, gender differences in motivation to manage and…
What Lies Beyond the Online Catalog?
ERIC Educational Resources Information Center
Matthews, Joseph R.; And Others
1985-01-01
Five prominent consultants project technological advancements that, in some cases, will enhance current library systems, and in many cases will cause them to become obsolete. Major trends include advances in mainframe and microcomputing technology, development of inexpensive local area networks and telecommunications gateways, and the advent of…
Advanced Composition and the Computerized Library.
ERIC Educational Resources Information Center
Hult, Christine
1989-01-01
Discusses four kinds of computerized access tools: online catalogs; computerized reference; online database searching; and compact disk read-only memory (CD-ROM). Examines how these technologies are changing research. Suggests how research instruction in advanced writing courses can be refocused to include the new technologies. (RS)
Bibliographic Utilities and the Use of Microcomputers in Libraries: Current and Projected Practices.
ERIC Educational Resources Information Center
McAninch, Glen
1986-01-01
Bibliographic utilities are marketing specially designed hardware and software that permit libraries to patch together automated cataloging, acquisitions, serials control, reference, online public catalogs, circulation, interlibrary loan, and administrative functions. The increasing complexity of the technology is making it more difficult to…
ERIC Educational Resources Information Center
Chang, Min-min
1998-01-01
Discusses the Online Computer Library Center (OCLC) and the changing Asia Pacific library scene under the broad headings of the three phases of technology innovation. Highlights include WorldCat and the OCLC shared cataloging system; resource sharing and interlibrary loan; enriching OCLC online catalog with Asian collections; and future outlooks.…
The Quebec National Library on the Web.
ERIC Educational Resources Information Center
Kieran, Shirley; Sauve, Diane
1997-01-01
Provides an overview of the Quebec National Library (Bibliotheque Nationale du Quebec, or BNQ) Web site. Highlights include issues related to content, design, and technology; IRIS, the BNQ online public access catalog; development of the multimedia catalog; software; digitization of documents; links to bibliographic records; and future…
Directed Energy Weapon System for Ballistic Missile Defense
2009-02-15
Scientific Assessment of High-Power Free-Electron Laser Technology, "Introduction and Principle Findings," available at: http://www.nap.edu/catalog...will lead to thermal blooming and will reduce the energy of light reaching the target. Scientific Assessment of High-Power Free-Electron Laser Technology, pg
Role of Computers in Sci-Tech Libraries.
ERIC Educational Resources Information Center
Bichteler, Julie; And Others
1986-01-01
Articles in this theme issue discuss applications of microcomputers in science/technology libraries, a UNIX-based online catalog, online versus print sources, computer-based statistics, and the applicability and implications of the Matheson-Cooper Report on health science centers for science/technology libraries. A bibliography of new reference…
Advances in Projection Technology for On-Line Instruction.
ERIC Educational Resources Information Center
Davis, H. Scott; Miller, Marsha
This document consists of supplemental information designed to accompany a presentation on the application of projection technology, including video projectors and liquid crystal display (LCD) devices, in the online catalog library instruction program at the Indiana State University libraries. Following an introductory letter, the packet includes:…
ERIC Educational Resources Information Center
Seltz-Petrash, Ann, Ed.; Wolff, Kathryn, Ed.
Currently available American 16mm films in the areas of pure science, applied science and technology, and science and society are identified and listed. Included are films that are available from commercial, government, university, and industry producers. The first section of the catalog lists in Dewey Decimal order films intended for junior high…
A Feasibility Study on Data Distribution on Optical Media.
ERIC Educational Resources Information Center
Campbell (Bonnie) & Associates, Toronto (Ontario).
This feasibility study assesses the potential of optical technology in the development of accessible bibliographic and location data networks both in Canada and within the international MARC (Machine-Readable Cataloging) network. The study is divided into four parts: (1) a market survey of cataloging and interlibrary loan librarians to determine…
Cataloging as a Customer Service: Applying Knowledge to Technology Tools.
ERIC Educational Resources Information Center
Konovalov, Yuri
1999-01-01
Discusses the increase in significance and importance of cataloging and authority control in the online environment of libraries to help improve both precision and recall of searches. Highlights include the inadequacies of keyword searching; corporate library experiences; and library management with foreign library branches and foreign language…
A Kid-Built Classroom Library. Curriculum Boosters. Technology.
ERIC Educational Resources Information Center
Cooper, Laurie K.
1994-01-01
Elementary students can help turn the classroom book collection into a well-organized library using the classroom computer. In the process, they get practice in cooperative-learning groups as they work with both electronic and card catalogs. The article explains how to use computers to create an electronic catalog. (SM)
UWP 011: Popular Science and Technology Writing
ERIC Educational Resources Information Center
Perrault, Sarah
2012-01-01
UWP 011: Popular Science & Technology Writing is a sophomore-level course designed as an introduction to rhetoric of science at UC Davis, a science-focused land-grant university. The course fulfills the general education requirements for written literacy and for topical breadth in arts and humanities. The catalog describes the course as…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-03
...: Catalog of Federal Domestic Assistance Name and Number: Measurement and Engineering Research and Standards... engineering sciences and, as the lead Federal agency for technology transfer, it provides a strong interface... enables the Center for Nanoscale Science and Technology (CNST), Engineering Laboratory (EL), Information...
NASA Astrophysics Data System (ADS)
McDougall, C.; Peddicord, H.; Russell, E. L.; Hackathorn, E. J.; Pisut, D.; MacIntosh, E.
2016-12-01
NOAA's data visualization education and technology platforms, Science On a Sphere and NOAA View, are providing content for innovative and diverse educational platforms worldwide. Science On a Sphere (SOS) is a system composed of a large-scale spherical display and a curated data catalog. SOS displays are on exhibit in more than 140 locations in 26 countries and 29 US states that reach at least 35 million people every year. Additionally, the continuously updated data catalog, consisting of over 500 visualizations accompanied by descriptions, videos, and related content, is publicly available for download. This catalog is used by a wide variety of users, including planetariums, other spherical displays, and teachers. To further broaden the impact of SOS, SOS Explorer, a flat-screen version of SOS that can be used in schools and museums, includes over 100 of the SOS datasets and enables students and other users to dig into the data in ways that aren't possible with SOS. Another resource from NOAA, NOAA View, is an easy-to-use portal to NOAA's vast data archives, including historical datasets that go back to 1880 and models for ocean, atmosphere, land, cryosphere, climate, and weather. NOAA View provides hundreds of data variables within a single interface, allowing the user to browse, interrogate, and download resources from NOAA's archives. Through story maps, users can see how data can be used to understand our planet and improve our lives. Together, these provide invaluable resources to educators and technology pioneers. Both NOAA View and the SOS data catalog enable educators, students, and communicators to easily ingest complex and often stunning visualizations. The visualizations are available in formats that can be incorporated into a number of different display technologies to maximize their use. Although making the visualizations available to users is a technological hurdle, an equally large hurdle is making them understandable by viewers.
In this presentation we will discuss the challenges we've encountered in making these resources useable to educators and education institutions as well as the feedback we've received about the value of these resources.
RLMS Micro-File: Current State of Catalog Card Reproduction. Supplement 1.
ERIC Educational Resources Information Center
Nitecki, Joseph Z., Comp.
Nine papers on various aspects and methods of catalog card reproduction are included in this supplement. Many reports include cost analyses and comparisons. A lengthy paper describes the history and the present use of technology of the Library of Congress card production operations. Other reports cover offset press and computer output microfilm…
Distributing an Online Catalog on CD-ROM...The University of Illinois Experience.
ERIC Educational Resources Information Center
Watson, Paula D.; Golden, Gary A.
1987-01-01
Description of the planning of a project designed to test the feasibility of distributing a statewide union catalog database on optical disk discusses the relationship of the project's goals to those of statewide library development; dealing with vendors in a volatile, high technology industry; and plans for testing and evaluation. (EM)
After Losing Users in Catalogs, Libraries Find Better Search Software
ERIC Educational Resources Information Center
Parry, Marc
2010-01-01
Traditional online library catalogs do not tend to order search results by ranked relevance, and they can befuddle users with clunky interfaces. However, that's changing because of two technology trends. First, a growing number of universities are shelling out serious money for sophisticated software that makes exploring their collections more…
Dialogue with an OPAC: How Visionary Was Swanson in 1964?
ERIC Educational Resources Information Center
Su, Shiao-Feng
1994-01-01
Traces the development of online public access catalogs (OPACs) and compares what has occurred with a 1964 article that outlined recommendations for a future card catalog. Subject access is emphasized, including Library of Congress Subject Headings, expansion of OPACs, user-friendly interfaces, new technologies, and current visions of the future…
NASA Technical Reports Server (NTRS)
Djorgovski, S. G.
1994-01-01
We developed a package to process and analyze the data from the digital version of the Second Palomar Sky Survey. This system, called SKICAT, incorporates the latest in machine learning and expert systems software technology, in order to classify the detected objects objectively and uniformly, and facilitate handling of the enormous data sets from digital sky surveys and other sources. The system provides a powerful, integrated environment for the manipulation and scientific investigation of catalogs from virtually any source. It serves three principal functions: image catalog construction, catalog management, and catalog analysis. Through use of the GID3* Decision Tree artificial induction software, SKICAT automates the process of classifying objects within CCD and digitized plate images. To exploit these catalogs, the system also provides tools to merge them into a large, complex database which may be easily queried and modified when new data or better methods of calibrating or classifying become available. The most innovative feature of SKICAT is the facility it provides to experiment with and apply the latest in machine learning technology to the tasks of catalog construction and analysis. SKICAT provides a unique environment for implementing these tools for any number of future scientific purposes. Initial scientific verification and performance tests have been made using galaxy counts and measurements of galaxy clustering from small subsets of the survey data, and a search for very high redshift quasars. All of the tests were successful and produced new and interesting scientific results. Attachments to this report give detailed accounts of the technical aspects of the SKICAT system, and of some of the scientific results achieved to date. We also developed a user-friendly package for multivariate statistical analysis of small and moderate-size data sets, called STATPROG. 
The package was tested extensively on a number of real scientific applications and has produced real, published results.
ERIC Educational Resources Information Center
Jantz, Ronald
2001-01-01
Analyzes the implications of electronic book technology (e-books) on academic libraries. Discusses new business models for publishers, including self-publishing, Internet publishing, and partnerships with libraries as publishers; impact on library services, including cataloging, circulation, and digital preservation; user benefits; standards;…
Information Technology Planning: Computers in the School Library--How Many Are Enough?
ERIC Educational Resources Information Center
Simpson, Carol
2002-01-01
Describes the development of a formula to determine the needed quantity of computers for a school library. Four types of information technology activities (administrative, personal productive, class/group productive, online public access catalog) and several variables (age levels served, campus focus, number of staff, size of student body, average…
Handbook for Local Coordinators: Value-Added, Compact Disk, Union Catalog Test Phase.
ERIC Educational Resources Information Center
Townley, Charles
In 1988, the Associated College Libraries of Central Pennsylvania received a grant to create a value-added, compact disk, union catalog from the U.S. Department of Education's College Library Technology and Cooperative Grants Program, Title II of the Higher Education Act. Designed to contain, in time, 2,000,830 records from 17 member library…
The Online Public Access Catalogue at the Cite des Sciences Mediatheque in Paris.
ERIC Educational Resources Information Center
Witt, Maria
1990-01-01
Provides background on the holdings, services, and layout of the mediatheque (multimedia library) at the Cite des Sciences et de l'Industrie (originally the Museum of Science, Technology, and Industry) in Paris. The library's online public access catalog and use of the catalog by children and the visually handicapped are described. (four…
VizieR Online Data Catalog: ABCG209 spectroscopic and photometric catalog (Mercurio+, 2008)
NASA Astrophysics Data System (ADS)
Mercurio, A.; Barbera, F. L.; Haines, C. P.; Merluzzi, P.; Busarello, G.; Capaccioli, M.
2008-11-01
Spectroscopic observations were carried out at the ESO New Technology Telescope (NTT) with the ESO Multi-Mode Instrument (EMMI) and at the Telescopio Nazionale Galileo (TNG) with the Device Optimized for the LOw RESolution (DOLORES), while NIR photometric data were collected with the Son OF ISAAC (SOFI) at NTT. (3 data files).
Library of Congress Report Urges Technological Updates of Cataloging Strategies
ERIC Educational Resources Information Center
Foster, Andrea L.; Howard, Jennifer
2008-01-01
Libraries need to share records more with one another, make greater use of the Web, and bring more attention to their special collections, according to a report released last month by the Library of Congress. The new study examines how libraries can improve the distribution and use of their materials in a technology-centric environment. But "On…
ERIC Educational Resources Information Center
Hanifan, Thomas; Hoogheem, Cynthia L.
The Eastern Iowa Community College District (EICCD) libraries received a federal College Library Technology and Cooperation grant to provide and link public access catalogs at each college of the district--Clinton Community College, Muscatine Community College, and Scott Community College. That network is named Quad-LINC (Quad Cities Libraries in…
A Suggested Approach for Producing VAMS Air Transportation System Technology Roadmaps
NASA Technical Reports Server (NTRS)
Weathers, Del
2002-01-01
This viewgraph presentation provides an overview on the use of technology 'roadmaps' in order to facilitate the research development of VAMS (Virtual Airspace Modeling and Simulation). These roadmaps are to be produced by each concept team, updated annually, discussed at the technical interchange meetings (TIMs), shared among all VAMS participants, and made available electronically. These concept-specific technology roadmaps will be subsequently blended into an integrated catalog of roadmaps, technical discussions, and research recommendations. A historical example of ATM (Air Traffic Management) research and technology from 1940 to 1999 as shown in a series of 'roadmaps' is also included.
Manufacturing Methods and Technology Program Plan, CY 1984.
1984-09-01
Manufacturing Methods and Technology Program Plan, CY 1984. Manufacturing Technology Division, U.S. Army Industrial Base Engineering Activity, Rock Island, Illinois 61299-7260.
The new NHGRI-EBI Catalog of published genome-wide association studies (GWAS Catalog)
MacArthur, Jacqueline; Bowler, Emily; Cerezo, Maria; Gil, Laurent; Hall, Peggy; Hastings, Emma; Junkins, Heather; McMahon, Aoife; Milano, Annalisa; Morales, Joannella; Pendlington, Zoe May; Welter, Danielle; Burdett, Tony; Hindorff, Lucia; Flicek, Paul; Cunningham, Fiona; Parkinson, Helen
2017-01-01
The NHGRI-EBI GWAS Catalog has provided data from published genome-wide association studies since 2008. In 2015, the database was redesigned and relocated to EMBL-EBI. The new infrastructure includes a new graphical user interface (www.ebi.ac.uk/gwas/), ontology supported search functionality and an improved curation interface. These developments have improved the data release frequency by increasing automation of curation and providing scaling improvements. The range of available Catalog data has also been extended with structured ancestry and recruitment information added for all studies. The infrastructure improvements also support scaling for larger arrays, exome and sequencing studies, allowing the Catalog to adapt to the needs of evolving study design, genotyping technologies and user needs in the future. PMID:27899670
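The redesigned Catalog described above is programmatically accessible through a REST API. As a hedged sketch, the helper below builds a study-search URL; the base path and the `findByDiseaseTrait` endpoint are assumptions based on my recollection of the API documented at www.ebi.ac.uk/gwas/rest/api, so verify them against the current docs before relying on them.

```python
# Hedged sketch of building a query against the GWAS Catalog REST API.
# The base URL and search endpoint are assumptions; check the current
# documentation at www.ebi.ac.uk/gwas/rest/api before use.
from urllib.parse import quote

BASE = "https://www.ebi.ac.uk/gwas/rest/api"

def study_search_url(disease_trait):
    """URL for studies whose reported trait matches `disease_trait`."""
    return f"{BASE}/studies/search/findByDiseaseTrait?diseaseTrait={quote(disease_trait)}"

print(study_search_url("Type 2 diabetes"))
```

An actual fetch would wrap the URL in `urllib.request.urlopen` and parse the JSON response; that step is omitted here to keep the sketch offline.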
ARPA-E: The First Seven Years: A Sampling of Project Outcomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Ellen D.
2016-08-23
Since 2009, ARPA-E has funded over 500 potentially transformational energy technology projects. Many of these projects have already demonstrated early indicators of technical and commercial success. ARPA-E has begun the process of analyzing and cataloging some of the agency's most successful projects; this document is the first volume of that compilation of impactful technologies.
Navy Distributed Virtual Library Requirements Analysis.
1995-12-01
Josie McCrary asajam01@asnmail.asc.edu Issues In Science and Technology Librarianship ♦ Provides short, substantial articles on timely and important topics in science and technology librarianship, as well as conference and workshop reports and short correspondences. ♦ Send a message to acrlsts... electronic journal encompassing all aspects of academic audiovisual librarianship. Focus includes cataloging, reference, collection development
A Catalog of MIPSGAL Disk and Ring Sources
2010-04-01
California Institute of Technology, Pasadena, CA. We present a catalog of 416 extended, resolved, disk- and ring-like objects as... Satellite sources. Among the identified objects, those with central sources are mostly listed as emission-line stars, but with other source types including
NASA Technical Reports Server (NTRS)
Murphy, J. D.; Dideriksen, R. I.
1975-01-01
The application of remote sensing technology by the U.S. Department of Agriculture (USDA) is examined. The activities of the USDA Remote-Sensing User Requirement Task Force which include cataloging USDA requirements for earth resources data, determining those requirements that would return maximum benefits by using remote sensing technology and developing a plan for acquiring, processing, analyzing, and distributing data to satisfy those requirements are described. Emphasis is placed on the large area crop inventory experiment and its relationship to the task force.
ERIC Educational Resources Information Center
Smith, Linda C., Ed.; Gluck, Myke, Ed.
This document assembles conference papers which focus on how electronic technologies are creating new ways of meeting user needs for spatial and cartographic information. Contents include: (1) "Mapping Technology in Transition" (Mark Monmonier); (2) "Cataloging Planetospatial Data in Digital Form: Old Wine, New Bottles--New Wine,…
Technology Transfer Annual Report Fiscal Year 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skinner, Wendy Lee
Idaho National Laboratory (INL) is a Department of Energy (DOE) multi-program national laboratory that conducts research and development in all DOE mission areas. Like all other federal laboratories, INL has a statutory technology transfer mission to make its capabilities and technologies available to federal agencies, state and local governments, universities, and industry. To fulfill this mission, INL encourages its scientific, engineering, and technical staff to disclose new inventions and creations to ensure the resulting intellectual property is captured, protected, and available to others who might benefit from it. As part of the mission, intellectual property is licensed to industrial partners for commercialization, job creation, and delivering the benefits of federally funded technology to consumers. In some cases, unique capabilities are made available to other federal agencies, international organizations, domestic and foreign commercial entities, or small businesses to solve specific technical challenges. INL employees work cooperatively with researchers and technical staff from the university and industrial sectors to further development of emerging technologies. In this multinational global economy, INL is contributing to the development of the next generation of engineers and scientists by licensing software to educational institutions throughout the world. This report is a catalog of select INL technology transfer and commercialization transactions and research agreements that were executed during this past year. The size and diversity of INL technical resources, coupled with the large number of relationships with other organizations, virtually ensures that a report of this nature will fail to capture all interactions.
Recognizing this limitation, this report focuses on transactions that are specifically authorized by technology transfer legislation (and corresponding contractual provisions) or involve the transfer of legal rights to technology to other parties. This report was compiled from primary records, which were readily available to the INL’s Technology Deployment and Contracts Management Offices. Accomplishments cataloged in the report reflect the achievements and creativity of the researchers, technicians, support staff, and operators of the INL workforce.
NOAA's Data Catalog and the Federal Open Data Policy
NASA Astrophysics Data System (ADS)
Wengren, M. J.; de la Beaujardiere, J.
2014-12-01
The 2013 Open Data Policy Presidential Directive requires Federal agencies to create and maintain a 'public data listing' that includes all agency data that is currently or will be made publicly-available in the future. The directive requires the use of machine-readable and open formats that make use of 'common core' and extensible metadata formats according to the best practices published in an online repository called 'Project Open Data', to use open licenses where possible, and to adhere to existing metadata and other technology standards to promote interoperability. In order to meet the requirements of the Open Data Policy, the National Oceanic and Atmospheric Administration (NOAA) has implemented an online data catalog that combines metadata from all subsidiary NOAA metadata catalogs into a single master inventory. The NOAA Data Catalog is available to the public for search and discovery, providing access to the NOAA master data inventory through multiple means, including web-based text search, OGC CS-W endpoint, as well as a native Application Programming Interface (API) for programmatic query. It generates on a daily basis the Project Open Data JavaScript Object Notation (JSON) file required for compliance with the Presidential directive. The Data Catalog is based on the open source Comprehensive Knowledge Archive Network (CKAN) software and runs on the Amazon Federal GeoCloud. This presentation will cover topics including mappings of existing metadata in standard formats (FGDC-CSDGM and ISO 19115 XML ) to the Project Open Data JSON metadata schema, representation of metadata elements within the catalog, and compatible metadata sources used to feed the catalog to include Web Accessible Folder (WAF), Catalog Services for the Web (CS-W), and Esri ArcGIS.com. It will also discuss related open source technologies that can be used together to build a spatial data infrastructure compliant with the Open Data Policy.
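The Project Open Data JSON file the directive requires can be sketched as a minimal `data.json` catalog. The field names below follow the Project Open Data metadata schema (v1.1) as best I recall it, and the dataset entry itself is invented for illustration; check the schema at project-open-data.cio.gov before treating any field as authoritative.

```python
# Minimal sketch of a Project Open Data "data.json" public data listing.
# Field names follow the POD metadata schema v1.1 (an assumption to verify);
# the dataset entry is hypothetical, not a real NOAA record.
import json

catalog = {
    "conformsTo": "https://project-open-data.cio.gov/v1.1/schema",
    "dataset": [
        {
            "title": "Example Sea Surface Temperature Analysis",  # hypothetical
            "description": "Daily global SST fields (illustrative entry).",
            "keyword": ["ocean", "temperature"],
            "modified": "2014-01-01",
            "publisher": {"@type": "org:Organization", "name": "NOAA"},
            "contactPoint": {"fn": "Data Steward", "hasEmail": "mailto:data@example.gov"},
            "identifier": "gov.noaa.example:sst-analysis",  # hypothetical
            "accessLevel": "public",
        }
    ],
}

# Serialize exactly as an agency would publish it at /data.json.
print(json.dumps(catalog, indent=2)[:60])
```

A catalog like NOAA's would generate the `dataset` array daily by mapping its FGDC-CSDGM and ISO 19115 records onto these fields, which is the crosswalk the abstract describes.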
Technology Deployment Annual Report 2014 December
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arterburn, George K.
This report is a summary of key Technology Deployment activities and achievements for 2014, including intellectual property, granted copyrights, royalties, license agreements, CRADAs, WFOs and Technology-Based Economic Development. Idaho National Laboratory (INL) is a Department of Energy (DOE) multi-program national laboratory that conducts research and development in all DOE mission areas. Like all other federal laboratories, INL has a statutory technology transfer mission to make its capabilities and technologies available to all federal agencies, to state and local governments, and to universities and industry. To fulfill this mission, INL encourages its scientific, engineering, and technical staff to disclose new inventions and creations to ensure the resulting intellectual property is captured, protected, and made available to others who might benefit from it. As part of the mission, intellectual property is licensed to industrial partners for commercialization, creating jobs and delivering the benefits of federally funded technology to consumers. In other cases, unique capabilities are made available to other federal agencies or to regional small businesses to solve specific technical challenges. INL employees also work cooperatively with researchers and technical staff from the university and industrial sectors to further develop emerging technologies. In our multinational global economy, INL is contributing to the development of the next generation of engineers and scientists by licensing software to educational institutions throughout the world. This report is a catalog of selected INL technology transfer and commercialization transactions during this past year. The size and diversity of INL technical resources, coupled with the large number of relationships with other organizations, virtually ensures that a report of this nature will fail to capture all interactions.
Recognizing this limitation, this report focuses on transactions that are specifically authorized by technology transfer legislation (and corresponding contractual provisions) or involve the transfer of legal rights to technology to other parties. This report was compiled from primary records, which were readily available to the INL’s Office of Technology Deployment. However, the accomplishments cataloged in the report reflect the achievements and creativity of the researchers, technicians, support staff, and operators of the INL workforce.
DOE Office of Scientific and Technical Information (OSTI.GOV)
N /A
Idaho National Laboratory (INL) is a Department of Energy (DOE) multi-program national laboratory that conducts research and development in all DOE mission areas. Like all other federal laboratories, INL has a statutory technology transfer mission to make its capabilities and technologies available to all federal agencies, to state and local governments, and to universities and industry. To fulfill this mission, INL encourages its scientific, engineering, and technical staff to disclose new inventions and creations to ensure the resulting intellectual property is captured, protected, and made available to others who might benefit from it. As part of the mission, intellectual property is licensed to industrial partners for commercialization, creating jobs and delivering the benefits of federally funded technology to consumers. In other cases, unique capabilities are made available to other federal agencies or to regional small businesses to solve specific technical challenges. INL employees also work cooperatively with researchers and technical staff from the university and industrial sectors to further develop emerging technologies. In a multinational global economy, INL is contributing to the development of the next generation of engineers and scientists by licensing software to educational institutions throughout the world. This report is a catalog of selected INL technology transfer and commercialization transactions during this past year. The size and diversity of INL technical resources, coupled with the large number of relationships with other organizations, virtually ensures that a report of this nature will fail to capture all interactions. Recognizing this limitation, this report focuses on transactions that are specifically authorized by technology transfer legislation (and corresponding contractual provisions) or involve the transfer of legal rights to technology to other parties.
This report was compiled from primary records, which were readily available to INL's Office of Technology Deployment. However, the accomplishments cataloged in the report reflect the achievements and creativity of the researchers, technicians, support staff, and operators of the INL workforce.
Finding Atmospheric Composition (AC) Metadata
NASA Technical Reports Server (NTRS)
Strub, Richard F.; Falke, Stefan; Fialkowski, Ed; Kempler, Steve; Lynnes, Chris; Goussev, Oleg
2015-01-01
The Atmospheric Composition Portal (ACP) is an aggregator and curator of information related to remotely sensed atmospheric composition data and analysis. It uses existing tools and technologies and, where needed, enhances those capabilities to provide interoperable access, tools, and contextual guidance for scientists and value-adding organizations using remotely sensed atmospheric composition data. The initial focus is on Essential Climate Variables identified by the Global Climate Observing System: CH4, CO, CO2, NO2, O3, SO2, and aerosols. This poster addresses our efforts in building the ACP Data Table, an interface to help discover and understand remotely sensed data related to atmospheric composition science and applications. We harvested the GCMD, CWIC, and GEOSS metadata catalogs using machine-to-machine technologies (OpenSearch, Web Services). We also manually investigated the plethora of CEOS data provider portals and other catalogs where the data might be aggregated. This poster reports the strengths, the variety of approaches, and the challenges we encountered. Conclusions:
1. The significant benefit the major catalogs provide is their machine-to-machine tools, such as OpenSearch and Web Services, rather than any GUI usability improvements, given the large amount of data in each catalog.
2. There is a trend at the large catalogs toward simulating small data provider portals through advanced services.
3. Populating metadata catalogs using ISO 19115 is too complex for users to do consistently; the records are difficult to parse visually or with XML libraries, and too complex for Java XML binders such as Castor.
4. Searching for IDs first and then for data (GCMD and ECHO) works better for machine-to-machine operations than the timeouts experienced when returning the entire metadata entry at once.
5. Metadata harvest and export activities between the major catalogs have led to a significant amount of duplication (this is currently being addressed).
6. Most (if not all) Earth science atmospheric composition data providers store a reference to their data at GCMD.
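The machine-to-machine harvesting approach described in this record can be sketched minimally. The endpoint URL and parameter names below are hypothetical stand-ins following common OpenSearch conventions, not the actual GCMD, CWIC, or GEOSS interfaces, and the sample Atom feed is invented for illustration.

```python
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def build_opensearch_url(base, keyword, start=1, count=10):
    # Parameter names follow common OpenSearch conventions; a real
    # catalog endpoint may use different names (hypothetical example).
    return base + "?" + urlencode(
        {"q": keyword, "startIndex": start, "count": count})

def parse_entries(atom_xml):
    """Extract (id, title) pairs from an Atom feed returned by a catalog."""
    root = ET.fromstring(atom_xml)
    return [(e.findtext(ATOM + "id"), e.findtext(ATOM + "title"))
            for e in root.iter(ATOM + "entry")]

# Small inline sample standing in for a live catalog response.
sample = """<feed xmlns="http://www.w3.org/2005/Atom">
  <entry><id>C1</id><title>MOPITT CO</title></entry>
  <entry><id>C2</id><title>OMI NO2</title></entry>
</feed>"""
print(parse_entries(sample))  # [('C1', 'MOPITT CO'), ('C2', 'OMI NO2')]
```

A real harvester would page through results by advancing `startIndex`, which is the property that makes the ID-first search pattern noted in the conclusions attractive.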
Automated Atmospheric Composition Dataset Level Metadata Discovery. Difficulties and Surprises
NASA Astrophysics Data System (ADS)
Strub, R. F.; Falke, S. R.; Kempler, S.; Fialkowski, E.; Goussev, O.; Lynnes, C.
2015-12-01
This report provides an overview of how combined heat and power systems work and the key concepts of efficiency and power-to-heat ratios. It also provides information and performance characteristics of five commercially available CHP prime movers.
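The two key concepts the CHP report refers to reduce to simple ratios: total CHP efficiency is useful electric plus thermal output divided by fuel input, and the power-to-heat ratio is electric output divided by useful thermal output. A minimal sketch, with illustrative numbers that are not taken from the report:

```python
def chp_total_efficiency(electric_mw, thermal_mw, fuel_mw):
    """Total CHP efficiency (W + Q) / F, all in consistent units."""
    return (electric_mw + thermal_mw) / fuel_mw

def power_to_heat_ratio(electric_mw, thermal_mw):
    """Electric output divided by useful thermal output."""
    return electric_mw / thermal_mw

# Illustrative numbers for a notional reciprocating-engine CHP unit:
print(chp_total_efficiency(3.0, 4.0, 9.0))  # about 0.78, i.e. 78% total efficiency
print(power_to_heat_ratio(3.0, 4.0))        # 0.75
```

Prime movers differ mainly in where they fall on these two axes, which is why the report characterizes each of the five technologies by both numbers.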
Advanced Launch Technology Life Cycle Analysis Using the Architectural Comparison Tool (ACT)
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.
2015-01-01
Life cycle technology impact comparisons for nanolauncher technology concepts were performed using an Affordability Comparison Tool (ACT) prototype. Examined are the cost drivers and whether technology investments can dramatically affect the life cycle characteristics. Primary among the selected applications was the prospect of improving nanolauncher systems. As a result, findings and conclusions are documented for ways of creating more productive and affordable nanolauncher systems; e.g., an Express Lane-Flex Lane concept is put forward, and the beneficial effect of incorporating advanced integrated avionics is explored. Also, a Functional Systems Breakdown Structure (F-SBS) was developed to derive consistent definitions of the flight and ground systems for both system performance and life cycle analysis. Further, a comprehensive catalog of ground segment functions was created.
NASA Technical Reports Server (NTRS)
Gregory, J. W.
1975-01-01
Plans are formulated for chemical propulsion technology programs to meet the needs of advanced space transportation systems from 1980 to the year 2000. The many possible vehicle applications are reviewed and cataloged to isolate the common threads of primary propulsion technology that satisfy near-term requirements in the first decade and, at the same time, establish the technology groundwork for various potential far-term applications in the second decade. Thrust classes of primary propulsion engines that are apparent include: (1) 5,000 to 30,000 pounds thrust for upper stages and space maneuvering; and (2) large booster engines of over 250,000 pounds thrust. Major classes of propulsion systems and the important subdivisions of each class are identified. The relative importance of each class is discussed in terms of the number of potential applications, the likelihood of each application materializing, and the criticality of the technology needed. Specific technology programs are described and scheduled to fulfill the anticipated primary propulsion technology requirements.
Event Discrimination Using Seismoacoustic Catalog Probabilities
NASA Astrophysics Data System (ADS)
Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.
2017-12-01
Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data are compared against ground-truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.
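The catalog-membership probability this abstract describes can be estimated empirically from ground-truth labels: the probability that an event is a surface blast, given its presence in the fused seismoacoustic catalog, is the labeled-blast fraction of cataloged events. A minimal sketch with invented counts, not the actual catalog statistics:

```python
from collections import Counter

def p_blast_given_catalog(events):
    """events: iterable of (in_seismoacoustic_catalog, is_surface_blast) booleans.
    Returns the empirical P(surface blast | present in the fused catalog)."""
    counts = Counter(events)
    in_catalog = counts[(True, True)] + counts[(True, False)]
    return counts[(True, True)] / in_catalog if in_catalog else 0.0

# Invented ground-truth labels for illustration: 50 cataloged events,
# 40 of which are surface blasts, plus 50 events absent from the catalog.
toy = ([(True, True)] * 40 + [(True, False)] * 10 +
       [(False, True)] * 5 + [(False, False)] * 45)
print(p_blast_given_catalog(toy))  # 0.8
```

A probability well above the unconditional blast rate is what makes catalog membership useful as a discriminant between natural and anthropogenic sources.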
Coal gasification systems engineering and analysis, volume 2
NASA Technical Reports Server (NTRS)
1980-01-01
The major design-related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses was conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for the K-T, Texaco, and B&W designs. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high technology risk issues. The issues were identified and ranked in terms of importance and tractability, and a plan developed for obtaining data or developing technology required to mitigate the risk.
Highway performance monitoring system catalog : new technology and techniques
DOT National Transportation Integrated Search
1999-03-01
The Share the Road Campaign Research Study Final Report documents the independent study and review of the Federal Highway Administration (FHWA), Office of Motor Carrier and Highway Safety's (OMCHS), Share the Road program called the No-Zone Campaign....
The Computer Bulletin Board. Modified Gran Plots of Very Weak Acids on a Spreadsheet.
ERIC Educational Resources Information Center
Chau, F. T.; And Others
1990-01-01
Presented are two applications of computer technology to chemistry instruction: the use of a spreadsheet program to analyze acid-base titration curves and the use of database software to catalog stockroom inventories. (CW)
1984-12-31
Applications of Piezoelectric and Pyroelectric Thin Films: Opportunities for Langmuir-Blodgett Technology (interim technical report; the remainder of the scanned report documentation page is illegible).
Research Opportunities in Advanced Aerospace Concepts
NASA Technical Reports Server (NTRS)
Jones, Gregory S.; Bangert, Linda S.; Garber, Donald P.; Huebner, Lawrence D.; McKinley, Robert E.; Sutton, Kenneth; Swanson, Roy C., Jr.; Weinstein, Leonard
2000-01-01
This report is a review of a team effort that focuses on advanced aerospace concepts of the 21st century. The paper emphasizes advanced technologies rather than cataloging every unusual aircraft that has ever been attempted. To dispel the myth that "aerodynamics is a mature science," an extensive list of "what we cannot do, or do not know" was enumerated. A zeitgeist, a feeling for the spirit of the times, was developed based on existing research goals. Technological drivers and the constraints that might influence these technological developments in a future society were also examined. The present status of aeronautics, space exploration, and non-aerospace applications, both military and commercial, including enabling technologies, is discussed. A discussion of non-technological issues affecting advanced concepts research is presented. Using the study of advanced vehicles as a tool to uncover new directions for technology development often proves beneficial. An appendix is provided containing examples of advanced vehicle configurations currently of interest.
A Conspectus of Management Courses.
ERIC Educational Resources Information Center
British Inst. of Management, London (England).
This catalog of management courses lists only Diploma in Management Studies courses available in the British Isles at business schools, universities, colleges of technology, commerce, and further education, and at selected independent colleges, management consultants' centers, adult education colleges, professional institutions, and private organizations. The…
State of the States: Fuel Cells in America 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtin, Sandra; Gangi, Jennifer
This December 2015 report, the sixth in a series, provides a comprehensive analysis of state activities supporting fuel cell and hydrogen technology, profiles of leading states, and a catalog of recent installations, policies, funding, and deployments around the country.
Technology and Microcomputers for an Information Centre/Special Library.
ERIC Educational Resources Information Center
Daehn, Ralph M.
1984-01-01
Discusses use of microcomputer hardware and software, telecommunications methods, and advanced library methods to create a specialized information center's database of literature relating to farm machinery and food processing. Systems and services (electronic messaging, serials control, database creation, cataloging, collections, circulation,…
ERIC Educational Resources Information Center
Veterans Administration Medical Center, Washington, DC.
This dental training films catalog is organized into two sections. Section I is a category listing of the films by number and title, indexed according to generalized headings; categories are as follows: anatomy, articulator systems, complete dentures, dental assisting, dental laboratory technology, dental materials, dental office emergencies,…
DOT National Transportation Integrated Search
2014-05-09
The Broad Agency Announcement Alternative Aviation Fuels was a solicitation released by the U.S. Department of Transportation Research and Innovative Technology Administration (RITA) / John A. Volpe National Transportation Systems Center with funding...
Centers of Excellence: A Catalogue
NASA Technical Reports Server (NTRS)
Phelps, Paul B. (Compiler)
1988-01-01
This report summarizes information on State-sponsored 'Centers of Excellence' gathered during a survey of State programs in the Fall of 1987. For the purposes of this catalog, 'Centers of Excellence' refers to organizations or activities with the following characteristics: institutionalized, focused, cooperative Research and Development (R&D) programs; supported in part by State governments, in addition to universities, industry, and (in some cases) Federal agencies; performed by teams that may include both industry and university employees; and concentrated on relatively specific R&D agendas, usually with near-term commercial or governmental applicability. Most of these activities involve state-of-the-art advancement of new technologies under conditions leading to early practical applications. Not included in this catalog are project-level matching grant programs. The principal purpose of this catalog is to help NASA program management at all levels to identify and, where appropriate, to initiate relationships with other technology-developing organizations. These State-sponsored programs should be of particular interest because: they present an opportunity to leverage NASA's R&D investments; they are concentrated at the frontier, yet have a concern for practical applications; and they involve industrial participation under conditions that increase the probability of prompt, widespread dissemination in the form of new or enhanced commercial products, processes, or services.
Deep Space Wide Area Search Strategies
NASA Astrophysics Data System (ADS)
Capps, M.; McCafferty, J.
There is an urgent need to expand the space situational awareness (SSA) mission beyond catalog maintenance to providing near real-time indications and warnings of emerging events. While building and maintaining a catalog of space objects is essential to SSA, this does not address the threat of uncatalogued and uncorrelated deep space objects. The Air Force therefore has an interest in transformative technologies to scan the geostationary (GEO) belt for uncorrelated space objects. Traditional ground based electro-optical sensors are challenged in simultaneously detecting dim objects while covering large areas of the sky using current CCD technology. Time delayed integration (TDI) scanning has the potential to enable significantly larger coverage rates while maintaining sensitivity for detecting near-GEO objects. This paper investigates strategies of employing TDI sensing technology from a ground based electro-optical telescope, toward providing tactical indications and warnings of deep space threats. We present results of a notional wide area search TDI sensor that scans the GEO belt from three locations: Maui, New Mexico, and Diego Garcia. Deep space objects in the NASA 2030 debris catalog are propagated over multiple nights as an indicative data set to emulate notional uncatalogued near-GEO orbits which may be encountered by the TDI sensor. Multiple scan patterns are designed and simulated, to compare and contrast performance based on 1) efficiency in coverage, 2) number of objects detected, and 3) rate at which detections occur, to enable follow-up observations by other space surveillance network (SSN) sensors. A step-stare approach is also modeled using a dedicated, co-located sensor notionally similar to the Ground-Based Electro-Optical Deep Space Surveillance (GEODSS) tower. Equivalent sensitivities are assumed. This analysis quantifies the relative benefit of TDI scanning for the wide area search mission.
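The relative benefit of TDI scanning over step-stare that this analysis quantifies can be illustrated with a first-order coverage-rate model: a continuous TDI scan covers sky at roughly the field-of-view height times the scan rate, while a step-stare sensor covers one field of view per dwell-plus-settle cycle. The sensor parameters below are hypothetical, not those of the simulated systems:

```python
def tdi_coverage_rate(fov_height_deg, scan_rate_deg_per_s):
    """Continuous-scan coverage in square degrees per second."""
    return fov_height_deg * scan_rate_deg_per_s

def step_stare_coverage_rate(fov_deg2, dwell_s, step_settle_s):
    """Step-stare coverage: one field of view per (dwell + settle) cycle."""
    return fov_deg2 / (dwell_s + step_settle_s)

# Hypothetical sensor parameters for illustration only:
print(tdi_coverage_rate(2.0, 0.5))              # 1.0 deg^2/s
print(step_stare_coverage_rate(4.0, 5.0, 3.0))  # 0.5 deg^2/s
```

Real comparisons must also hold sensitivity fixed, as the paper does, since scanning faster shortens the effective integration time per object.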
Ríos, Pedro Rizo; Rivera, Aurora González; Oropeza, Itzel Rivas; Ramírez, Odette Campos
2014-12-01
One of the instruments Mexico has available for the optimization of resources specifically allocated to health technologies is the Health Care Formulary and Supply Catalog (Cuadro Básico y Catálogo de Insumos del Sector Salud [CBCISS]). The aim of the CBCISS is to collaborate in the optimization of public resources through the use of technologies (supplies) that have proven their safety, therapeutic efficacy, and efficiency. The importance of the CBCISS lies in the fact that all public institutions within the National Health System must use only the established technologies it contains. The implementation of strategies that strengthen the CBCISS update process allows it to be thought of as an essential regulatory tool for the introduction of health technologies, with relevant contributions to the proper selection of cost-effective interventions. It ensures that each supply included on the list meets the criteria necessary and sufficient to ensure efficacy, safety, effectiveness, and, of course, efficiency, as evidence supporting the selection of suitable technologies. The General Health Council (Consejo de Salubridad General [CSG]) is a collegial body of constitutional origin that, in accordance with its authority, prepares, updates, publishes, and distributes the CBCISS. To perform these activities, the CSG has the CBCISS Inter-institutional Commission. The CBCISS update is performed through the processes of inclusion, modification, and exclusion of supplies approved by the Interior Commission. The CBCISS update process consists of three stages: the first stage involves a test that leads to the acceptance or inadmissibility of the requests, and the other two focus on an in-depth evaluation for the ruling.
This article describes the experience of health technology assessment in Mexico, presents the achievements and outlines the improvements in the process of submission of new health technologies, and presents a preliminary analysis of the submissions evaluated until December 2012. During the analysis period, 394 submissions were received. After confirming compliance with the requirements, 59.9% of the submissions passed to the next stage of the process, technology assessment. In the third stage, the committee approved 44.9% of the submissions evaluated. The improvements established in the country in terms of health technology assessment allowed choosing the technologies that give more value for money in a context of public health institutions. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
The Shattered Stereotype: The Academic Library in Technological Transition.
ERIC Educational Resources Information Center
Foster, Constance L.
In academic libraries, neither technical services, public services, nor administration has escaped the impact of online information systems. Online catalogs, network systems, interlibrary lending, database searches, circulation control, automated technical processes, and an increasing number of non-book materials are part of a technological…
Evaluating School Library Information Services in the Digital Age.
ERIC Educational Resources Information Center
Everhart, Nancy
2000-01-01
Discusses criteria for evaluating school library information services. Highlights include types of services; physical facilities; library usage; circulation statistics; changes due to technology; fill rate, or the percentage of successful searches for library materials; OPAC (online public access catalog) reports; observation; and examining…
ERIC Educational Resources Information Center
Scharf, David
2002-01-01
Discusses XML (extensible markup language), particularly as it relates to libraries. Topics include organizing information; cataloging; metadata; similarities to HTML; organizations dealing with XML; making XML useful; a history of XML; the semantic Web; related technologies; XML at the Library of Congress; and its role in improving the…
Wyoming Carbon Capture and Storage Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nealon, Teresa
This report outlines the accomplishments of the Wyoming Carbon Capture and Storage (CCS) Technology Institute (WCTI), including creating a website and online course catalog, sponsoring technology transfer workshops, reaching out to interested parties via news briefs, and engaging in marketing activities, i.e., advertising and participating in trade shows. We conclude that the success of WCTI was hampered by the lack of a market. Because there were no supporting financial incentives to store carbon, the private sector had no reason to incur the extra expense of training their staff to implement carbon storage.
Terahertz technology and applications
NASA Technical Reports Server (NTRS)
Siegel, P.
2002-01-01
Despite great scientific interest since at least the 1920s, the THz frequency range remains one of the least tapped regions of the electromagnetic spectrum. Sandwiched between traditional microwave and optical technologies, and limited by a short atmospheric propagation path, THz systems have received little commercial emphasis. This has, perhaps fortunately, preserved some unique science and applications for tomorrow's technologies. For over 25 years the sole niche for THz technology has been in the high-resolution spectroscopy and remote sensing areas, where heterodyne and Fourier transform techniques have allowed astronomers, chemists, and Earth, planetary, and space scientists to measure, catalog, and map thermal emission lines for a wide variety of lightweight molecules. As it turns out, nowhere else in the electromagnetic spectrum do we receive so much information about these chemical species. In fact, the universe is bathed in THz energy, most of it going unnoticed and undetected.
Building a High-Tech Library in a Period of Austerity.
ERIC Educational Resources Information Center
Bazillion, Richard J.; Scott, Sue
1991-01-01
Describes the planning process for designing a new library for Algoma University College (Ontario). Topics discussed include the building committee, library policy, design considerations, an electric system that supports computer technology, library automation, the online public access catalog (OPAC), furnishings and interior environment, and…
The Role of the National Agricultural Library.
ERIC Educational Resources Information Center
Howard, Joseph H.
1989-01-01
Describes the role, users, collections and services of the National Agricultural Library. Some of the services discussed include a machine readable bibliographic database, an international interlibrary loan system, programs to develop library networks and cooperative cataloging, and the development and use of information technologies such as laser…
Interlibrary Lending with Computerized Union Catalogues.
ERIC Educational Resources Information Center
Lehmann, Klaus-Dieter
Interlibrary loans in the Federal Republic of Germany are facilitated by applying techniques of data processing and computer output microfilm (COM) to the union catalogs of the national library system. The German library system consists of two national libraries, four central specialized libraries of technology, medicine, agriculture, and…
The Electronic Librarian: Inching Towards the Revolution
ERIC Educational Resources Information Center
Cuesta, Emerita M.
2005-01-01
Electronic resources are transforming the way librarians work. New technological skills have been added to the librarian's tool kit. Some libraries have undertaken large-scale organizational reconfigurations to meet the challenges of the digital environment. Yet libraries still rely on traditional functions such as acquisitions, cataloging, and…
Pushing the Boundaries: Zines and Libraries.
ERIC Educational Resources Information Center
Dodge, Chris
1995-01-01
Describes zines (self-published magazines); suggests librarians should regard zines as historical sources of culture and include them in materials selection. Discusses the role of technology in publication and distribution; the audience; cataloging problems; free speech; and zines on the Internet. Sidebars provide information on selected zines and…
Australian University Libraries: Problems and Prospects for the 1980s.
ERIC Educational Resources Information Center
Steele, Colin
1982-01-01
Inadequate funding for Australian university libraries is criticized. Issues considered include interlibrary loans, staffing needs, the state of cataloging, networks, acquisitions, building and space problems, new technologies, mergers, and governmental priorities. Statistics for individual library holdings and expenditures, and trends in 1976-79…
Library Instruction and Online Database Searching.
ERIC Educational Resources Information Center
Mercado, Heidi
1999-01-01
Reviews changes in online database searching in academic libraries. Topics include librarians conducting all searches; the advent of end-user searching and the need for user instruction; compact disk technology; online public catalogs; the Internet; full text databases; electronic information literacy; user education and the remote library user;…
Readers in Search of Authors: The Changing Face of the Middleman.
ERIC Educational Resources Information Center
Potter, William
1986-01-01
Discusses new information technologies which offer libraries increased opportunities to provide innovative services: linking automated technical processing systems with public systems to provide more timely information; using the expanding services of subscription agencies as a channel for electronic publications delivery; online catalogs; and…
The Continuity Project, Fall 1997 Report.
ERIC Educational Resources Information Center
Wasilko, Peter J.
The Continuity Project is a research, development, and technology transfer initiative aimed at creating a "Library of the Future" by combining features of an online public access catalog (OPAC) and a campus wide information system (CWIS) with advanced facilities drawn from such areas as artificial intelligence (AI), knowledge…
High-performance technology for indexing of high volumes of Earth remote sensing data
NASA Astrophysics Data System (ADS)
Strotov, Valery V.; Taganov, Alexander I.; Kolesenkov, Aleksandr N.; Kostrov, Boris V.
2017-10-01
This paper proposes a technology for the search, indexing, cataloging, and distribution of aerospace images on the basis of a geoinformation approach combined with cluster and spectral analysis. It describes the information and algorithmic support of the system. A functional diagram of the system and the structure of the geographical database have been developed on the basis of geographical online-portal technology. Given the heterogeneity of information obtained from various sources, it is reasonable to apply a geoinformation platform that allows analyzing the spatial location of objects and territories and executing complex processing of information. The geoinformation platform is based on cartographic fundamentals with a uniform coordinate system, a geographical database, and a set of algorithms and program modules for various tasks. A technology is also suggested by which individual users and companies can add images, taken with professional and amateur devices and processed by various software tools, to the system's archive. Combined use of visual and instrumental approaches significantly expands the application area of Earth remote sensing data. Development and implementation of new algorithms based on the combined use of new methods for processing structured and unstructured data of high volume will increase the periodicity and rate of data updating. The paper shows that application of the original algorithms for search, indexing, and cataloging of aerospace images provides easy access to information spread across hundreds of suppliers and increases access rates to aerospace images by up to 5 times compared with current analogues.
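Search and cataloging of imagery by geographic location, as described above, typically rests on a spatial index. Below is a minimal sketch of a latitude/longitude tile index; the one-degree grid and the scheme itself are illustrative assumptions, not the paper's actual algorithms:

```python
from collections import defaultdict

def tile_key(lat, lon, cell_deg=1.0):
    """Snap a coordinate to its grid cell; one-degree cells are an assumption."""
    return (int(lat // cell_deg), int(lon // cell_deg))

class ImageIndex:
    """Maps grid cells to the IDs of images whose centers fall in them."""
    def __init__(self, cell_deg=1.0):
        self.cell_deg = cell_deg
        self.cells = defaultdict(list)

    def add(self, image_id, lat, lon):
        self.cells[tile_key(lat, lon, self.cell_deg)].append(image_id)

    def query(self, lat, lon):
        return self.cells.get(tile_key(lat, lon, self.cell_deg), [])

idx = ImageIndex()
idx.add("scene-001", 54.6, 39.7)  # hypothetical scene IDs and coordinates
idx.add("scene-002", 54.3, 39.9)
print(idx.query(54.9, 39.5))  # both scenes share the same one-degree cell
```

A production catalog would index image footprints rather than centers and span multiple cells per image, but the lookup structure is the same.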
NASA Astrophysics Data System (ADS)
Zhang, Chaoran; Van Sistine, Angela; Kaplan, David; Brady, Patrick; Cook, David O.; Kasliwal, Mansi
2018-01-01
A complete catalog of galaxies in the local universe is critical for efficient electromagnetic follow-up of gravitational wave (EMGW) events. The Census of the Local Universe (CLU; Cook et al. 2017, in preparation) aims to provide a galaxy catalog out to 200 Mpc that is as complete as possible. CLU has recently completed an Hα survey of ~3π of the sky with the goal of cataloging those galaxies that are likely hosts of EMGW events. Here, we present a tool we developed using machine learning to classify sources extracted from the Hα narrowband images within 200 Mpc. In this analysis, we find that we are able to recover more galaxies than with selections based on Hα colors alone.
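As a sketch of the kind of classifier such a tool might use, the following implements a minimal nearest-centroid classifier over two invented source features (narrowband excess and compactness). The features, labels, and training values are illustrative assumptions, not the CLU pipeline's actual feature set:

```python
import math

def centroid(rows):
    """Mean feature vector of a list of equal-length vectors."""
    n = len(rows)
    return tuple(sum(col) / n for col in zip(*rows))

def train(labeled):
    """labeled: dict mapping class label -> list of feature vectors."""
    return {label: centroid(rows) for label, rows in labeled.items()}

def classify(model, x):
    """Assign x to the class whose centroid is nearest in Euclidean distance."""
    return min(model, key=lambda label: math.dist(model[label], x))

# Invented two-feature training data (narrowband excess, compactness):
model = train({
    "galaxy":   [(0.8, 0.2), (0.7, 0.3), (0.9, 0.25)],
    "artifact": [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85)],
})
print(classify(model, (0.75, 0.3)))  # galaxy
```

A survey pipeline would use many more features and a stronger model, but the structure (extract features per source, train on labeled examples, score the rest) is the same.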
A catalogue of devices applicable to the measurement of boundary layers and wakes on flight vehicles
NASA Technical Reports Server (NTRS)
Miley, S. J.
1972-01-01
A literature search was conducted to assemble a catalog of devices and techniques which have possible application to boundary layer and wake measurements on flight vehicles. The indices used in the search were NACA, NASA STAR, IAA, USGRDR and Applied Science and Technology Index. The period covered was 1950 through 1970. The devices contained in the catalog were restricted to those that provided essentially direct measurement of velocities, pressures and shear stresses. Pertinent material was included in the catalog if it contained either an adequate description of a device and associated performance data or a presentation of applicable information on a particular measurement theory and/or technique. When available, illustrations showing the configuration of the device and test condition data were also included.
Online catalog access and distribution of remotely sensed information
NASA Astrophysics Data System (ADS)
Lutton, Stephen M.
1997-09-01
Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.
ERIC Educational Resources Information Center
Hilton-Chalfen, Danny
1992-01-01
Discussion of the potential for academic libraries to provide improved opportunities for patrons with disabilities focuses on access to online information, including online catalogs, campuswide information systems, CD-ROM products, graphical user interfaces, and electronic documents. Other considerations include location of online resources,…
The Librarian and the Library User: What the Future Holds.
ERIC Educational Resources Information Center
Electronic Library, 1997
1997-01-01
Discusses the role of information professionals in the future, based on a session at the Online Information 96 conference in London (England). Topics include equipment and software needs; technological advances; a trend toward distance education; how library users are adapting to change; interlibrary loans; online public access catalogs; and…
The Impact of Automation on Libraries. Final Report.
ERIC Educational Resources Information Center
Cline, Hugh F.; Sinnott, Loraine T.
This project examined a series of alternative policies for the management and funding of university libraries as they adopt and adapt to various information science technologies to accomplish the functions of acquisitions, cataloging, circulation, and reference services. Comparative case studies were completed at the University of Chicago,…
Benchmarking Academic Libraries: An Australian Case Study.
ERIC Educational Resources Information Center
Robertson, Margaret; Trahn, Isabella
1997-01-01
Discusses experiences and outcomes of benchmarking at the Queensland University of Technology (Australia) library that compared acquisitions, cataloging, document delivery, and research support services with those of the University of New South Wales. Highlights include results as a catalyst for change, and the use of common output and performance…
ERIC Educational Resources Information Center
Association of Research Libraries, Washington, DC. Office of Management Studies.
Designed to serve both as an activity report on Office of Management Studies (OMS) progress during 1987 and a catalog of OMS services and products, this annual report focuses on the management of technology in a scholarly environment. Programs and services are reported in five sections: (1) Applied Research and Development (the Institute on…
School Library Automation: Is It an Option?
ERIC Educational Resources Information Center
Clyde, L. Anne
2000-01-01
Discusses the value of school library automation, and suggests that an automated school library is the right of every child and teacher. Examines the role of the teacher librarian in the current environment; advantages of Online Public Access Catalogs (OPACs); and trends towards greater convergence in information technology applications in the…
VizieR Online Data Catalog: Transiting planet WASP-19b (Tregloan-Reed+, 2013)
NASA Astrophysics Data System (ADS)
Tregloan-Reed, J.; Southworth, J.; Tappert, C.
2018-05-01
Defocussed photometry for the transiting extrasolar planetary system WASP-19. The data were obtained in the Gunn r passband using the EFOSC CCD imager on the 3.6m New Technology Telescope at ESO La Silla. The observer was Claus Tappert. (1 data file).
The Continuity Project. Spring/Summer 1998 Report.
ERIC Educational Resources Information Center
Wasilko, Peter J.
The Continuity Project is a research, development, and technology transfer initiative aimed at creating a Library of the Future by combining features of an online public access catalog (OPAC) and a campuswide information system (CWIS) with advanced facilities drawn from such areas as artificial intelligence (AI), knowledge representation (KR),…
A Multi-Purpose Data Dissemination Infrastructure for the Marine-Earth Observations
NASA Astrophysics Data System (ADS)
Hanafusa, Y.; Saito, H.; Kayo, M.; Suzuki, H.
2015-12-01
To open the data from a variety of observations, the Japan Agency for Marine-Earth Science and Technology (JAMSTEC) has developed a multi-purpose data dissemination infrastructure. Although many observations have been made in the earth sciences, not all of the data are fully open. We think data centers could provide researchers with a universal data dissemination service that handles various kinds of observation data with little effort. For this purpose the JAMSTEC Data Management Office has developed the "Information Catalog Infrastructure System (Catalog System)". This is a catalog management system that can create, renew, and delete catalogs (= databases) and has the following features: - The Catalog System does not depend on data types or the granularity of data records. - By registering a new metadata schema to the system, a new database can be created on the same system without system modification. - As web pages are defined by cascading style sheets, databases can have a different look and feel, and operability. - The Catalog System provides databases with basic search tools: search by text, selection from a category tree, and selection from a timeline chart. - For domestic users it creates the Japanese and English pages at the same time and has a dictionary to control terminology and proper nouns. As of August 2015 JAMSTEC operates 7 databases on the Catalog System. We expect to transfer existing databases to this system, or create new databases on it. In comparison with a dedicated database developed for a specific dataset, the Catalog System is suitable for the dissemination of small datasets at minimum cost. Metadata held in the catalogs may be transferred to other metadata schemas for exchange with global databases or portals.
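The key feature claimed above is that registering a new metadata schema creates a new searchable database with no code changes. A minimal sketch of that idea follows; the class name, field names, and records are invented for illustration and are not JAMSTEC's actual interfaces.

```python
# Minimal sketch of a schema-driven metadata catalog in the spirit of the
# Information Catalog Infrastructure System: registering a schema creates
# a new searchable database without system modification. All names and
# records here are hypothetical.

class CatalogSystem:
    def __init__(self):
        self.databases = {}  # name -> {"schema": [...], "records": [...]}

    def register_schema(self, name, fields):
        """Create a new catalog (database) from a metadata schema."""
        self.databases[name] = {"schema": fields, "records": []}

    def add_record(self, name, record):
        """Store a record after checking it supplies every schema field."""
        db = self.databases[name]
        missing = [f for f in db["schema"] if f not in record]
        if missing:
            raise ValueError(f"missing fields: {missing}")
        db["records"].append(record)

    def search_text(self, name, query):
        """Basic full-text search across all field values."""
        q = query.lower()
        return [r for r in self.databases[name]["records"]
                if any(q in str(v).lower() for v in r.values())]

cat = CatalogSystem()
cat.register_schema("cruise_data", ["title", "category", "year"])
cat.add_record("cruise_data",
               {"title": "CTD profiles, NW Pacific", "category": "physical", "year": 2014})
cat.add_record("cruise_data",
               {"title": "Sediment cores", "category": "geology", "year": 2015})
print(len(cat.search_text("cruise_data", "pacific")))  # 1
```

The production system additionally offers category-tree and timeline search, bilingual pages, and per-database styling, which this sketch omits.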
Examples: JAMSTEC Data Catalog: http://www.godac.jamstec.go.jp/catalog/data_catalog/metadataList?lang=en ; JAMSTEC Document Catalog: http://www.godac.jamstec.go.jp/catalog/doc_catalog/metadataList?lang=en&tab=category ; Research Information and Data Access Site of TEAMS: http://www.i-teams.jp/catalog/rias/metadataList?lang=en&tab=list
Information sources in science and technology in Finland
NASA Technical Reports Server (NTRS)
Haarala, Arja-Riitta
1994-01-01
Finland poses some problems to be overcome in the field of scientific and technical information: a small user community which makes domestic systems costly; great distances within the country between users and suppliers of information; great distances to international data systems and large libraries abroad; and inadequate collections of scientific and technical information. The national bibliography Fennica includes all books and journals published in Finland. Data base services available in Finland include: reference data bases in science and technology; data banks for decision making such as statistical time series or legal proceedings; national bibliographies; and library catalogs.
2010-01-01
Background De novo assembly of transcript sequences produced by short-read DNA sequencing technologies offers a rapid approach to obtaining expressed gene catalogs for non-model organisms. A draft genome sequence will be produced in 2010 for a Eucalyptus tree species (E. grandis) representing the most important hardwood fibre crop in the world. Genome annotation of this valuable woody plant and genetic dissection of its superior growth and productivity will be greatly facilitated by the availability of a comprehensive collection of expressed gene sequences from multiple tissues and organs. Results We present an extensive expressed gene catalog for a commercially grown E. grandis × E. urophylla hybrid clone constructed using only Illumina mRNA-Seq technology and de novo assembly. A total of 18,894 transcript-derived contigs, a large proportion of which represent full-length protein-coding genes, were assembled and annotated. Analysis of assembly quality, length, and diversity shows that this dataset represents the most comprehensive expressed gene catalog for any Eucalyptus tree. mRNA-Seq analysis furthermore allowed digital expression profiling of all of the assembled transcripts across diverse xylogenic and non-xylogenic tissues, which is invaluable for ascribing putative gene functions. Conclusions De novo assembly of Illumina mRNA-Seq reads is an efficient approach for transcriptome sequencing and profiling in Eucalyptus and other non-model organisms. The transcriptome resource (Eucspresso, http://eucspresso.bi.up.ac.za/) generated by this study will be of value for genomic analysis of woody biomass production in Eucalyptus and for comparative genomic analysis of growth and development in woody and herbaceous plants. PMID:21122097
A 1000 Arab genome project to study the Emirati population.
Al-Ali, Mariam; Osman, Wael; Tay, Guan K; AlSafar, Habiba S
2018-04-01
Discoveries from the human genome, HapMap, and 1000 genome projects have collectively contributed toward the creation of a catalog of human genetic variations that has improved our understanding of human diversity. Despite the collegial nature of many of these genome study consortiums, which has led to the cataloging of genetic variations of different ethnic groups from around the world, genome data on the Arab population remains overwhelmingly underrepresented. The National Arab Genome project in the United Arab Emirates (UAE) aims to address this deficiency by using Next Generation Sequencing (NGS) technology to provide data to improve our understanding of the Arab genome and catalog variants that are unique to the Arab population of the UAE. The project was conceived to shed light on the similarities and differences between the Arab genome and those of the other ethnic groups.
Outsourcing Cataloging, Authority Work, and Physical Processing: A Checklist of Considerations.
ERIC Educational Resources Information Center
Kascus, Marie A., Ed.; Hale, Dawn, Ed.
Due to automation technology, financial restrictions, and resultant downsizing, library managers have increasingly relied on the services of contractors, rather than in-house staff, to accomplish different technical services operations. Contracted services may range from a small project for a selected group of materials to a large project for…
intelligentCAPTURE 1.0 Adds Tables of Content to Library Catalogues and Improves Retrieval.
ERIC Educational Resources Information Center
Hauer, Manfred; Simedy, Walton
2002-01-01
Describes an online library catalog that was developed for an Austrian scientific library that includes table of contents in addition to the standard bibliographic information in order to increase relevance for searchers. Discusses the technology involved, including OCR (Optical Character Recognition) and automatic indexing techniques; weighted…
TOC/DOC: "It Has Changed the Way I Do Science".
ERIC Educational Resources Information Center
Douglas, Kimberly; Roth, Dana L.
1997-01-01
Describes a user-based automated service developed at the California Institute of Technology that combines access to journal article databases with an in-house document delivery system. TOC/DOC (Tables of Contents/Document Delivery) has undergone a conceptual change from a catalog of locally-held journal articles to a broader, more retrospective…
Singingfish: Advancing the Art of Multimedia Search.
ERIC Educational Resources Information Center
Fritz, Mark
2003-01-01
Singingfish provides multimedia search services that enable Internet users to locate audio and video online. Over the last few years, the company has cataloged and indexed over 30 million streams and downloadable MP3s, with 150,000 to 250,000 more being added weekly. This article discusses a deal with Microsoft; the technology; improving the…
2007-03-01
features Federated Search (providing services to find and aggregate information across GIG enterprise data sources) and Enterprise Catalog (providing…). Associated documents: NCES Content Discovery Federated Search Portlet Users Guide v0.4.3 (M16, 25-Apr-05); NCES Mediation Core Enterprise Services SDK v0.5.0 (M17, 25-Apr-05)
The Method behind the Madness: Acquiring Online Journals and a Solution to Provide Access
ERIC Educational Resources Information Center
Skekel, Donna
2005-01-01
Libraries are seeking the best possible solution for integrating online journals into their collections. While exploring the different methods and technology available, many libraries still strive to fulfill the original "library mission" proposed by Charles Cutter in his "Rules for a Dictionary Catalog". Providing comprehensive access to…
The Golden Age of Reference Service: Is It Really Over?
ERIC Educational Resources Information Center
Rice, James
1986-01-01
Argues that reference services will not only survive changes brought about by new technologies, but will be improved and enhanced as a result. Examples given include online public access catalogs, automated record-keeping operations, CD-ROM as an information storage medium, the continuing need for intermediaries in online searching, and copyright…
Reusable Software Usability Specifications for mHealth Applications.
Cruz Zapata, Belén; Fernández-Alemán, José Luis; Toval, Ambrosio; Idri, Ali
2018-01-25
One of the key factors for the adoption of mobile technologies, and in particular of mobile health applications, is usability. A usable application will be easier to use and understand, and will improve users' interaction with it. This paper proposes a software requirements catalog for usable mobile health applications, which can be used for the development of new applications or the evaluation of existing ones. The catalog is based on the main sources identified in the literature on usability and mobile health applications. Our catalog was organized according to the ISO/IEC/IEEE 29148:2011 standard and follows the SIREN methodology for creating reusable catalogs. The applicability of the catalog was verified by the creation of an audit method, which was used to evaluate a real app, S Health, created by Samsung Electronics Co. The usability requirements catalog, along with the audit method, identified several usability flaws in the evaluated app, which scored 83%. Some flaws were related to the navigation pattern; further issues concerned the startup experience, empty screens, and writing style. The way a user navigates through an application improves or deteriorates the user's experience with it. We proposed a reusable usability catalog and an audit method, used them to evaluate a mobile health application, and produced an audit report of the usability issues identified.
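The audit method above reports a percentage score (83% for S Health). A plausible way to compute such a score is the fraction of applicable catalog requirements that the app satisfies; the sketch below assumes that scoring rule, and the requirement names and verdicts are invented.

```python
# Hypothetical audit scoring against a reusable usability requirements
# catalog: the score is the percentage of applicable requirements met.
# The scoring rule and all requirement ids below are assumptions, not
# the paper's published method.

def audit_score(results):
    """results maps requirement id -> True (met), False (not met),
    or None (not applicable to this app)."""
    applicable = [v for v in results.values() if v is not None]
    if not applicable:
        return 0.0
    return 100.0 * sum(applicable) / len(applicable)

results = {
    "USA-01 consistent navigation":    True,
    "USA-02 meaningful empty screens": False,
    "USA-03 plain writing style":      True,
    "USA-04 guided startup":           True,
    "USA-05 offline mode":             None,  # not applicable
}
print(round(audit_score(results)))  # 75
```

Keeping not-applicable items out of the denominator is what makes the same catalog reusable across apps with different feature sets.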
NASA Astrophysics Data System (ADS)
Tronconi, C.; Forneris, V.; Santoleri, R.
2009-04-01
CNR-ISAC-GOS is responsible for the Mediterranean Sea satellite operational system in the framework of the MOON Partnership. This Observing System acquires satellite data and produces Near Real Time, Delayed Time, and Re-analysis Ocean Colour and Sea Surface Temperature products covering the Mediterranean and Black Seas and regional basins. In the framework of several projects (MERSEA, PRIMI, Adricosm Star, SeaDataNet, MyOcean, ECOOP), GOS is producing climatological/satellite datasets based on optimal interpolation and a specific regional algorithm for chlorophyll, updated in Near Real Time and in delayed mode. GOS has built: • an informatic infrastructure for data repository and delivery based on THREDDS technology. The datasets are generated in NETCDF format, compliant with both the CF convention and the international satellite-oceanographic specification, as prescribed by GHRSST (for SST). All data produced are made available to users through a THREDDS server catalog. • A LAS, installed in order to exploit the potential of NETCDF data and the OPENDAP URL; it provides flexible access to geo-referenced scientific data. • A Grid environment based on Globus Technologies (GT4) connecting more than one institute; in particular, exploiting the CNR and ESA clusters makes it possible to reprocess 12 years of chlorophyll data in less than one month (estimated processing time on a single-core PC: 9 months). In the poster we will give an overview of: • the features of the THREDDS catalogs, pointing out the powerful characteristics of this new middleware that has replaced the "old" OPENDAP server; • the importance of adopting a common format (such as NETCDF) for data exchange; • the tools (e.g. LAS) connected with the THREDDS and NETCDF formats; • the Grid infrastructure at ISAC. We will also present specific basin-scale High Resolution products and Ultra High Resolution regional/coastal products available on these catalogs.
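A THREDDS server advertises its datasets through an XML catalog document that clients walk to discover data and build access URLs. The sketch below parses a hand-written miniature catalog; the dataset names and paths are invented, and real GOS catalogs also carry service definitions, metadata, and nested datasets.

```python
# Sketch of walking a THREDDS catalog document to list its datasets.
# The XML here is a hypothetical miniature, not the GOS catalog itself.
import xml.etree.ElementTree as ET

CATALOG_XML = """\
<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
         name="Example SST catalog">
  <dataset name="Mediterranean SST L4, daily" urlPath="sst/med/L4/daily.nc"/>
  <dataset name="Black Sea chlorophyll, daily" urlPath="chl/blk/daily.nc"/>
</catalog>
"""

# THREDDS InvCatalog 1.0 namespace, needed for element lookups.
NS = {"t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"}

root = ET.fromstring(CATALOG_XML)
paths = [d.get("urlPath") for d in root.findall(".//t:dataset", NS)]
print(paths)  # ['sst/med/L4/daily.nc', 'chl/blk/daily.nc']
```

In practice each `urlPath` is joined to a service base (e.g. an OPeNDAP endpoint declared elsewhere in the catalog) to form the URL a client such as LAS actually opens.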
Mission to the Solar System: Exploration and Discovery. A Mission and Technology Roadmap
NASA Technical Reports Server (NTRS)
Gulkis, S. (Editor); Stetson, D. S. (Editor); Stofan, E. R. (Editor)
1998-01-01
Solar System exploration addresses some of humanity's most fundamental questions: How and when did life form on Earth? Does life exist elsewhere in the Solar System or in the Universe? - How did the Solar System form and evolve in time? - What can the other planets teach us about the Earth? This document describes a Mission and Technology Roadmap for addressing these and other fundamental Solar System questions. A Roadmap Development Team of scientists, engineers, educators, and technologists worked to define the next evolutionary steps in in situ exploration, sample return, and completion of the overall Solar System survey. Guidelines were to "develop a visionary, but affordable, mission and technology development Roadmap for the exploration of the Solar System in the 2000 to 2012 timeframe." The Roadmap provides a catalog of potential flight missions. (Supporting research and technology, ground-based observations, and laboratory research, which are no less important than flight missions, are not included in this Roadmap.)
Designing a Multi-Petabyte Database for LSST
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becla, Jacek; Hanushevsky, Andrew; Nikolaev, Sergei
2007-01-10
The 3.2 giga-pixel LSST camera will produce approximately half a petabyte of archive images every month. These data need to be reduced in under a minute to produce real-time transient alerts, and then added to the cumulative catalog for further analysis. The catalog is expected to grow by about three hundred terabytes per year. The data volume, the real-time transient alerting requirements of the LSST, and its spatio-temporal aspects require innovative techniques to build an efficient data access system at reasonable cost. As currently envisioned, the system will rely on a database for catalogs and metadata. Several database systems are being evaluated to understand how they perform at these data rates, data volumes, and access patterns. This paper describes the LSST requirements, the challenges they impose, the data access philosophy, results to date from evaluating available database technologies against LSST requirements, and the proposed database architecture to meet the data challenges.
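One ingredient of efficient spatio-temporal catalog access is spatial partitioning: bucketing sources into sky cells so a region query scans only the cells it covers rather than the whole catalog. The toy grid index below illustrates the idea only; LSST's actual partitioning scheme is far more sophisticated, and the cell size and sources here are invented.

```python
# Toy spatial index for a source catalog: sources are bucketed into
# 1-degree RA/Dec cells, and a box query visits only the covering cells.
# Cell size, source ids, and coordinates are illustrative assumptions.
from collections import defaultdict

CELL = 1.0  # cell size in degrees

def cell_of(ra, dec):
    """Integer cell coordinates containing (ra, dec)."""
    return (int(ra // CELL), int(dec // CELL))

index = defaultdict(list)

def insert(source_id, ra, dec):
    index[cell_of(ra, dec)].append((source_id, ra, dec))

def query_box(ra_min, ra_max, dec_min, dec_max):
    """Return sources in a small RA/Dec box, scanning only covering cells."""
    hits = []
    for cx in range(int(ra_min // CELL), int(ra_max // CELL) + 1):
        for cy in range(int(dec_min // CELL), int(dec_max // CELL) + 1):
            for sid, ra, dec in index.get((cx, cy), ()):
                if ra_min <= ra <= ra_max and dec_min <= dec <= dec_max:
                    hits.append(sid)
    return hits

insert("src-1", 10.2, -5.4)
insert("src-2", 10.8, -5.1)
insert("src-3", 200.0, 40.0)
print(query_box(10.0, 11.0, -6.0, -5.0))  # ['src-1', 'src-2']
```

A production system would also handle RA wrap-around at 0/360 degrees and shard cells across database nodes, which this sketch ignores.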
ERIC Educational Resources Information Center
Coursen, David
The term "media," as employed here, refers to printed and audiovisual forms of communication and their accompanying technology. A representative list of printed materials might include books, periodicals, catalogs, and printed programmed materials. Audiovisual materials include films and filmstrips, recordings, slides, graphic materials,…
Taking It to the Streets: Mobile CD-ROM Workshops on Campus.
ERIC Educational Resources Information Center
Parker-Gibson, Necia
1994-01-01
Librarians at the University of Arkansas (Fayetteville) offer CD-ROM database and online catalog training using a portable unit composed of a laptop computer, modem, LCD panel, and overhead projector. This unit allows librarians to make presentations to large groups and to demonstrate to faculty and students the relevancy of library technology.…
Riding the Crest of the E-Commerce Wave: Transforming MIT's Campus Computer Resale Operation.
ERIC Educational Resources Information Center
Hallisey, Joanne
1998-01-01
Reengineering efforts, vendor consolidation, and rising costs prompted the Massachusetts Institute of Technology to convert its computer resale store to an online catalog that allows students, faculty, and staff to purchase equipment and software through a World Wide Web interface. The transition has been greeted with a mixed reaction. The next…
The Key to the Future of the Library Catalog is Openness
ERIC Educational Resources Information Center
Westrum, Anne-Lena
2011-01-01
Technology makes it possible to redefine libraries and make them relevant to the public once again. But how good are the digital services offered by public libraries today? The digital services department team of the Pode project at Norway's Oslo Public Library has spent the last 2 years investigating the possibilities available in order to…
Information Technology and the Evolution of the Library
2009-03-01
Resource Commons/Repository/Federated Search; ILS (GLADIS/Pathfinder - Millennium)/Catalog/Circulation/Acquisitions/Digital Object Content… content management services to help centralize and distribute digital content from across the institution, software to allow for seamless federated searching across multiple databases, and imaging software to allow for daily reimaging of terminals to reduce security concerns that otherwise
Building an Ajax Application from Scratch
ERIC Educational Resources Information Center
Clark, Jason A.
2006-01-01
The author of this article suggests that to refresh Web pages and online library catalogs in a more pleasing way, Ajax, an acronym for Asynchronous JavaScript and XML, should be used. Ajax is a way of using Web technologies together to refresh sections of Web pages, allowing almost instant responses to user input. This article describes…
ERIC Educational Resources Information Center
Anderson, Mary Alice, Ed.
This notebook is a compilation of 53 lesson plans for grades 6-12, written by various authors and focusing on the integration of technology into the curriculum. Lesson plans include topics such as online catalog searching, electronic encyclopedias, CD-ROM databases, exploring the Internet, creating a computer slide show, desktop publishing, and…
ERIC Educational Resources Information Center
Webb, Paula L.; Nero, Muriel D.
2009-01-01
In today's world of instant everything, everyone has been exposed to some form of Web 2.0 technology, and higher education is not exempt from its long reach. Libraries of all types are incorporating Web 2.0 features to attract users as well as to showcase library services. The Online Public Access Catalog (OPAC) has become more user-friendly with…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Laura L.; Barela, Amanda Crystal; Walkow, Walter M.
An Evaluation and Screening team supporting the Fuel Cycle Technologies Program Office of the United States Department of Energy, Office of Nuclear Energy is conducting an evaluation and screening of a comprehensive set of fuel cycle options. These options have been assigned to one of 40 evaluation groups, each of which has a representative fuel cycle option [Todosow 2013]. A Fuel Cycle Data Package System Datasheet has been prepared for each representative fuel cycle option to ensure that the technical information used in the evaluation is high-quality and traceable [Kim, et al., 2013]. The information contained in the Fuel Cycle Data Packages has been entered into the Nuclear Fuel Cycle Options Catalog at Sandia National Laboratories so that it is accessible by the evaluation and screening team and other interested parties. In addition, an independent team at Savannah River National Laboratory has verified that the information has been entered into the catalog correctly. This report documents that the 40 representative fuel cycle options have been entered into the Catalog, and that the data entered into the catalog for the 40 representative options has been entered correctly.
ERIC Educational Resources Information Center
Smithsonian Institution, Washington, DC. Science Information Exchange.
Described are 2,589 research projects under the general headings of: Properties of Water, Water Motion, Meteorology, Survey and Prediction, Living Systems (non-human), Public Health and Safety, Marine Geology, Engineering and Technology, Costal Zone Management and Use, Legal Studies, Education and Training, and Facilities. Each description…
ERIC Educational Resources Information Center
Mulrooney, Timothy J.
2009-01-01
A Geographic Information System (GIS) serves as the tangible and intangible means by which spatially related phenomena can be created, analyzed and rendered. GIS metadata serves as the formal framework to catalog information about a GIS data set. Metadata is independent of the encoded spatial and attribute information. GIS metadata is a subset of…
Plug Your Users into Library Resources with OpenSearch Plug-Ins
ERIC Educational Resources Information Center
Baker, Nicholas C.
2007-01-01
To bring the library catalog and other online resources right into users' workspace quickly and easily without needing much more than a short XML file, the author, a reference and Web services librarian at Williams College, learned to build and use OpenSearch plug-ins. OpenSearch is a set of simple technologies and standards that allows the…
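An OpenSearch plug-in of the kind described above is built around a small XML description document that tells the browser how to send queries to a search target. The sketch below generates a minimal OpenSearch 1.1 description; the catalog URL and names are hypothetical.

```python
# Generate a minimal OpenSearch 1.1 description document of the kind a
# browser search plug-in is built from. The catalog URL, ShortName, and
# Description are invented for illustration.
import xml.etree.ElementTree as ET

OSNS = "http://a9.com/-/spec/opensearch/1.1/"
ET.register_namespace("", OSNS)  # emit OSNS as the default namespace

root = ET.Element(f"{{{OSNS}}}OpenSearchDescription")
ET.SubElement(root, f"{{{OSNS}}}ShortName").text = "Library Catalog"
ET.SubElement(root, f"{{{OSNS}}}Description").text = "Search the library catalog"

# The Url template: {searchTerms} is replaced by the user's query.
url = ET.SubElement(root, f"{{{OSNS}}}Url")
url.set("type", "text/html")
url.set("template", "https://catalog.example.edu/search?q={searchTerms}")

xml_doc = ET.tostring(root, encoding="unicode")
print("{searchTerms}" in xml_doc)  # True
```

Serving this file and referencing it from a page's `<link rel="search">` tag is what lets the browser offer the catalog as a search target, which matches the article's point that little more than a short XML file is needed.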
VizieR Online Data Catalog: PSR J1023+0038 & XSS J12270-4859 VRi polarimetry (Baglio+, 2016)
NASA Astrophysics Data System (ADS)
Baglio, M. C.; D'Avanzo, P.; Campana, S.; Coti Zelati, F.; Covino, S.; Russell, D. M.
2016-05-01
The systems PSR J1023+0038 and XSS J12270-4859 were observed in quiescence on 8 and 9 February 2015 (respectively) with the ESO New Technology Telescope (NTT) located at La Silla (Chile), equipped with the EFOSC2 camera in polarimetric mode, using the B, V, R, i filters. (6 data files).
Code of Federal Regulations, 2013 CFR
2013-01-01
... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...
Code of Federal Regulations, 2014 CFR
2014-01-01
... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...
Code of Federal Regulations, 2012 CFR
2012-01-01
... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...
Code of Federal Regulations, 2011 CFR
2011-01-01
... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...
NASA Technical Reports Server (NTRS)
Dulchavsky, Scott A.; Sargsyan, A.E.
2009-01-01
This slide presentation reviews the use of ultrasound as a diagnostic tool in microgravity environments. The goals of research into ultrasound use in space environments are to: (1) determine the accuracy of ultrasound in novel clinical conditions; (2) determine optimal training methodologies; (3) determine microgravity-associated changes; and (4) develop an intuitive ultrasound catalog to enhance autonomous medical care. Uses of ultrasound technology in terrestrial applications are also reviewed.
A catalog of NASA special publications
NASA Technical Reports Server (NTRS)
1981-01-01
A list of all of the special publications released by NASA is presented. The list includes scientific and technical books covering a wide variety of topics, including much of the agency's research and development work, its full range of space exploration programs, its work in advancing aeronautics technology, and many associated historical and managerial efforts. A total of 1,200 titles are presented.
US Department of Energy education programs catalog
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-07-01
Missions assigned to DOE by Congress include fundamental scientific research, research and development of energy technologies, energy conservation, strategic weapons development and production, energy regulation, energy data collection and analysis, federal power marketing, and education in science and technology. Contributing to mathematics and science education initiatives are nine DOE national laboratories and more than 30 additional specialized research facilities. Within their walls, some of the most exciting research in contemporary science is conducted. The Synchrotron Light Source at Brookhaven National Laboratory, the Intense Pulsed Neutron Source at Argonne National Laboratory, lasers, electron microscopes, advanced robotics and supercomputers are examples of some of the unique tools that DOE employs in exploring research frontiers. Nobel laureates and other eminent scientists employed by DOE laboratories have accomplished landmark work in physics, chemistry, biology, materials science, and other disciplines. The Department oversees an unparalleled collection of scientific and technical facilities and equipment with extraordinary potential for kindling in students and the general public a sense of excitement about science and increasing public science literacy. During 1991, programs funded by DOE and its contractors reached more than one million students and educators. This document is a catalog of these education programs.
NASA Astrophysics Data System (ADS)
Soni, Jigensh; Yadav, R. K.; Patel, A.; Gahlaut, A.; Mistry, H.; Parmar, K. G.; Mahesh, V.; Parmar, D.; Prajapati, B.; Singh, M. J.; Bandyopadhyay, M.; Bansal, G.; Pandya, K.; Chakraborty, A.
2013-02-01
Twin Source, an inductively coupled, two-RF-driver-based 180 kW, 1 MHz negative ion source experimental setup, has been initiated at IPR, Gandhinagar, under the Indian program, with the objective of understanding the physics and technology of multi-driver coupling. Twin Source [1] (TS) also provides an intermediate platform between the operational ROBIN [2][5] and the eight-RF-driver-based Indian test facility, INTF [3]. The twin source experiment requires a central system to provide control, data acquisition, and communication interfaces, referred to as TS-CODAC, for which a software architecture similar to the ITER CODAC Core System has been chosen for implementation. The Core System is a software suite that ITER plant system manufacturers use as a template for the development of their interfaces with CODAC. The ITER approach, in terms of technology, has been adopted for TS-CODAC in order to develop the expertise needed to build and operate a control system based on ITER guidelines, since a similar configuration must be implemented for the INTF. This cost-effective approach provides an opportunity to evaluate and learn ITER CODAC technology, documentation, information technology, and control system processes on an operational machine. The conceptual design of the TS-CODAC system has been completed. Complete control of the system requires approximately 200 control signals and 152 acquisition signals. In TS-CODAC, the required control loop time is in the range of 5-10 ms, so a PLC (Siemens S7-400) has been chosen for the control system, as suggested in the ITER slow controller catalog. For data acquisition, the maximum sampling interval required is 100 microseconds, so a National Instruments (NI) PXIe system and NI 6259 digitizer cards have been selected, as suggested in the ITER fast controller catalog. This paper presents the conceptual design of the TS-CODAC system based on the ITER CODAC Core software and applicable plant system integration processes.
Virtual Collaboration for a Distributed Enterprise
2013-01-01
451-6915; Email: order@rand.org Library of Congress Cataloging-in-Publication Data is available for this publication. ISBN: 978-0-8330-8003-5 The...category. Online discussion boards, chat rooms, and email are all considered forms of computer-mediated communication.9 In particular, many...Google Wave, which combines features of chat, email, and graphics and document sharing.14 Through these technologies, distant team members can now
ERIC Educational Resources Information Center
Salisbury, Lutishoor; Laincz, Jozef; Smith, Jeremy J.
2012-01-01
Many academic libraries and publishers have developed mobile-optimized versions of their web sites and catalogs. Almost all database vendors and major journal publishers have provided a way to connect to their resources via the Internet and the mobile web. In light of this pervasive use of the Internet, mobile devices and social networking, this…
ERIC Educational Resources Information Center
Atherton, Pauline; And Others
A single issue of Nuclear Science Abstracts, containing about 2,300 abstracts, was indexed by Universal Decimal Classification (UDC) using the Special Subject Edition of UDC for Nuclear Science and Technology. The descriptive cataloging and UDC-indexing records formed a computer-stored data base. A systematic random sample of 500 additional…
ERIC Educational Resources Information Center
Griffin, Michael; Lee, Jongsoo
1995-01-01
Reports on a visual content analysis of 1,104 Gulf War-related pictures. Finds that a narrowly limited range of images (with a special emphasis on cataloging military weaponry and technology) dominated the pictorial coverage. Suggests that the scarcity of pictures depicting ongoing events in the Gulf contradicts the impression of first-hand media…
Translations on USSR Science and Technology Biomedical Sciences No. 12
1977-10-25
PUBLICATIONS JPRS publications may be ordered from the National Technical Information Service, Springfield, Virginia 22151. In ordering, it is...Reports Announcements issued semi-monthly by the National Technical Information Service, and are listed in the Monthly Catalog of U.S. Government...hygienic norms and requirements into national economic practice. During the period 1971-1978 various standards will be prepared within the CEMA
VizieR Online Data Catalog: Hα and [NII] survey in local 11 Mpc (Kennicutt+, 2008)
NASA Astrophysics Data System (ADS)
Kennicutt, R. C. Jr; Lee, J. C.; Funes, J. G.; Sakai, S.; Akiyama, S.
2009-11-01
Most of the Hα and R imaging reported in this paper was obtained in 2001-2004 using CCD direct imagers on the Steward Observatory Bok 2.3m telescope on Kitt Peak (Bok), the Lennon 1.8m Vatican Advanced Technology Telescope (VATT), and the 0.9m telescope at Cerro Tololo Interamerican Observatory (CTIO). (3 data files).
2015-06-09
Many members of Team RoboSimian and a few guests gather with competition hardware at a "Meet the Robots" event during the DARPA Robotics Challenge Finals in Pomona, California, on June 6, 2015. The RoboSimian team at JPL is collaborating with partners at the University of California, Santa Barbara, and the California Institute of Technology in Pasadena. Caltech manages JPL for NASA. http://photojournal.jpl.nasa.gov/catalog/PIA19329
Catalog of lunar and Mars science payloads
NASA Technical Reports Server (NTRS)
Budden, Nancy Ann (Editor)
1994-01-01
This catalog collects and describes science payloads considered for future robotic and human exploration missions to the Moon and Mars. The science disciplines included are geosciences, meteorology, space physics, astronomy and astrophysics, life sciences, in-situ resource utilization, and robotic science. Science payload data is helpful for mission scientists and engineers developing reference architectures and detailed descriptions of mission organizations. One early step in advanced planning is formulating the science questions for each mission and identifying the instrumentation required to address these questions. The next critical element is to establish and quantify the supporting infrastructure required to deliver, emplace, operate, and maintain the science experiments with human crews or robots. This requires a comprehensive collection of up-to-date science payload information--hence the birth of this catalog. Divided into lunar and Mars sections, the catalog describes the physical characteristics of science instruments in terms of mass, volume, power and data requirements, mode of deployment and operation, maintenance needs, and technological readiness. It includes descriptions of science payloads for specific missions that have been studied in the last two years: the Scout Program, the Artemis Program, the First Lunar Outpost, and the Mars Exploration Program.
Worldwide standardized seismograph network
Peterson, J.
1977-01-01
A global network of seismographs is as indispensable to seismologists as telescopes are to astronomers. The network is used to catalog the thousands of earthquakes that occur each year and to provide the data needed for detailed studies of earthquake mechanisms, deep Earth structure, and tectonic processes. Like astronomy, seismology is an observational science, and most of its scientific advances have been preceded by improvements in instrument technology. To be useful for seismic studies, new technology must be applied on a global scale. During the past two decades, there has been notable success in meeting this objective. The network that exists today (fig. 1) is a vital scientific resource. Continued innovations and improvements are needed to ensure that it keeps pace with the data needs of the seismological community.
ERIC Educational Resources Information Center
Wilson, E. C.
This catalog contains a listing of the audio-visual aids used in the Alabama State Module of the Appalachian Adult Basic Education Program. Aids listed include filmstrips utilized by the following organizations: Columbia, South Carolina State Department of Education; Raleigh, North Carolina State Department of Education; Alden Films of Brooklyn,…
2015-09-30
whales that facilitate relatively close approaches to the animals without obviously disturbing them. With this experience, during field projects in... Marine Mammal Electronic Tags” funded through a Science and Technology Transfer (STTR) program Phase II Option 1 contract, Office of Naval Research...assessment: updating photo-identification catalogs for estimating abundance, assessing the nature and extent of fishery interactions with pantropical
THz Spectroscopy and Spectroscopic Database for Astrophysics
NASA Technical Reports Server (NTRS)
Pearson, John C.; Drouin, Brian J.
2006-01-01
Molecule specific astronomical observations rely on precisely determined laboratory molecular data for interpretation. The Herschel Heterodyne Instrument for Far Infrared, a suite of SOFIA instruments, and ALMA are each well placed to expose the limitations of available molecular physics data and spectral line catalogs. Herschel and SOFIA will observe in high spectral resolution over the entire far infrared range. Accurate data to previously unimagined frequencies including infrared ro-vibrational and ro-torsional bands will be required for interpretation of the observations. Planned ALMA observations with a very small beam will reveal weaker emission features requiring accurate knowledge of higher quantum numbers and additional vibrational states. Historically, laboratory spectroscopy has been at the front of submillimeter technology development, but now astronomical receivers have an enormous capability advantage. Additionally, rotational spectroscopy is a relatively mature field attracting little interest from students and funding agencies. Molecular database maintenance is tedious and difficult to justify as research. This severely limits funding opportunities even though data bases require the same level of expertise as research. We report the application of some relatively new receiver technology into a simple solid state THz spectrometer that has the performance required to collect the laboratory data required by astronomical observations. Further detail on the lack of preparation for upcoming missions by the JPL spectral line catalog is given.
Hirata, Satoshi; Kojima, Kaname; Misawa, Kazuharu; Gervais, Olivier; Kawai, Yosuke; Nagasaki, Masao
2018-05-01
Forensic DNA typing is widely used to identify missing persons and plays a central role in forensic profiling. DNA typing usually uses capillary electrophoresis fragment analysis of PCR amplification products to detect the length of short tandem repeat (STR) markers. Here, we analyzed whole-genome data from 1,070 Japanese individuals generated using massively parallel short-read sequencing with 162-base paired-end reads. We analyzed 843,473 STR loci with two- to six-basepair repeat units and cataloged highly polymorphic STR loci in the Japanese population. To evaluate the performance of the cataloged STR loci, we compared 23 STR loci widely used in forensic DNA typing with capillary electrophoresis based STR genotyping results in the Japanese population. Seventeen loci had high correlations and high call rates. The other six loci had low call rates or low correlations due to the limitations of short-read sequencing technology, the bioinformatics tool used, or the complexity of repeat patterns. With these analyses, we have also identified 218 suitable STR loci with four-basepair repeat units and 53 loci with five-basepair repeat units for both short-read sequencing and PCR-based technologies, which are candidates for practical forensic DNA typing in the Japanese population.
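The locus-level comparison the abstract describes (call rates of the sequencing method and concordance with capillary electrophoresis) can be sketched as follows; the function name and sample genotypes are illustrative inventions, not data or code from the paper:

```python
def call_rate_and_concordance(seq_calls, ce_calls):
    """Compare STR genotypes from two methods at one locus.

    seq_calls / ce_calls map sample ID -> tuple of allele repeat counts,
    or None when the method made no call for that sample.
    Returns (call rate of the sequencing method, concordance on samples
    called by both methods). Allele order is ignored."""
    samples = set(seq_calls) & set(ce_calls)
    called = [s for s in samples if seq_calls[s] is not None]
    call_rate = len(called) / len(samples)
    both = [s for s in called if ce_calls[s] is not None]
    concordant = sum(1 for s in both
                     if sorted(seq_calls[s]) == sorted(ce_calls[s]))
    concordance = concordant / len(both) if both else float("nan")
    return call_rate, concordance

# Hypothetical genotypes at one locus (allele repeat counts per sample)
seq = {"s1": (11, 12), "s2": (9, 9), "s3": None, "s4": (10, 13)}
ce = {"s1": (12, 11), "s2": (9, 9), "s3": (8, 10), "s4": (10, 12)}
rate, conc = call_rate_and_concordance(seq, ce)  # rate = 0.75
```

A locus would then be flagged, as in the paper's 17-versus-6 split, when either metric falls below a chosen threshold.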
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trentadue, R.; Clemencic, M.; Dykstra, D.
The LCG Persistency Framework consists of three software packages (CORAL, COOL and POOL) that address the data access requirements of the LHC experiments in several different areas. The project is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that are using some or all of the Persistency Framework components to access their data. POOL is a hybrid technology store for C++ objects, using a mixture of streaming and relational technologies to implement both object persistency and object metadata catalogs and collections. CORAL is an abstraction layer with an SQL-free API for accessing data stored using relational database technologies. COOL provides specific software components and tools for the handling of the time variation and versioning of the experiment conditions data. This presentation reports on the status and outlook in each of the three sub-projects at the time of the CHEP2012 conference, reviewing the usage of each package in the three LHC experiments.
Engineering and Technology Challenges for Active Debris Removal
NASA Technical Reports Server (NTRS)
Liou, Jer-Chyi
2011-01-01
After more than fifty years of space activities, the near-Earth environment is polluted with man-made orbital debris. The collision between Cosmos 2251 and the operational Iridium 33 in 2009 signaled a potential collision cascade effect, also known as the "Kessler Syndrome," in the environment. Various modelling studies have suggested that the commonly adopted mitigation measures will not be sufficient to stabilize the future debris population. Active debris removal must be considered to remediate the environment. This paper summarizes the key issues associated with debris removal and describes the technology and engineering challenges to move forward. Fifty-four years after the launch of Sputnik 1, satellites have become an integral part of human society. Unfortunately, the ongoing space activities have left behind an undesirable byproduct: orbital debris. This environmental problem is threatening current and future space activities. On average, two Shuttle window panels are replaced after every mission due to damage by micrometeoroid or orbital debris impacts. More than 100 collision avoidance maneuvers were conducted by satellite operators in 2010 to reduce the impact risks to their satellites from objects in the U.S. Space Surveillance Network (SSN) catalog. Of the four known accidental collisions between objects in the SSN catalog, the last, the collision between Cosmos 2251 and the operational Iridium 33 in 2009, was the most significant. It was the first ever accidental catastrophic destruction of an operational satellite by another satellite. It also signaled the potential collision cascade effect in the environment, commonly known as the "Kessler Syndrome," predicted by Kessler and Cour-Palais in 1978 [1]. Figure 1 shows the historical increase of objects in the SSN catalog. The majority of the catalog objects are 10 cm and larger. As of April 2011, the total number of objects tracked by the SSN sensors was more than 22,000.
However, approximately 6000 of them had yet to be fully processed and entered into the catalog. This population had been dominated by fragmentation debris throughout history. Before the anti-satellite test (ASAT) conducted by China in 2007, the fragmentation debris were almost all explosion fragments. After the ASAT test and the collision between Iridium 33 and Cosmos 2251, the ratio of collision fragments to explosion fragments was about one-to-one. It is expected that accidental collision fragments will further dominate the environment in the future.
The National Virtual Observatory
NASA Astrophysics Data System (ADS)
Hanisch, Robert J.
2001-06-01
The National Virtual Observatory is a distributed computational facility that will provide access to the "virtual sky": the federation of astronomical data archives, object catalogs, and associated information services. The NVO's "virtual telescope" is a common framework for requesting, retrieving, and manipulating information from diverse, distributed resources. The NVO will make it possible to seamlessly integrate data from the new all-sky surveys, enabling cross-correlations between multi-terabyte catalogs and providing transparent access to the underlying image or spectral data. Success requires high performance computational systems, high bandwidth network services, agreed-upon standards for the exchange of metadata, and collaboration among astronomers, astronomical data and information service providers, information technology specialists, funding agencies, and industry. International cooperation at the onset will help to assure that the NVO simultaneously becomes a global facility.
The CMS Data Management System
NASA Astrophysics Data System (ADS)
Giffels, M.; Guo, Y.; Kuznetsov, V.; Magini, N.; Wildish, T.
2014-06-01
The data management elements in CMS are scalable, modular, and designed to work together. The main components are PhEDEx, the data transfer and location system; the Data Bookkeeping Service (DBS), a metadata catalog; and the Data Aggregation Service (DAS), designed to aggregate views and provide them to users and services. Tens of thousands of samples have been cataloged and petabytes of data have been moved since the run began. The modular system has allowed the optimal use of appropriate underlying technologies. In this contribution we will discuss the use of both Oracle and NoSQL databases to implement the data management elements as well as the individual architectures chosen. We will discuss how the data management system functioned during the first run, and what improvements are planned in preparation for 2015.
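An aggregation service of this kind essentially joins per-service metadata records on a common key. A minimal sketch of the idea follows; the service names, record fields, and function are hypothetical illustrations, not the actual DAS API:

```python
def aggregate(records_by_service, key="dataset"):
    """Merge metadata records from several services into one view per key,
    namespacing each field by its source service to preserve provenance."""
    merged = {}
    for service, records in records_by_service.items():
        for rec in records:
            # One merged entry per key value, shared across services
            entry = merged.setdefault(rec[key], {key: rec[key]})
            for field, value in rec.items():
                if field != key:
                    entry[f"{service}.{field}"] = value
    return merged

# Hypothetical records from a transfer system and a bookkeeping catalog
view = aggregate({
    "transfer": [{"dataset": "/A/Run1", "site": "T1_US"}],
    "catalog": [{"dataset": "/A/Run1", "nevents": 12000}],
})
# view["/A/Run1"] combines both services' fields for the same dataset
```

Namespacing fields by service is one simple way to present a combined view without the services having to agree on a shared schema.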
Europe Report, Science and Technology.
1986-05-29
developed the institutional software training of today, primarily in institutions of higher learning or in study courses. TV BASIC--it really would be a...only to develop software but also to offer large catalogs to users of its machines. Bull is studying the product-marketing strategies of companies such...by DEC, an XCON + XSEL package). A prototype is under study; it was developed on an SPS 9 with Kool and a relational data base. The first
Search and Determine Integrated Environment (SADIE)
NASA Astrophysics Data System (ADS)
Sabol, C.; Schumacher, P.; Segerman, A.; Coffey, S.; Hoskins, A.
2012-09-01
A new and integrated high performance computing software applications package called the Search and Determine Integrated Environment (SADIE) is being jointly developed and refined by the Air Force and Naval Research Laboratories (AFRL and NRL) to automatically resolve uncorrelated tracks (UCTs) and build a more complete space object catalog for improved Space Situational Awareness (SSA). The motivation for SADIE is to respond to very challenging needs identified and guidance received from Air Force Space Command (AFSPC) and other senior leaders to develop this technology to support the evolving Joint Space Operations Center (JSpOC) and Alternate Space Control Center (ASC2)-Dahlgren. The JSpOC and JMS SSA mission requirements and threads flow down from the United States Strategic Command (USSTRATCOM). The SADIE suite includes modification and integration of legacy applications and software components that include Search And Determine (SAD), Satellite Identification (SID), and Parallel Catalog (Parcat), as well as other utilities and scripts to enable end-to-end catalog building and maintenance in a parallel processing environment. SADIE is being developed to handle large catalog building challenges in all orbit regimes and includes the automatic processing of radar, fence, and optical data. Real data results are provided for the processing of Air Force Space Surveillance System fence observations and for the processing of Space Surveillance Telescope optical data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Daniel Arthur
In recent years, rising populations and regional droughts have caused coal-fired power plants to temporarily curtail or cease production due to a lack of available water for cooling. In addition, concerns about the availability of adequate supplies of cooling water have resulted in cancellation of plans to build much-needed new power plants. These issues, coupled with concern over the possible impacts of global climate change, have caused industry and community planners to seek alternate sources of water to supplement or replace existing supplies. The Department of Energy, through the National Energy Technology Laboratory (NETL), is researching ways to reduce the water demands of coal-fired power plants. As part of the NETL Program, ALL Consulting developed an internet-based Catalog of potential alternative sources of cooling water. The Catalog identifies alternative sources of water, such as mine discharge water, oil and gas produced water, saline aquifers, and publicly owned treatment works (POTWs), which could be used to supplement or replace existing surface water sources. This report provides an overview of the Catalog, and examines the benefits and challenges of using these alternative water sources for cooling water.
The G-HAT Search for Advanced Extraterrestrial Civilizations: The Reddest Extended WISE Sources
NASA Astrophysics Data System (ADS)
Maldonado, Jessica; Povich, Matthew S.; Wright, Jason; Griffith, Roger; Sigurdsson, Steinn; Mullan, Brendan L.
2015-01-01
Freeman Dyson (1960) theorized how to identify possible signatures of advanced extraterrestrial civilizations by their waste heat, an inevitable byproduct of a civilization using a significant fraction of the luminosity of its host star. If a civilization could tap the starlight throughout its host galaxy, its waste heat would be easily detectable by recent infrared surveys. The Glimpsing Heat from Alien Technologies (G-HAT) pilot project aims to place limits on the existence of extraterrestrial civilizations at pan-galactic scales. We present results from the G-HAT cleaned catalog of 563 extremely red, extended, high-Galactic-latitude (|b| ≥ 10°) sources from the WISE All-Sky Catalog. Our catalog includes sources new to the scientific literature along with well-studied objects (e.g., starburst galaxies, AGN, and planetary nebulae) that exemplify extreme WISE colors. Objects of particular interest include a supergiant Be star (48 Librae) surrounded by a resolved mid-infrared nebula, possibly indicating dust in the stellar wind ejecta, and a curious cluster of seven extremely red WISE sources (associated with IRAS 04287+6444) that have no optical counterparts.
Surrogate outcomes in health technology assessment: an international comparison.
Velasco Garrido, Marcial; Mangiapane, Sandra
2009-07-01
Our aim was to review the recommendations given by health technology assessment (HTA) institutions in their methodological guidelines concerning the use of surrogate outcomes in their assessments. In a second step, we aimed to quantify the role surrogate parameters play in assessment reports. We analyzed methodological papers and guidelines from HTA agencies with International Network of Agencies for Health Technology Assessment membership, as well as from institutions related to pharmaceutical regulation (i.e., reimbursement and pricing). We analyzed the use of surrogate outcomes in a sample of HTA reports randomly drawn from the HTA database. We checked the methods, results (including evidence tables), and conclusions sections and extracted the outcomes reported. We report descriptive statistics on the presence of surrogate outcomes in the reports. We identified thirty-four methodological guidelines, twenty of them addressing the choice of outcome parameters and the problems posed by surrogate outcomes. Overall, HTA agencies call for caution regarding reliance on surrogate outcomes. None of the agencies has provided a list or catalog of acceptable and validated surrogate outcomes. We extracted the outcome parameters of 140 HTA reports. Only around half of the reports determined the outcomes for the assessment prospectively. Surrogate outcomes had been used in 62 percent of the reports; however, only 3.6 percent were based on surrogate outcomes exclusively. All of these assessed diagnostic or screening technologies, and the surrogate outcomes were predominantly test characteristics. HTA institutions seem to agree on a cautious approach to the use of surrogate outcomes in technology assessment. Thorough assessment of health technologies should not rely exclusively on surrogate outcomes.
Orbit Transfer Rocket Engine Technology Program, Advanced Engine Study Task D.6
1992-02-28
Report No. NASA 187215. Title and Subtitle: ORBIT TRANSFER ROCKET...Engine Study, three primary subtasks were accomplished: 1) Design and Parametric Data, 2) Engine Requirement Variation Studies, and 3) Vehicle Study...Mixture Ratio Parametrics; Thrust Parametrics Off-Design Mixture Ratio Scans; Expansion Area Ratio Parametrics; OTV 20 klbf Engine Off
The 1000 Genomes Project: data management and community access.
Clarke, Laura; Zheng-Bradley, Xiangqun; Smith, Richard; Kulesha, Eugene; Xiao, Chunlin; Toneva, Iliana; Vaughan, Brendan; Preuss, Don; Leinonen, Rasko; Shumway, Martin; Sherry, Stephen; Flicek, Paul
2012-04-27
The 1000 Genomes Project was launched as one of the largest distributed data collection and analysis projects ever undertaken in biology. In addition to the primary scientific goals of creating both a deep catalog of human genetic variation and extensive methods to accurately discover and characterize variation using new sequencing technologies, the project makes all of its data publicly available. Members of the project data coordination center have developed and deployed several tools to enable widespread data access.
Arms Control and Nonproliferation: A Catalog of Treaties and Agreements
2014-07-21
Europe in the waning years of the Cold War. Other arrangements seek to slow the spread of technologies that nations could use to develop advanced...This provides each nation with the freedom to mix their forces as they see fit. This change reflects, in part, a lesser concern with Cold War models...CFE provisions reflected Cold War assumptions and did not fairly address its new national security concerns. Further, it argued that economic hardship
2018-05-17
The complete TEMPEST-D spacecraft shown with the solar panels deployed. RainCube, CubeRRT and TEMPEST-D are currently integrated aboard Orbital ATK's Cygnus spacecraft and are awaiting launch on an Antares rocket. After the CubeSats have arrived at the station, they will be deployed into low-Earth orbit and will begin their missions to test these new technologies useful for predicting weather, ensuring data quality, and helping researchers better understand storms. https://photojournal.jpl.nasa.gov/catalog/PIA22458
Model documentation renewable fuels module of the National Energy Modeling System
NASA Astrophysics Data System (ADS)
1995-06-01
This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogs and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost, and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.
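The levelized-cost competition the RFM feeds can be illustrated with a screening-level calculation. The formula and sample numbers below are a generic textbook sketch under assumed inputs, not the NEMS methodology itself:

```python
def levelized_cost(capital_per_kw, fixed_charge_rate, fixed_om_per_kw_yr,
                   var_om_per_mwh, capacity_factor,
                   heat_rate_btu_per_kwh=0.0, fuel_per_mmbtu=0.0):
    """Screening-level levelized cost of electricity in $/MWh.

    Annualized capital (via a fixed charge rate) and fixed O&M are spread
    over the energy produced per kW of capacity; variable O&M and fuel
    costs add directly per MWh."""
    mwh_per_kw_yr = 8760.0 * capacity_factor / 1000.0   # MWh per kW-yr
    annual_fixed = capital_per_kw * fixed_charge_rate + fixed_om_per_kw_yr
    fuel_per_mwh = heat_rate_btu_per_kwh * fuel_per_mmbtu / 1000.0
    return annual_fixed / mwh_per_kw_yr + var_om_per_mwh + fuel_per_mwh

# Hypothetical wind plant: $1500/kW capital, 10% fixed charge rate,
# $30/kW-yr fixed O&M, $5/MWh variable O&M, 35% capacity factor, no fuel
lcoe = levelized_cost(1500.0, 0.10, 30.0, 5.0, 0.35)
```

Technologies would then be compared on this levelized cost, with the cheapest sources capturing market share over the model's time horizon.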
Statistical Evaluation of Turkey Earthquake Catalogue: A Case Study (1900-2015)
NASA Astrophysics Data System (ADS)
Kalafat, Dogan
2016-04-01
In this study, the Turkey earthquake catalog of events within the time period 1900-2015, prepared by Boğaziçi University Kandilli Observatory and Earthquake Research Institute, is analyzed. The catalog consists of earthquakes that occurred in Turkey and the surrounding area (32°-45°N / 23°-48°E). The current earthquake catalog data were checked in two respects: time-dependent variation and suitability for different regions. Specifically, the data set prior to 1976 was found to be deficient. In total, 7 regions were evaluated according to their tectonic characteristics and data sets. In this study, the original data were used for every region without any change; b-values, a-values, and magnitudes of completeness (Mc) were calculated. For the calculation of b-values, the focal depth range h = 0-50 km was selected. One of the important complications in seismic catalogs is discriminating real (natural) seismic events from artificial (unnatural) seismic events. Therefore, within the original catalog, artificial events, especially quarry and mine blasts, were separated by declustering and dequarrying methods. The declustering process eliminates induced earthquakes from the original catalogs, especially those occurring in thermal regions, large water basins, and mining regions. Using the current moment tensor catalog prepared by Kalafat (2015), a faulting-type map of the region was prepared. As a result, each region was examined for a relation between fault type and b-value. In this study, the hypothesis of a relation between b-values and the extensional, compressional, and strike-slip faulting regimes in Turkey is tested once more. This study was supported by the Science Fellowship and Grant Programs Department (2014-2219) of TUBITAK (The Scientific and Technological Research Council of Turkey). I also offer my eternal gratitude to Prof. Dr. Nafi Toksöz for his constructive contributions and encouragement of this study.
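The b-value discussed above comes from the Gutenberg-Richter relation log10 N(M) = a - b*M; one standard way to estimate b is the Aki (1965) maximum-likelihood formula. A minimal sketch follows; the magnitude sample is invented, and the paper's actual processing (declustering, regional splits) is far more involved:

```python
import math

def b_value_mle(magnitudes, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for events with M >= Mc.

    mc is the magnitude of completeness, dm the magnitude binning width:
    b = log10(e) / (mean(M) - (Mc - dm/2))."""
    complete = [m for m in magnitudes if m >= mc]
    mean_m = sum(complete) / len(complete)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Hypothetical magnitude sample above a completeness threshold Mc = 4.0
mags = [4.1, 4.3, 4.0, 4.8, 5.2, 4.4, 4.0, 4.6, 5.0, 4.2]
b = b_value_mle(mags, mc=4.0)
```

Tectonic regions commonly yield b near 1; systematic regional departures are what analyses like the one above compare against faulting style.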
Advanced Information Technology Investments at the NASA Earth Science Technology Office
NASA Astrophysics Data System (ADS)
Clune, T.; Seablom, M. S.; Moe, K.
2012-12-01
The NASA Earth Science Technology Office (ESTO) regularly makes investments for nurturing advanced concepts in information technology to enable rapid, low-cost acquisition, processing and visualization of Earth science data in support of future NASA missions and climate change research. In 2012, the National Research Council published a mid-term assessment of the 2007 decadal survey for future space missions supporting Earth science and applications [1]. The report stated, "Earth sciences have advanced significantly because of existing observational capabilities and the fruit of past investments, along with advances in data and information systems, computer science, and enabling technologies." The report found that NASA had responded favorably and aggressively to the decadal survey and noted the role of the recent ESTO solicitation for information systems technologies that partnered with the NASA Applied Sciences Program to support the transition into operations. NASA's future missions are key stakeholders for the ESTO technology investments. Also driving these investments is the need for the Agency to properly address questions regarding the prediction, adaptation, and eventual mitigation of climate change. The Earth Science Division has championed interdisciplinary research, recognizing that the Earth must be studied as a complete system in order to address key science questions [2]. Information technology investments in the low-mid technology readiness level (TRL) range play a key role in meeting these challenges. ESTO's Advanced Information Systems Technology (AIST) program invests in higher risk / higher reward technologies that solve the most challenging problems of the information processing chain. This includes the space segment, where the information pipeline begins, to the end user, where knowledge is ultimately advanced.
The objectives of the program are to reduce the risk, cost, size, and development time of Earth Science space-based and ground-based systems, to increase the accessibility and utility of science data, and to enable new observation measurements and information products. We will discuss the ESTO investment strategy for information technology development, the methods used to assess stakeholder needs and technology advancements, and technology partnerships to enhance infusion of the resulting technologies. We also describe specific investments and their potential impact on enabling NASA missions and scientific discovery. [1] "Earth Science and Applications from Space: A Midterm Assessment of NASA's Implementation of the Decadal Survey", 2012: National Academies Press, http://www.nap.edu/catalog.php?record_id=13405 [2] "Responding to the Challenge of Climate and Environmental Change: NASA's Plan for a Climate-Centric Architecture for Earth Observations and Applications from Space", 2010: NASA Tech Memo, http://science.nasa.gov/media/medialibrary/2010/07/01/Climate_Architecture_Final.pdf
NASA Technical Reports Server (NTRS)
Gilman, J. A.; Paige, J. B.; Schwenk, F. Carl
1992-01-01
The purpose of this catalog is to assist small business firms in making the community aware of products emerging from their efforts in the Small Business Innovation Research (SBIR) program. It contains descriptions of some products that have advanced into Phase 3 and others that are identified as prospective products. Both lists of products in this catalog are based on information supplied by NASA SBIR contractors in responding to an invitation to be represented in this document. Generally, all products suggested by the small firms were included in order to meet the goals of information exchange for SBIR results. Of the 444 SBIR contractors NASA queried, 137 provided information on 219 products. The catalog presents the product information in the technology areas listed in the table of contents. Within each area, the products are listed in alphabetical order by product name and are given identifying numbers. Also included is an alphabetical listing of the companies that have products described. This listing cross-references the product list and provides information on the business activity of each firm. In addition, there are three indexes: one a list of firms by states, one that lists the products according to NASA Centers that managed the SBIR projects, and one that lists the products by the relevant Technical Topics utilized in NASA's annual program solicitation under which each SBIR project was selected.
Turbine Engine Diagnostics System Study
1991-10-01
[Report documentation page garbled in scanning; recoverable details only.] Report No. DOT/FAA/CT-91/16, available through the National Technical Information Service, Springfield, Virginia 22161. Contents: 1. Introduction; 2. Background; 3. Phase I Technical Objectives; 4. System Survey (4.1 Literature Search; 4.2 Industry Survey; 4.3 Technology ...).
RoboSimian Disaster Relief Poster Artist Concept
2015-03-11
This artist's concept shows RoboSimian, a robot intended to assist with disaster relief and mitigation. RoboSimian is an ape-like robot that moves around on four limbs. It was designed and built at the Jet Propulsion Laboratory in Pasadena, California. It will compete in the 2015 DARPA Robotics Challenge Finals. To get the robot in shape for the contest, researchers at JPL are collaborating with partners at University of California, Santa Barbara, and the California Institute of Technology. http://photojournal.jpl.nasa.gov/catalog/PIA19313
ASTRONAUTICS INFORMATION. OPEN LITERATURE SURVEY, VOLUME III, NO. 2 (ENTRIES 30,202-30,404)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1961-02-01
15:014925. An annotated list of references on temperature control of satellite and space vehicles is presented. Methods and systems for maintaining vehicles within tolerable temperature bounds while operating outside planetary atmospheres are outlined. Discussions of the temperature environment in space and how it might affect vehicle operation are given. Re-entry heating problems are not included. Among the sources used were: Engineering Index, Applied Science and Technology Index, Astronautics Abstracts, PAL uniterm index, ASTIA, and LMSD card catalog. (auth)
2016-08-09
A furled first prototype starshade developed by NASA's Jet Propulsion Laboratory, shown in technology partner Astro Aerospace/Northrop Grumman's facility in Santa Barbara, California, in 2013. This design has petals that are more extreme in shape, which properly diffract starlight for smaller telescopes. For launch, the petals of the starshade will be wrapped around the spacecraft, then unfurled into the familiar flower-like design once in space. As shown by this 66-foot (20-meter) model, starshades can come in many shapes and sizes. http://photojournal.jpl.nasa.gov/catalog/PIA20905
1991-12-01
[Text-extraction residue from a report; recoverable fragments only.] ...those who still have no idea what it is. [Shoop] This scenario makes clear the importance of understanding individuals' fear of change... Sources searched included the Applied Science and Technology Index (ASTI), the Defense Technical Information Center (DTIC), the OCLC Online Union Catalog (FirstSearch), and older "key word" database searches... B. RECOMMENDATIONS: fashion the Marine Corps transition to TQL methodology in such a way that it meets with the fewest objections possible.
2018-05-17
The RainCube 6U CubeSat with fully-deployed antenna. RainCube, CubeRRT and TEMPEST-D are currently integrated aboard Orbital ATK's Cygnus spacecraft and are awaiting launch on an Antares rocket. After the CubeSats have arrived at the station, they will be deployed into low-Earth orbit and will begin their missions to test new technologies useful for predicting weather, ensuring data quality, and helping researchers better understand storms. https://photojournal.jpl.nasa.gov/catalog/PIA22457
Telecommunications and tomorrow
NASA Astrophysics Data System (ADS)
Hromocky, A.
1981-08-01
The various subject headings used by the Library of Congress, under which most books on telecommunications can be located in card, book, and online catalogs, are listed. Recommended literature from the library's collections is cited in this noncomprehensive bibliography, intended for those interested in studying the technological, social, and policy aspects of telecommunications. The library's classification number is given for the recommended books, journal articles, government publications, conference proceedings, and technical reports, all of which are listed by author entry. Abstracting and indexing services that cover the relevant literature are included.
VizieR Online Data Catalog: SDSS optically selected BL Lac candidates (Kuegler+, 2014)
NASA Astrophysics Data System (ADS)
Kuegler, S. D.; Nilsson, K.; Heidt, J.; Esser, J.; Schultz, T.
2014-11-01
The data that we use for variability and host galaxy analysis were presented in Paper I (Heidt & Nilsson, 2011A&A...529A.162H, Cat. J/A+A/529/A162). Altogether, 123 targets were observed at the ESO New Technology Telescope (NTT) on La Silla, Chile, during Oct. 2-6, 2008 and Mar. 28-Apr. 1, 2009. The observations were made with the EFOSC2 instrument through a Gunn-r filter (#786). (2 data files).
Research on camera on orbit radial calibration based on black body and infrared calibration stars
NASA Astrophysics Data System (ADS)
Wang, YuDu; Su, XiaoFeng; Zhang, WanYing; Chen, FanSheng
2018-05-01
Affected by the launch process and the space environment, the response of a space camera inevitably degrades, so spaceborne radiometric calibration is necessary. In this paper, we propose a calibration method based on accurate infrared standard stars to increase the precision of infrared radiation measurement. Because stars can be treated as point targets, we use them as the radiometric calibration source and establish a Taylor-expansion method and an energy-extrapolation model based on the WISE and 2MASS catalogs. We then update the calibration results obtained from the blackbody. Finally, the calibration mechanism is designed and the design is verified by an on-orbit test. The experimental calibration results show that the irradiance extrapolation error is about 3% and the accuracy of the calibration method is about 10%; these results show that the method can satisfy the requirements of on-orbit calibration.
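The magnitude-to-irradiance step behind an energy-extrapolation model of this kind can be sketched roughly as follows; the zero-point values and the simple first-order (linear) extrapolation between two catalog bands are illustrative assumptions, not the paper's calibrated model:

```python
import math

# Illustrative placeholders, not the calibrated zero points used in the paper.
ZERO_POINT = {"Ks": 4.28e-10, "W1": 8.18e-11}   # in-band irradiance at m = 0
WAVELENGTH = {"Ks": 2.159, "W1": 3.353}          # effective wavelengths, um

def irradiance(mag, band):
    """Pogson relation: irradiance from a catalog magnitude."""
    return ZERO_POINT[band] * 10.0 ** (-mag / 2.5)

def extrapolate(mag_ks, mag_w1, lam):
    """First-order (Taylor-like) extrapolation of irradiance to wavelength lam,
    anchored on two catalog bands (here 2MASS Ks and WISE W1)."""
    e1, e2 = irradiance(mag_ks, "Ks"), irradiance(mag_w1, "W1")
    slope = (e2 - e1) / (WAVELENGTH["W1"] - WAVELENGTH["Ks"])
    return e1 + slope * (lam - WAVELENGTH["Ks"])
```

A real pipeline would carry the full stellar spectral energy distribution and the sensor's spectral response rather than a two-point linear fit.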
Visualization of Multi-mission Astronomical Data with ESASky
NASA Astrophysics Data System (ADS)
Baines, Deborah; Giordano, Fabrizio; Racero, Elena; Salgado, Jesús; López Martí, Belén; Merín, Bruno; Sarmiento, María-Henar; Gutiérrez, Raúl; Ortiz de Landaluce, Iñaki; León, Ignacio; de Teodoro, Pilar; González, Juan; Nieto, Sara; Segovia, Juan Carlos; Pollock, Andy; Rosa, Michael; Arviset, Christophe; Lennon, Daniel; O'Mullane, William; de Marchi, Guido
2017-02-01
ESASky is a science-driven discovery portal to explore the multi-wavelength sky and visualize and access multiple astronomical archive holdings. The tool is a web application that requires no prior knowledge of any of the missions involved and gives users worldwide simplified access to the highest-level science data products from multiple space-based astronomy missions plus a number of ESA source catalogs. The first public release of ESASky features interfaces for the visualization of the sky in multiple wavelengths, the visualization of query results summaries, and the visualization of observations and catalog sources for single and multiple targets. This paper describes these features within ESASky, developed to address use cases from the scientific community. The decisions regarding the visualization of large amounts of data and the technologies used were made to maximize the responsiveness of the application and to keep the tool as useful and intuitive as possible.
LHCb experience with LFC replication
NASA Astrophysics Data System (ADS)
Bonifazi, F.; Carbone, A.; Perez, E. D.; D'Apice, A.; dell'Agnello, L.; Duellmann, D.; Girone, M.; Re, G. L.; Martelli, B.; Peco, G.; Ricci, P. P.; Sapunenko, V.; Vagnoni, V.; Vitlacil, D.
2008-07-01
Database replication is a key topic in the framework of the LHC Computing Grid to allow processing of data in a distributed environment. In particular, the LHCb computing model relies on the LHC File Catalog, i.e. a database which stores information about files spread across the GRID, their logical names and the physical locations of all the replicas. The LHCb computing model requires the LFC to be replicated at Tier-1s. The LCG 3D project deals with the database replication issue and provides a replication service based on Oracle Streams technology. This paper describes the deployment of the LHC File Catalog replication to the INFN National Center for Telematics and Informatics (CNAF) and to other LHCb Tier-1 sites. We performed stress tests designed to evaluate any delay in the propagation of the streams and the scalability of the system. The tests show the robustness of the replica implementation with performance going much beyond the LHCb requirements.
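A propagation-delay stress test of the kind this record mentions can be sketched generically: insert a marker at the source, poll the replica until the marker appears, and report the observed lag. The SimulatedReplica class below is a toy stand-in for the actual Oracle Streams source/replica pair, which is not reproduced here:

```python
import time

class SimulatedReplica:
    """Toy stand-in for a streams-replicated catalog: entries inserted at
    the source become visible on the replica after a fixed delay."""
    def __init__(self, delay_s):
        self.delay_s = delay_s
        self._pending = {}          # key -> monotonic time of source insert

    def insert_at_source(self, key):
        self._pending[key] = time.monotonic()

    def visible(self, key):
        t = self._pending.get(key)
        return t is not None and time.monotonic() - t >= self.delay_s

def measure_lag(replica, key, poll_s=0.01, timeout_s=5.0):
    """Insert a marker row at the source, poll the replica until it
    appears, and return the observed propagation delay in seconds."""
    replica.insert_at_source(key)
    start = time.monotonic()
    while not replica.visible(key):
        if time.monotonic() - start > timeout_s:
            raise TimeoutError("marker never propagated")
        time.sleep(poll_s)
    return time.monotonic() - start
```

Against a real deployment, `insert_at_source` and `visible` would be SQL statements against the source and replica databases, and the test would be repeated under load to probe scalability.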
Rabier, Christelle
2013-01-01
The special issue "Fitting for Health" offers a critical inquiry into the co-construction of medicine and technology in the early industrial age. It investigates the "social life" of medical things, through their material configuration, invention, improvement, and diversification, the sites of their deployment, their status as both novelties and less spectacular objects of everyday use, and the challenges they faced in fitting themselves into people's lives and European res publica. The set of articles (on steel trusses, medical electricity, anatomical models, and trade catalogs) heuristically uses "technology" to analyze how medicine and its material processes were crafted, endowed with meaning, and woven into European societies. Opening the medical "black box"—circumventing its tendency to be ignored as a mediating tool—provides a significant common point of entry for the four enquiries, triggering further analysis of the relationship between humans and non-humans as shaped in medical knowledge and practice. PMID:27057070
Satellite image analysis using neural networks
NASA Technical Reports Server (NTRS)
Sheldon, Roger A.
1990-01-01
The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
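The four-step classification flow this record describes (filtering and enhancement, feature extraction, network training, classification) can be illustrated with a minimal sketch; the single perceptron below is a stand-in for SIANN's neural network, and the features and toy "images" are illustrative assumptions:

```python
def enhance(image):
    """Step 1: simple contrast stretch to [0, 1]."""
    lo, hi = min(image), max(image)
    return [(p - lo) / (hi - lo or 1) for p in image]

def features(image):
    """Step 2: mean brightness, fraction of bright pixels, bias input."""
    mean = sum(image) / len(image)
    bright = sum(1 for p in image if p > 0.5) / len(image)
    return [mean, bright, 1.0]

def train(samples, labels, epochs=200, lr=0.1):
    """Step 3: train a single perceptron on labeled feature vectors."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            for i in range(3):
                w[i] += lr * (y - pred) * x[i]
    return w

def classify(w, image):
    """Step 4: apply the trained model to a new image."""
    x = features(enhance(image))
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Toy "scenes of interest" (bright blobs) vs. background noise.
scenes = [[9, 8, 9, 7], [8, 9, 9, 8], [7, 9, 8, 9]]
noise  = [[1, 0, 2, 9], [0, 1, 0, 9], [2, 0, 1, 9]]
data = [features(enhance(im)) for im in scenes + noise]
w = train(data, [1, 1, 1, 0, 0, 0])
```

Once trained on examples of scenes of interest, the same model can be applied across an archive to flag matching images, which is the search pattern the record describes.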
NASA Technical Reports Server (NTRS)
Papagiannis, M. D.
1986-01-01
The end product of biological evolution seems to be the appearance of technological civilizations, which are characterized by superior technology that supersedes biological capabilities. The Search for Extraterrestrial Intelligence (SETI) has gained scientific recognition in recent years. The concept of galactic colonization is debated extensively, with opinions ranging from the impossible to the inevitable, but without a clear resolution. Answers can be obtained only with experimental tests, not with endless debates. A search for large space colonies in the asteroid belt, an ideal source of raw materials for a spaceborne civilization, is a test of the galactic colonization theory. The catalog of solar system objects obtained from the Infrared Astronomy Satellite (IRAS) observations at 12, 25, 60, and 100 microns is an ideal source for such a search. The catalog is expected to be ready at the end of 1985 and will contain more than 10,000 objects.
Scalable Data Mining and Archiving for the Square Kilometre Array
NASA Astrophysics Data System (ADS)
Jones, D. L.; Mattmann, C. A.; Hart, A. F.; Lazio, J.; Bennett, T.; Wagstaff, K. L.; Thompson, D. R.; Preston, R.
2011-12-01
As the technologies for remote observation improve, the rapid increase in the frequency and fidelity of those observations translates into an avalanche of data that is already beginning to eclipse the resources, both human and technical, of the institutions and facilities charged with managing the information. Common data management tasks, such as cataloging both the data itself and contextual metadata, creating and maintaining a scalable permanent archive, and making data available on demand for research, present significant software engineering challenges when considered at the scale of modern multi-national scientific enterprises such as the upcoming Square Kilometre Array project. The NASA Jet Propulsion Laboratory (JPL), leveraging internal research and technology development funding, has begun to explore ways to address the data archiving and distribution challenges through a number of parallel activities involving collaborations with the EVLA and ALMA teams at the National Radio Astronomy Observatory (NRAO) and members of the Square Kilometre Array South Africa team. To date, we have leveraged the Apache OODT Process Control System framework and its catalog and archive service components, which provide file management, workflow management, and resource management as core web services. A client crawler framework ingests upstream data (e.g., EVLA raw directory output), identifies its MIME type, and automatically extracts relevant metadata including temporal bounds and job-relevant processing information. A remote content acquisition (push/pull) service is responsible for staging remote content and handing it off to the crawler framework. A science algorithm wrapper (called CAS-PGE) wraps underlying code, including CASApy programs for the EVLA such as Continuum Imaging and Spectral Line Cube generation, executes the algorithm, and ingests its output along with relevant extracted metadata.
In addition to processing, the Process Control System has been leveraged to provide data curation and automatic ingestion for the MeerKAT/KAT-7 precursor instrument in South Africa, helping to catalog and archive correlator and sensor output from KAT-7, and to make the information available for downstream science analysis. These efforts, supported by the increasing availability of high-quality open source software, represent a concerted effort to seek a cost-conscious methodology for maintaining the integrity of observational data from the upstream instrument to the archive, and at the same time ensuring that the data, with its richly annotated catalog of meta-data, remains a viable resource for research into the future.
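The crawler-style ingestion this record describes can be illustrated with a minimal sketch: walk a staging directory, identify each file's MIME type, and pull simple metadata out of the file name. The file-naming date convention and the metadata fields below are assumptions for illustration, not Apache OODT's actual interfaces:

```python
import mimetypes
import os
import re
import tempfile

DATE_RE = re.compile(r"\d{4}-\d{2}-\d{2}")

def extract_metadata(path):
    """Identify MIME type and pull a temporal bound from the file name."""
    mime, _ = mimetypes.guess_type(path)
    m = DATE_RE.search(os.path.basename(path))
    return {
        "file": os.path.basename(path),
        "mime": mime or "application/octet-stream",
        "date": m.group(0) if m else None,
        "size": os.path.getsize(path),
    }

def crawl(directory):
    """Ingest every file in a staging directory, in name order."""
    return [extract_metadata(os.path.join(directory, f))
            for f in sorted(os.listdir(directory))]

# Stage two fake data products and crawl them.
staging = tempfile.mkdtemp()
for name in ["evla_2011-07-04_raw.txt", "notes.csv"]:
    with open(os.path.join(staging, name), "w") as fh:
        fh.write("placeholder\n")
records = crawl(staging)
```

In a production system the extracted records would be handed to a catalog and archive service rather than returned as a list, and extraction would inspect file contents, not just names.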
The Automatic Measuring Machines and Ground-Based Astrometry
NASA Astrophysics Data System (ADS)
Sergeeva, T. P.
The introduction of automatic measuring machines into astronomical research a little more than a quarter of a century ago has substantially increased the range and scale of the projects astronomers have been able to undertake since then. During that time there have been dozens of photographic sky surveys, covering all of the sky more than once. Thanks to the high accuracy and speed of automatic measuring machines, photographic astrometry has been able to produce high-precision catalogs such as CpC2. Investigations of the structure and kinematics of the stellar components of our Galaxy have been revolutionized in the last decade by the advent of automated plate measuring machines. But in an age of rapidly evolving electronic detectors and soon-expected space-based catalogs, one might think that the twilight hours of astronomical photography have arrived. Against that point of view, astronomers such as D. Monet (U.S.N.O.), L. G. Taff (STScI), and M. K. Tsvetkov (IA BAS) have outlined several paths along which photographic astronomy can evolve. One of them: "...special efforts must be taken to extract useful information from the photographic archives before the plates degrade and the technology required to measure them disappears". Another is the minimization of the systematic errors of ground-based star catalogs through appropriate reduction technology and sufficiently dense, precise space-based reference catalogs. In addition, the use of higher-resolution, higher-quantum-efficiency emulsions such as Tech Pan, together with new methods of processing the digitized information, holds great promise for future deep (B<25) surveys (Bland-Hawthorn et al. 1993, AJ, 106, 2154). Thus not only is the continued hard work of all existing automatic measuring machines apparently needed, but the design, development, and deployment of a new generation of portable, mobile scanners is also necessary.
This paper reports the classification and main parameters of some modern automatic measuring machines, the scientific research carried out with them, and some of the methods used to ensure high accuracy and reliability. This work is supported by Grant N U4I000 from the International Science Foundation.
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
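The phase-wrapped pipeline chaining this record describes (preparation, submission, checking, quality control) can be sketched as follows; the Pipeline class and the sample stages are illustrative, not the SDSS factory's actual scripts:

```python
class Pipeline:
    """Wraps one processing step in the four consecutive phases."""
    def __init__(self, name, run):
        self.name, self.run = name, run
        self.log = []

    def _phase(self, label, fn, payload):
        self.log.append(f"{self.name}:{label}")
        return fn(payload)

    def execute(self, payload):
        payload = self._phase("preparation", lambda p: p, payload)
        payload = self._phase("submission", self.run, payload)
        self._phase("checking", lambda p: p, payload)
        self._phase("quality-control", lambda p: p, payload)
        return payload

def factory(pipelines, payload):
    """Chain pipelines so each one's output feeds the next."""
    for p in pipelines:
        payload = p.execute(payload)
    return payload

# Two toy stages standing in for real imaging pipelines.
calibrate = Pipeline("calibrate", lambda frames: [f * 2 for f in frames])
catalog   = Pipeline("catalog", lambda frames: {"objects": sum(frames)})
result = factory([calibrate, catalog], [1, 2, 3])
```

In the real factory each phase is a shell script and the intermediate results live in an operational database rather than being passed in memory, but the chaining structure is the same.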
USGS Science Data Catalog - Open Data Advances or Declines
NASA Astrophysics Data System (ADS)
Frame, M. T.; Hutchison, V.; Zolly, L.; Wheeler, B.; Latysh, N.; Devarakonda, R.; Palanisamy, G.; Shrestha, B.
2014-12-01
The recent Office of Science and Technology Policy (OSTP) White House Open Data Policies (2013) require Federal agencies to establish formal catalogs of their science data holdings and make these data easily available on Web sites, portals, and applications. As an organization, the USGS has historically excelled at making its data holdings freely available on its various Web sites (i.e., National, Scientific Program, or local Science Center sites). In response to these requirements, the USGS Core Science Analytics, Synthesis, and Libraries program, in collaboration with DOE's Oak Ridge National Laboratory (ORNL) Mercury Consortium (funded by NASA, USGS, and DOE) and a number of other USGS organizations, established the Science Data Catalog (http://data.usgs.gov) cyberinfrastructure, content management processes and tools, and supporting policies. The USGS Science Data Catalog has led the charge at USGS to improve the robustness of existing and future metadata collections; streamline and develop sustainable publishing to external aggregators (i.e., data.gov); and provide leadership to the U.S. Department of the Interior in emerging Open Data policies, techniques, and systems. The session will discuss the current successes and challenges in meeting these Open Data policies for USGS scientific data holdings, and will take a retrospective look at the last year of implementation within USGS to determine whether these Open Data policies are improving data access or limiting data availability. To learn more about the USGS Science Data Catalog, visit us at http://data.usgs.gov/info/about.html
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
The Preliminary Issues Database (PIDB) was assembled very early in the study as one of the fundamental tools to be used throughout the study. Data was acquired from a variety of sources and compiled in such a way that the data could be easily sorted in accordance with a number of different analytical objectives. The system was computerized to significantly expedite sorting and make it more usable. The information contained in the PIDB is summarized and the reader is provided with the capability to manually find items of interest.
DDT Behavior of Waxed Mixtures of RDX, HMX, and Tetryl
1977-10-18
[Report documentation page garbled in scanning; recoverable details only.] NSWC/WOL TR 77-96: DDT Behavior of Waxed Mixtures of RDX, HMX, and Tetryl, by Donna Price and Richard R. Bernecker, Research and Technology Department. Materials included HMX (Class A), RDX (Class E), tetryl, and carnauba wax; the particle-size and percentage figures are garbled in the scan. The mixing procedure was that used...
EARS: Electronic Access to Reference Service.
Weise, F O; Borgendale, M
1986-01-01
Electronic Access to Reference Service (EARS) is a front end to the Health Sciences Library's electronic mail system, with links to the online public catalog. EARS, which became operational in September 1984, is accessed by users at remote sites with either a terminal or microcomputer. It is menu-driven, allowing users to request: a computerized literature search, reference information, a photocopy of a journal article, or a book. This paper traces the history of EARS and discusses its use, its impact on library staff and services, and factors that influence the diffusion of new technology. PMID:3779167
VizieR Online Data Catalog: SN2009ip UBVRI, UVOT and JHK light curves (Fraser+, 2013)
NASA Astrophysics Data System (ADS)
Fraser, M.; Inserra, C.; Jerkstrand, A.; Kotak, R.; Pignata, G.; Benetti, S.; Botticella, M.-T.; Bufano, F.; Childress, M.; Mattila, S.; Pastorello, A.; Smartt, S. J.; Turatto, M.; Yuan, F.; Anderson, J. P.; Bayliss, D. D. R.; Bauer, F. E.; Chen, T.-W.; Forster Buron, F.; Gal-Yam, A.; Haislip, J. B.; Knapic, C.; Le Guillou, L.; Marchi, S.; Mazzali, P.; Molinaro, M.; Moore, J. P.; Reichart, D.; Smareglia, R.; Smith, K. W.; Sternberg, A.; Sullivan, M.; Takats, K.; Tucker, B. E.; Valenti, S.; Yaron, O.; Young, D. R.; Zhou, G.
2014-11-01
Optical spectroscopic follow-up of SN 2009ip was chiefly obtained with the New Technology Telescope (NTT) + ESO Faint Object Spectrograph and Camera 2 (EFOSC2), as part of the Public European Southern Observatory (ESO) Spectroscopic Survey of Transient Objects (PESSTO). The PESSTO data were supplemented with data from the Telescopio Nazionale Galileo (TNG) + Device Optimized for the LOw RESolution (DOLORES), and the Australian National University (ANU) 2.3m telescope + Wide Field Spectrograph (WiFeS). (3 data files).
Education in an Information Society
NASA Astrophysics Data System (ADS)
Moore, John W.
1999-04-01
Last month's editorial pointed out that higher education may well change significantly as a result of the tremendous impact that information technologies are having on society. It quoted a white paper (1) by Russell Edgerton, Director of the Education Program of the Pew Charitable Trusts. Edgerton argued that higher education is currently failing to meet three challenges: to provide higher quality education; to reduce costs; and to regain its former stature as an important player in shaping public policy. Edgerton recommended that the Pew Trusts should encourage colleges and universities to set more ambitious goals for undergraduate education, to enter the public arena and play a major role in the reform of K-12 education, and to develop an academic profession interested in working toward these goals. Four new aims for undergraduate education were identified: "encouraging institutions to take learning seriously, encouraging faculty to take pedagogy seriously, demonstrating that technology can be used to reduce costs as well as to enhance learning, and developing new incentives for continuous quality improvement." One wonders why institutions of higher education should need to be encouraged toward goals that seem obviously congruent with their mission and self-interest, but today's colleges and universities seem more likely to respond to outside offers of funding than to develop their own plans of action. As members of the faculty of such institutions, it behooves us to consider what some of those outside influences are likely to be and what effects they are likely to have on us, on our institutions, and on our students. Higher education is seen as a growth market by Michael Dolence and Donald Norris (2). In 1995 they projected that in five years there would be an increase of 20 million full-time equivalent enrollments in the U.S. and more than 100 million worldwide. However, this growth was not projected to come from traditional, on-campus students.
Most was expected to come from those whose knowledge requires continual updating (engineers, medical personnel, computer programmers), or from those who need to acquire new skills because their employment has been terminated. If education is a growth market, it is sobering to realize that the existing system invests precious little in research and development. In the pharmaceutical industry some 23% of expenditures for medications is applied toward research and development. For growth industries on average, the figure is 10%. For education it may be as small as 0.1%. This situation seems not to be a major concern of faculty or administrators at colleges and universities, but it has attracted the attention of venture capitalists. In the past year or two they have started investing heavily in companies that propose to create educational software, Web-based materials, and other applications of information technology to education. Perhaps Disney, Microsoft, or an upstart successor will provide the best hope for the future of education in this country. But won't college and university faculty be involved in writing the new technology-based materials just as they write textbooks now? Perhaps they will, but development of these new materials is far more a cooperative enterprise than is writing a book. It requires daily interactions among individuals with many different kinds of expertise, and it is an incremental process that requires continual input and revision. Even materials that have been thoroughly evaluated and refined must be continually updated to make them compatible with the latest technology. A textbook author does not have to transpose from Apple II to IBM PC to Macintosh to Windows to the World Wide Web to whatever comes next. It seems certain that producing new kinds of technology-based educational materials will require team efforts and far greater investment of time, money, and equipment than is currently applied to textbooks.
Colleges and universities are the institutions most likely to have the combination of personnel, experience, and facilities to become effective suppliers in this growth market. They could also alleviate existing problems by developing and adopting information technology more widely. For example, students are less likely to drop out of courses in which effective learning technology is employed, and remediation of poor background can be individualized via technology. Students who do not retake a course represent savings in time and effort on the part of their teachers, in institutional resources, and in dollar costs to the students themselves. Quality of education can be improved and costs reduced by information technology, even within the traditional campus setting. Energizing universities to lead the way in development of new educational approaches will not succeed unless much greater kudos and rewards are accorded those who collaborate to produce appropriate materials. For a university it is only a small stretch from generating income through overhead on grants to generating income via off-campus distribution of educational materials. The requisite cognitive scientists, subject-matter experts, learning theorists, and information technology staff are available on many campuses. Institutions that evolve a workable system for initiating, stimulating, and rewarding development of technology-based educational materials are likely to become the leaders of the 21st century. Literature Cited 1. Edgerton, R. Education White Paper, Pew Charitable Trusts, 1998 http://www.pewtrusts.com/Frame.cfm?Framesource=programs/edu/eduindex.cfm (accessed Feb 1999). 2. Dolence, M. G.; Norris, D. M. Transforming Higher Education: A Vision for Learning in the 21st Century; Society for College and University Planning, Ann Arbor, MI 48105, 1995 (catalog is at http://141.211.140.59/catalog/catalog.htm; accessed Feb 1999).
Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N
2017-01-01
Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
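The core statistical step of such a differential expression pipeline can be sketched compactly. The toy below is illustrative only and is not the chapter's actual workflow (which relies on dedicated open-source tools); the data, shift size, and thresholds are invented. It performs a per-feature two-sample test and then controls the false discovery rate with the Benjamini-Hochberg step-up procedure, the standard remedy for testing thousands of features at once.

```python
import numpy as np
from scipy import stats

def differential_expression(expr_a, expr_b, alpha=0.05):
    """Per-feature two-sample t-tests with Benjamini-Hochberg FDR control.

    expr_a, expr_b: arrays of shape (features, samples) for the two groups.
    Returns indices of features called differentially expressed at FDR alpha.
    """
    _, pvals = stats.ttest_ind(expr_a, expr_b, axis=1)
    m = len(pvals)
    order = np.argsort(pvals)
    ranked = pvals[order]
    # BH step-up: find the largest k with p_(k) <= (k/m)*alpha,
    # then reject the k hypotheses with the smallest p-values.
    thresholds = alpha * np.arange(1, m + 1) / m
    below = np.nonzero(ranked <= thresholds)[0]
    if below.size == 0:
        return np.array([], dtype=int)
    return np.sort(order[: below[-1] + 1])

rng = np.random.default_rng(0)
# 100 features, 10 samples per group; the first 5 features truly differ.
a = rng.normal(0.0, 1.0, size=(100, 10))
b = rng.normal(0.0, 1.0, size=(100, 10))
b[:5] += 4.0
hits = differential_expression(a, b)
print(hits)
```

In a real analysis the t-test would typically be replaced by a moderated or count-based model, but the per-feature-test-plus-FDR-correction shape of the computation is the same.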
A Case Study in Integrating Multiple E-commerce Standards via Semantic Web Technology
NASA Astrophysics Data System (ADS)
Yu, Yang; Hillman, Donald; Setio, Basuki; Heflin, Jeff
Internet business-to-business transactions present great challenges in merging information from different sources. In this paper we describe a project to integrate four representative commercial classification systems with the Federal Cataloging System (FCS). The FCS is used by the US Defense Logistics Agency to name, describe and classify all items under inventory control by the DoD. Our approach uses the ECCMA Open Technical Dictionary (eOTD) as a common vocabulary to accommodate all different classifications. We create a semantic bridging ontology between each classification and the eOTD to describe their logical relationships in OWL DL. The essential idea is that since each classification has formal definitions in a common vocabulary, we can use subsumption to automatically integrate them, thus mitigating the need for pairwise mappings. Furthermore our system provides an interactive interface to let users choose and browse the results and more importantly it can translate catalogs that commit to these classifications using compiled mapping results.
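The payoff of anchoring every classification to one common vocabulary can be shown with a toy example. This sketch is not the project's OWL DL machinery; the taxonomy, class codes, and function names are all invented. Each source class is mapped once into a shared hierarchy, and subsumption between classes from different classifications is then derived through that hierarchy, so N classifications need N anchor mappings rather than N*(N-1)/2 pairwise ones.

```python
# Shared vocabulary (stand-in for the eOTD): child concept -> parent concept.
SHARED = {
    "bolt": "fastener",
    "screw": "fastener",
    "fastener": "hardware",
    "hardware": "item",
}

def ancestors(concept, taxonomy=SHARED):
    """The concept plus all of its ancestors in the shared taxonomy."""
    out = {concept}
    while concept in taxonomy:
        concept = taxonomy[concept]
        out.add(concept)
    return out

# One anchor mapping per classification into the shared vocabulary
# (codes are hypothetical).
FCS_ANCHORS = {"FCS:5306": "bolt"}            # an FCS class for bolts
COMM_ANCHORS = {"X:31161500": "fastener"}     # a commercial class for fasteners

def subsumed_by(class_a, anchors_a, class_b, anchors_b):
    """True if class_a's shared concept falls under class_b's shared concept."""
    return anchors_b[class_b] in ancestors(anchors_a[class_a])

# A bolt class is subsumed by a fastener class, inferred with no direct
# FCS <-> commercial mapping ever having been written.
print(subsumed_by("FCS:5306", FCS_ANCHORS, "X:31161500", COMM_ANCHORS))
```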
Keenan, Alexandra B; Jenkins, Sherry L; Jagodnik, Kathleen M; Koplev, Simon; He, Edward; Torre, Denis; Wang, Zichen; Dohlman, Anders B; Silverstein, Moshe C; Lachmann, Alexander; Kuleshov, Maxim V; Ma'ayan, Avi; Stathias, Vasileios; Terryn, Raymond; Cooper, Daniel; Forlin, Michele; Koleti, Amar; Vidovic, Dusica; Chung, Caty; Schürer, Stephan C; Vasiliauskas, Jouzas; Pilarczyk, Marcin; Shamsaei, Behrouz; Fazel, Mehdi; Ren, Yan; Niu, Wen; Clark, Nicholas A; White, Shana; Mahi, Naim; Zhang, Lixia; Kouril, Michal; Reichard, John F; Sivaganesan, Siva; Medvedovic, Mario; Meller, Jaroslaw; Koch, Rick J; Birtwistle, Marc R; Iyengar, Ravi; Sobie, Eric A; Azeloglu, Evren U; Kaye, Julia; Osterloh, Jeannette; Haston, Kelly; Kalra, Jaslin; Finkbiener, Steve; Li, Jonathan; Milani, Pamela; Adam, Miriam; Escalante-Chong, Renan; Sachs, Karen; Lenail, Alex; Ramamoorthy, Divya; Fraenkel, Ernest; Daigle, Gavin; Hussain, Uzma; Coye, Alyssa; Rothstein, Jeffrey; Sareen, Dhruv; Ornelas, Loren; Banuelos, Maria; Mandefro, Berhan; Ho, Ritchie; Svendsen, Clive N; Lim, Ryan G; Stocksdale, Jennifer; Casale, Malcolm S; Thompson, Terri G; Wu, Jie; Thompson, Leslie M; Dardov, Victoria; Venkatraman, Vidya; Matlock, Andrea; Van Eyk, Jennifer E; Jaffe, Jacob D; Papanastasiou, Malvina; Subramanian, Aravind; Golub, Todd R; Erickson, Sean D; Fallahi-Sichani, Mohammad; Hafner, Marc; Gray, Nathanael S; Lin, Jia-Ren; Mills, Caitlin E; Muhlich, Jeremy L; Niepel, Mario; Shamu, Caroline E; Williams, Elizabeth H; Wrobel, David; Sorger, Peter K; Heiser, Laura M; Gray, Joe W; Korkola, James E; Mills, Gordon B; LaBarge, Mark; Feiler, Heidi S; Dane, Mark A; Bucher, Elmar; Nederlof, Michel; Sudar, Damir; Gross, Sean; Kilburn, David F; Smith, Rebecca; Devlin, Kaylyn; Margolis, Ron; Derr, Leslie; Lee, Albert; Pillai, Ajay
2018-01-24
The Library of Integrated Network-Based Cellular Signatures (LINCS) is an NIH Common Fund program that catalogs how human cells globally respond to chemical, genetic, and disease perturbations. Resources generated by LINCS include experimental and computational methods, visualization tools, molecular and imaging data, and signatures. By assembling an integrated picture of the range of responses of human cells exposed to many perturbations, the LINCS program aims to better understand human disease and to advance the development of new therapies. Perturbations under study include drugs, genetic perturbations, tissue micro-environments, antibodies, and disease-causing mutations. Responses to perturbations are measured by transcript profiling, mass spectrometry, cell imaging, and biochemical methods, among other assays. The LINCS program focuses on cellular physiology shared among tissues and cell types relevant to an array of diseases, including cancer, heart disease, and neurodegenerative disorders. This Perspective describes LINCS technologies, datasets, tools, and approaches to data accessibility and reusability. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA space and Earth science data on CD-ROM
NASA Technical Reports Server (NTRS)
Towheed, Syed S.
1993-01-01
The National Space Science Data Center (NSSDC) is very interested in facilitating the widest possible use of the scientific data acquired through NASA spaceflight missions. Therefore, NSSDC has participated with projects and data management elements throughout the NASA science environment in the creation, archiving, and dissemination of data using Compact Disk-Read Only Memory (CD-ROM). This CD-ROM technology has the potential to enable the dissemination of very large data volumes at very low prices to a great many researchers, students and their teachers, and others. This catalog identifies and describes the scientific CD-ROMs now available from NSSDC, including the following data sets: Einstein Observatory CD-ROM, Galileo Cruise Imaging on CD-ROM, International Halley Watch, IRAS Sky Survey Atlas, Infrared Thermal Mapper (IRTM), Magellan (MIDR), Magellan (ARCDR's), Magellan (GxDR's), Mars Digital Image Map (MDIM), Outer Planets Fields & Particles Data, Pre-Magellan, Selected Astronomical Catalogs, TOMS Gridded Ozone Data, TOMS Ozone Image Data, TOMS Update, Viking Orbiter Images of Mars, and Voyager Image.
Explosive Growth and Advancement of the NASA/IPAC Extragalactic Database (NED)
NASA Astrophysics Data System (ADS)
Mazzarella, Joseph M.; Ogle, P. M.; Fadda, D.; Madore, B. F.; Ebert, R.; Baker, K.; Chan, H.; Chen, X.; Frayer, C.; Helou, G.; Jacobson, J. D.; LaGue, C.; Lo, T. M.; Pevunova, O.; Schmitz, M.; Terek, S.; Steer, I.
2014-01-01
The NASA/IPAC Extragalactic Database (NED) is continuing to evolve in lock-step with the explosive growth of astronomical data and advancements in information technology. A new methodology is being used to fuse data from very large surveys. Selected parameters are first loaded into a new database layer and made available in areal searches before they are cross-matched with prior NED objects. Then a programmed, rule-based statistical approach is used to identify new objects and compute cross-identifications with existing objects where possible; otherwise associations between objects are derived based on positional uncertainties or spatial resolution differences. Approximately 62 million UV sources from the GALEX All-Sky Survey and Medium Imaging Survey catalogs have been integrated into NED using this new process. The December 2013 release also contains nearly half a billion sources from the 2MASS Point Source Catalog accessible in cone searches, while the large scale cross-matching is in progress. Forthcoming updates will fuse data from All-WISE, SDSS DR12, and other very large catalogs. This work is progressing in parallel with the equally important integration of data from the literature, which is also growing rapidly. Recent updates have also included H I and CO channel maps (data cubes), as well as substantial growth in redshifts, classifications, photometry, spectra and redshift-independent distances. The By Parameters search engine now incorporates a simplified form for entry of constraints, and support for long-running queries with machine-readable output. A new tool for exploring the environments of galaxies with measured radial velocities includes informative graphics and a method to assess the incompleteness of redshift measurements. The NED user interface is also undergoing a major transformation, providing more streamlined navigation and searching, and a modern development framework for future enhancements. 
For further information, please visit our poster (Fadda et al. 2014) and stop by the NED exhibit for a demo. NED is operated by the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
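The positional association step described above (matching a new survey source against existing objects within combined positional uncertainties) can be illustrated with a small sketch. This is not NED's rule-based statistical pipeline; the object names, coordinates, uncertainties, and the 3-sigma tolerance are all invented for the example.

```python
import math

def angular_sep_arcsec(ra1, dec1, ra2, dec2):
    """Great-circle separation in arcseconds (inputs in degrees),
    using the haversine form, which is stable for small separations."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    d = 2 * math.asin(math.sqrt(
        math.sin((dec2 - dec1) / 2) ** 2
        + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2))
    return math.degrees(d) * 3600.0

def cross_match(source, catalog, n_sigma=3.0):
    """Return names of catalog entries lying within n_sigma times the
    combined positional uncertainty of `source`.
    Each record: (name, ra_deg, dec_deg, pos_err_arcsec)."""
    _, ra, dec, err = source
    matches = []
    for cname, cra, cdec, cerr in catalog:
        limit = n_sigma * math.hypot(err, cerr)
        if angular_sep_arcsec(ra, dec, cra, cdec) <= limit:
            matches.append(cname)
    return matches

new_src = ("GALEX J1234", 187.25, 2.05, 1.0)
existing = [("Object A", 187.2501, 2.0501, 0.5),  # ~0.5" away: a match
            ("Object B", 187.40, 2.05, 0.5)]      # ~9' away: not a match
print(cross_match(new_src, existing))
```

A production cross-match additionally indexes the sky (e.g., with spatial bins) so each source is compared only against nearby candidates rather than the whole catalog.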
Buchanan, Carrie C; Torstenson, Eric S; Bush, William S; Ritchie, Marylyn D
2012-01-01
Since publication of the human genome in 2003, geneticists have used risk-variant associations to resolve the etiology of traits and complex diseases. The International HapMap Consortium undertook an effort to catalog all common variation across the genome (variants with a minor allele frequency (MAF) of at least 5% in one or more ethnic groups). HapMap, along with advances in genotyping technology, led to genome-wide association studies, which have identified common variants associated with many traits and diseases. In 2008 the 1000 Genomes Project aimed to sequence 2500 individuals and identify rare variants and 99% of variants with a MAF of at least 1%. To determine whether the 1000 Genomes Project includes all the variants in HapMap, we examined the overlap between single nucleotide polymorphisms (SNPs) genotyped in the two resources using merged phase II/III HapMap data and low-coverage pilot data from 1000 Genomes. Comparison of the two data sets showed that approximately 72% of HapMap SNPs were also found in 1000 Genomes Project pilot data. After filtering out HapMap variants with a MAF of <5% (separately for each population), 99% of HapMap SNPs were found in 1000 Genomes data. Not all variants cataloged in HapMap are also cataloged in 1000 Genomes. This could affect decisions about which resource to use for SNP queries, rare variant validation, or imputation. Both the HapMap and 1000 Genomes Project databases are useful resources for human genetics, but it is important to understand the assumptions made and filtering strategies employed by these projects.
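The overlap computation the study describes, raw overlap versus overlap after discarding low-frequency HapMap SNPs, reduces to simple set arithmetic. The numbers below are toy values, not the study's data; the point is only that the filtered fraction rises because rare SNPs are the ones most often missing from the low-coverage calls.

```python
def overlap_after_maf_filter(hapmap, kg_ids, maf_min=0.05):
    """hapmap: dict mapping rsID -> minor-allele frequency;
    kg_ids: set of rsIDs present in the 1000 Genomes call set.
    Returns (raw_overlap, filtered_overlap) as fractions of HapMap SNPs
    found in 1000 Genomes, before and after dropping SNPs with MAF < maf_min."""
    raw = sum(1 for rs in hapmap if rs in kg_ids) / len(hapmap)
    common = {rs for rs, maf in hapmap.items() if maf >= maf_min}
    filt = sum(1 for rs in common if rs in kg_ids) / len(common)
    return raw, filt

# Invented example: the two rare SNPs (rs3, rs5) are absent from the
# low-coverage calls, so filtering them out raises the overlap to 100%.
hapmap = {"rs1": 0.30, "rs2": 0.25, "rs3": 0.02, "rs4": 0.10, "rs5": 0.01}
kg = {"rs1", "rs2", "rs4"}
print(overlap_after_maf_filter(hapmap, kg))  # raw 3/5, filtered 3/3
```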
Automated metadata--final project report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schissel, David
This report summarizes the work of the Automated Metadata, Provenance Cataloging, and Navigable Interfaces: Ensuring the Usefulness of Extreme-Scale Data Project (MPO Project) funded by the United States Department of Energy (DOE), Offices of Advanced Scientific Computing Research and Fusion Energy Sciences. Initially funded for three years starting in 2012, it was extended for six months with additional funding. The project was a collaboration between scientists at General Atomics, Lawrence Berkeley National Laboratory (LBNL), and Massachusetts Institute of Technology (MIT). The group leveraged existing computer science technology where possible, and extended or created new capabilities where required. The MPO project was able to successfully create a suite of software tools that can be used by a scientific community to automatically document their scientific workflows. These tools were integrated into workflows for fusion energy and climate research, illustrating the general applicability of the project’s toolkit. Feedback on the project’s toolkit was very positive, as was the perceived value of such automatic workflow documentation to the scientific endeavor.
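The essence of automatic workflow documentation, each processing step recording its inputs, outputs, and timing so the full provenance chain can be cataloged, fits in a few lines. This is a minimal sketch only; the class and field names are invented and do not reflect the MPO toolkit's actual API.

```python
import json
import time
import uuid

class ProvenanceLog:
    """Toy provenance recorder: every step of a workflow is logged with
    its activity name, inputs, outputs, and a timestamp, and the whole
    chain can be serialized for ingestion into a catalog."""

    def __init__(self, workflow_name):
        self.workflow = {"name": workflow_name,
                         "id": str(uuid.uuid4()),
                         "steps": []}

    def record(self, activity, inputs, outputs):
        # Append one documented step to the workflow's provenance chain.
        self.workflow["steps"].append({
            "activity": activity,
            "inputs": list(inputs),
            "outputs": list(outputs),
            "time": time.time(),
        })

    def to_json(self):
        return json.dumps(self.workflow, indent=2)

# Hypothetical fusion-analysis workflow with two documented steps.
log = ProvenanceLog("equilibrium-reconstruction")
log.record("fetch-shot-data", ["shot:123456"], ["raw.h5"])
log.record("equilibrium-fit", ["raw.h5"], ["eqdsk.geq"])
print(len(log.workflow["steps"]))
```

The value of such a log is that it is produced as a side effect of running the workflow, rather than written up by hand after the fact.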
VizieR Online Data Catalog: Gravitational waves from known pulsars (Aasi+, 2014)
NASA Astrophysics Data System (ADS)
Aasi, J.; Abadie, J.; Abbott, B. P.; Abbott, R.; Abbott, T.; Abernathy, M. R.; Accadia, T.; Acernese, F.; Adams, C.; Adams, T.; Adhikari, R. X.; Affeldt, C.; Agathos, M.; Aggarwal, N.; Aguiar, O. D.; Ajith, P.; Allen, B.; Allocca, A.; Ceron, E. A.; Amariutei, D.; Anderson, R. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C.; Areeda, J.; Ast, S.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Austin, L.; Aylott, B. E.; Babak, S.; Baker, P. T.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barker, D.; Barnum, S. H.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J.; Bauchrowitz, J.; Bauer, Th. S.; Bebronne, M.; Behnke, B.; Bejger, M.; Beker, M. G.; Bell, A. S.; Bell, C.; Belopolski, I.; Bergmann, G.; Berliner, J. M.; Bersanetti, D.; Bertolini, A.; Bessis, D.; Betzwieser, J.; Beyersdorf, P. T.; Bhadbhade, T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Blom, M.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogan, C.; Bo, Nd C.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Boschi, V.; Bose, S.; Bosi, L.; Bowers, J.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brannen, C. A.; Brau, J. E.; Breyer, J.; Briant, T.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Britzger, M.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Bruckner, F.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Calderon Bustillo, J.; Calloni, E.; Camp, J. B.; Campsie, P.; Cannon, K. C.; Canuel, B.; Cao, J.; Capano, C. D.; Carbognani, F.; Carbone, L.; Caride, S.; Castiglia, A.; Caudill, S.; Cavaglia, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, X.; Chen, Y.; Chincarini, A.; Chiummo, A.; Cho, H. 
S.; Chow, J.; Christensen, N.; Chu, Q.; Chua, S. S. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, D. E.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Colombini, M.; Constancio, M. Jr; Conte, A.; Conte, R.; Cook, D.; Corbitt, T. R.; Cordier, M.; Cornish, N.; Corsi, A.; Costa, C. A.; Coughlin, M. W.; Coulon, J.-P.; Countryman, S.; Couvares, P.; Coward, D. M.; Cowart, M.; Coyne, D. C.; Craig, K.; Creighton, J. D. E.; Creighton, T. D.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dahl, K.; Dal Canton, T.; Damjanic, M.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dattilo, V.; Daudert, B.; Daveloza, H.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Dayanga, T.; De Rosa, R.; Debreczeni, G.; Degallaix, J.; Del Pozzo, W.; Deleeuw, E.; Deleglise, S.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; Derosa, R.; Desalvo, R.; Dhurandhar, S.; di Fiore, L.; di Lieto, A.; di Palma, I.; di, Virgilio A.; Diaz, M.; Dietz, A.; Dmitry, K.; Donovan, F.; Dooley, K. L.; Doravari, S.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Dumas, J.-C.; Dwyer, S.; Eberle, T.; Edwards, M.; Effler, A.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Endroczi, G.; Essick, R.; Etzel, T.; Evans, K.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W.; Favata, M.; Fazi, D.; Fehrmann, H.; Feldbaum, D.; Ferrante, I.; Ferrini, F.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Fisher, R.; Flaminio, R.; Foley, E.; Foley, S.; Forsi, E.; Fotopoulos, N.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, M.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fujimoto, M.-K.; Fulda, P.; Fyffe, M.; Gair, J.; Gammaitoni, L.; Garcia, J.; Garufi, F.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; Gergely, L.; Ghosh, S.; Giaime, J. A.; Giampanis, S.; Giardina, K. 
D.; Giazotto, A.; Gil-Casanova, S.; Gill, C.; Gleason, J.; Goetz, E.; Goetz, R.; Gondan, L.; Gonzalez, G.; Gordon, N.; Gorodetsky, M. L.; Gossan, S.; Gossler, S.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Griffo, C.; Groot, P.; Grote, H.; Grover, K.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hall, B.; Hall, E.; Hammer, D.; Hammond, G.; Hanke, M.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Hartman, M. T.; Haughian, K.; Hayama, K.; Heefner, J.; Heidmann, A.; Heintze, M.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Holtrop, M.; Hong, T.; Hooper, S.; Horrom, T.; Hosken, D. J.; Hough, J.; Howell, E. J.; Hu, Y.; Hua, Z.; Huang, V.; Huerta, E. A.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh, M.; Huynh-Dinh, T.; Iafrate, J.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Iyer, B. R.; Izumi, K.; Jacobson, M.; James, E.; Jang, H.; Jang, Y. J.; Jaranowski, P.; Jimenez-Forteza, F.; Johnson, W. W.; Jones, D.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, H.; Kalmus, P.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Kasprzack, M.; Kasturi, R.; Katsavounidis, E.; Katzman, W.; Kaufer, H.; Kaufman, K.; Kawabe, K.; Kawamura, S.; Kawazoe, F.; Kefelian, F.; Keitel, D.; Kelley, D. B.; Kells, W.; Keppel, D. G.; Khalaidovski, A.; Khalili, F. Y.; Khazanov, E. A.; Kim, B. K.; Kim, C.; Kim, K.; Kim, N.; Kim, W.; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Klimen, Ko S.; Kline, J.; Koehlenbeck, S.; Kokeyama, K.; Kondrashov, V.; Koranda, S.; Korth, W. Z.; Kowalska, I.; Kozak, D.; Kremin, A.; Kringel, V.; Krishnan, B.; Krolak, A.; Kucharczyk, C.; Kudla, S.; Kuehn, G.; Kumar, A.; Kumar, P.; Kumar, R.; Kurdyumov, R.; Kwee, P.; Landry, M.; Lantz, B.; Larson, S.; Lasky, P. 
D.; Lawrie, C.; Lazzarini, A.; Le Roux, A.; Leaci, P.; Lebigot, E. O.; Lee, C.-H.; Lee, H. K.; Lee, H. M.; Lee, J.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levine, B.; Lewis, J. B.; Lhuillier, V.; Li, T. G. F.; Lin, A. C.; Littenberg, T. B.; Litvine, V.; Liu, F.; Liu, H.; Liu, Y.; Liu, Z.; Lloyd, D.; Lockerbie, N. A.; Lockett, V.; Lodhia, D.; Loew, K.; Logue, J.; Lombardi, A. L.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J.; Luan, J.; Lubinski, M. J.; Luck, H.; Lundgren, A. P.; MacArthur, J.; MacDonald, E.; Machenschalk, B.; Macinnis, M.; MacLeod, D. M.; Magana-Sandoval, F.; Mageswaran, M.; Mailand, K.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Manca, G. M.; Mandel, I.; Mandic, V.; Mangano, V.; Mantovani, M.; Marchesoni, F.; Marion, F.; Marka, S.; Marka, Z.; Markosyan, A.; Maros, E.; Marque, J.; Martelli, F.; Martin, I. W.; Martin, R. M.; Martinelli, L.; Martynov, D.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Matichard, F.; Matone, L.; Matzner, R. A.; Mavalvala, N.; May, G.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McGuire, S. C.; McIntyre, G.; McIver, J.; Meacher, D.; Meadors, G. D.; Mehmet, M.; Meidam, J.; Meier, T.; Melatos, A.; Mendell, G.; Mercer, R. A.; Meshkov, S.; Messenger, C.; Meyer, M. S.; Miao, H.; Michel, C.; Mikhailov, E. E.; Milano, L.; Miller, J.; Minenkov, Y.; Mingarelli, C. M. F.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moe, B.; Mohan, M.; Mohapatra, S. R. P.; Mokler, F.; Moraru, D.; Moreno, G.; Morgado, N.; Mori, T.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Mukherjee, S.; Mullavey, A.; Munch, J.; Murphy, D.; Murray, P. G.; Mytidis, A.; Nagy, M. F.; Nanda, Kumar D.; Nardecchia, I.; Nash, T.; Naticchioni, L.; Nayak, R.; Necula, V.; Nelemans, G.; Neri, I.; Neri, M.; Newton, G.; Nguyen, T.; Nishida, E.; Nishizawa, A.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. 
K.; Ochsner, E.; O'Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oppermann, P.; O'Reilly, B.; Ortega Larcher, W.; O'Shaughnessy, R.; Osthelder, C.; Ottaway, D. J.; Ottens, R. S.; Ou, J.; Overmier, H.; Owen, B. J.; Padilla, C.; Pai, A.; Palomba, C.; Pan, Y.; Pankow, C.; Paoletti, F.; Paoletti, R.; Papa, M. A.; Paris, H.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Pedraza, M.; Peiris, P.; Penn, S.; Perreca, A.; Phelps, M.; Pichot, M.; Pickenpack, M.; Piergiovanni, F.; Pierro, V.; Pinard, L.; Pindor, B.; Pinto, I. M.; Pitkin, M.; Poeld, J.; Poggiani, R.; Poole, V.; Poux, C.; Predoi, V.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prix, R.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Quetschke, V.; Quintero, E.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Racz, I.; Radkins, H.; Raffai, P.; Raja, S.; Rajalakshmi, G.; Rakhmanov, M.; Ramet, C.; Rapagnani, P.; Raymond, V.; Re, V.; Reed, C. M.; Reed, T.; Regimbau, T.; Reid, S.; Reitze, D. H.; Ricci, F.; Riesen, R.; Riles, K.; Robertson, N. A.; Robinet, F.; Rocchi, A.; Roddy, S.; Rodriguez, C.; Rodruck, M.; Roever, C.; Rolland, L.; Rollins, J. G.; Romano, J. D.; Romano, R.; Romanov, G.; Romie, J. H.; Rosinska, D.; Rowan, S.; Rudiger, A.; Ruggi, P.; Ryan, K.; Salemi, F.; Sammut, L.; Sandberg, V.; Sanders, J.; Sannibale, V.; Santiago-Prieto, I.; Saracco, E.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Savage, R.; Schilling, R.; Schnabel, R.; Schofield, R. M. S.; Schreiber, E.; Schuette, D.; Schulz, B.; Schutz, B. F.; Schwinberg, P.; Scott, J.; Scott, S. M.; Seifert, F.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sergeev, A.; Shaddock, D.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shapiro, B.; Shawhan, P.; Shoemaker, D. H.; Sidery, T. L.; Siellez, K.; Siemens, X.; Sigg, D.; Simakov, D.; Singer, A.; Singer, L.; Sintes, A. M.; Skelton, G. R.; Slagmolen, B. J. J.; Slutsky, J.; Smith, J. R.; Smith, M. R.; Smith, R. J. E.; Smith-Lefebvre, N. 
D.; Soden, K.; Son, E. J.; Sorazu, B.; Souradeep, T.; Sperandio, L.; Staley, A.; Steinert, E.; Steinlechner, J.; Steinlechner, S.; Steplewski, S.; Stevens, D.; Stochino, A.; Stone, R.; Strain, K. A.; Straniero, N.; Strigin, S.; Stroeer, A. S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Susmithan, S.; Sutton, P. J.; Swinkels, B.; Szeifert, G.; Tacca, M.; Talukder, D.; Tang, L.; Tanner, D. B.; Tarabrin, S. P.; Taylor, R.; Ter Braack, A. P. M.; Thirugnanasambandam, M. P.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Toncelli, A.; Tonelli, M.; Torre, O.; Torres, C. V.; Torrie, C. I.; Travasso, F.; Traylor, G.; Tse, M.; Ugolini, D.; Unnikrishnan, C. S.; Vahlbruch, H.; Vajente, G.; Vallisneri, M.; van den Brand, J. F. J.; van den Broeck, C.; van der Putten, S.; van der Sluys, M. V.; van Heijningen, J.; van Veggel, A. A.; Vass, S.; Vasuth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Verma, S.; Vetrano, F.; Vicere, A.; Vincent-Finley, R.; Vinet, J.-Y.; Vitale, S.; Vlcek, B.; Vo, T.; Vocca, H.; Vorvick, C.; Vousden, W. D.; Vrinceanu, D.; Vyachanin, S. P.; Wade, A.; Wade, L.; Wade, M.; Waldman, S. J.; Walker, M.; Wallace, L.; Wan, Y.; Wang, J.; Wang, M.; Wang, X.; Wanner, A.; Ward, R. L.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wessels, P.; West, M.; Westphal, T.; Wette, K.; Whelan, J. T.; Whitcomb, S. E.; White, D. J.; Whiting, B. F.; Wibowo, S.; Wiesner, K.; Wilkinson, C.; Williams, L.; Williams, R.; Williams, T.; Willis, J. L.; Willke, B.; Wimmer, M.; Winkelmann, L.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Yablon, J.; Yakushin, I.; Yamamoto, H.; Yancey, C. C.; Yang, H.; Yeaton-Massey, D.; Yoshida, S.; Yum, H.; Yvert, M.; Zadrozny, A.; Zanolin, M.; Zendri, J.-P.; Zhang, F.; Zhang, L.; Zhao, C.; Zhu, H.; Zhu, X. J.; Zotov, N.; Zucker, M. 
E.; Zweizig, J.; Buchner, S.; Cognard, I.; Corongiu, A.; D'Amico, N.; Espinoza, C. M.; Freire, P. C. C.; Gotthelf, E. V.; Guillemot, L.; Hessels, J. W. T.; Hobbs, G. B.; Kramer, M.; Lyne, A. G.; Marshall, F. E.; Possenti, A.; Ransom, S. M.; Ray, P. S.; Roy, J.; Stappers, B. W.; VIRGO Collaboration
2017-06-01
In this paper we have used calibrated data from the Virgo second (Aasi et al. 2012CQGra..29o5002A) and fourth science runs (VSR2 and VSR4) and the LIGO sixth science run (S6). Virgo's third science run (VSR3) was relatively insensitive in comparison with VSR4 and has not been included in this analysis. This was partially because Virgo introduced monolithic mirror suspensions prior to VSR4 which improved sensitivity in the low-frequency range. During S6, the two LIGO 4 km detectors at Hanford, Washington (LHO/H1), and Livingston, Louisiana (LLO/L1), were running in an enhanced configuration (Adhikari et al. 2006, Enhanced LIGO, Tech. Rep. LIGO-T060156-v01, California Institute of Technology, Massachusetts Institute of Technology, https://dcc.ligo.org/LIGO-T060156-v1/public) over that from the previous S5 run (Abbott et al. 2009RPPh...72g6901A). (1 data file).
Mycotoxins: A Fungal Genomics Perspective
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Daren W.; Baker, Scott E.
The chemical and enzymatic diversity in the fungal kingdom is staggering. Large-scale fungal genome sequencing projects are generating a massive catalog of secondary metabolite biosynthetic genes and pathways. Fungal natural products are a boon and bane to man as valuable pharmaceuticals and harmful toxins. Understanding how these chemicals are synthesized will aid the development of new strategies to limit mycotoxin contamination of food and feeds as well as expand drug discovery programs. A survey of work focused on the fumonisin family of mycotoxins highlights technological advances and provides a blueprint for future studies of other fungal natural products. Expressed sequence tags led to the discovery of new fumonisin genes (FUM) and hinted at a role for alternatively spliced transcripts in regulation. Phylogenetic studies of FUM genes uncovered a complex evolutionary history of the FUM cluster, as well as fungi with the potential to synthesize fumonisin or fumonisin-like chemicals. The application of new technologies (e.g., CRISPR) could substantially impact future efforts to harness fungal resources.
FunBlocks. A modular framework for AmI system development.
Baquero, Rafael; Rodríguez, José; Mendoza, Sonia; Decouchant, Dominique; Papis, Alfredo Piero Mateos
2012-01-01
The last decade has seen explosive growth in the technologies required to implement Ambient Intelligence (AmI) systems. Technologies such as facial and speech recognition, home networks, household cleaning robots, to name a few, have become commonplace. However, due to the multidisciplinary nature of AmI systems and the distinct requirements of different user groups, integrating these developments into full-scale systems is not an easy task. In this paper we propose FunBlocks, a minimalist modular framework for the development of AmI systems based on the function module abstraction used in the IEC 61499 standard for distributed control systems. FunBlocks provides a framework for the development of AmI systems through the integration of modules loosely joined by means of an event-driven middleware and a module and sensor/actuator catalog. The modular design of the FunBlocks framework allows the development of AmI systems which can be customized to a wide variety of usage scenarios.
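The loose coupling the paper describes, modules joined only through an event-driven middleware, can be illustrated with a minimal publish/subscribe sketch. This is not the FunBlocks API; the class, event names, and the sensor/actuator pair are invented, and a real IEC 61499-style function block would also declare typed event and data interfaces.

```python
from collections import defaultdict

class EventBus:
    """Toy event-driven middleware: modules register handlers for named
    events and communicate only through publish/subscribe, so neither
    module holds a direct reference to the other."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event, handler):
        self.subscribers[event].append(handler)

    def publish(self, event, payload):
        for handler in self.subscribers[event]:
            handler(payload)

# Two loosely joined blocks: a presence sensor and a lamp actuator.
bus = EventBus()
lamp_state = {"on": False}

def lamp_block(payload):
    # Actuator block: reacts to presence events by switching the lamp.
    lamp_state["on"] = payload["occupied"]

bus.subscribe("presence", lamp_block)
bus.publish("presence", {"occupied": True})  # the sensor block fires an event
print(lamp_state["on"])
```

Swapping the lamp for a different actuator, or adding a second subscriber, requires no change to the sensor side, which is the customizability argument made above.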
UCMP and the Internet help hospital libraries share resources.
Dempsey, R; Weinstein, L
1999-07-01
The Medical Library Center of New York (MLCNY), a medical library consortium founded in 1959, has specialized in supporting resource sharing and fostering technological advances. In 1961, MLCNY developed and continues to maintain the Union Catalog of Medical Periodicals (UCMP), a resource tool including detailed data about the collections of more than 720 medical library participants. UCMP was one of the first library tools to capitalize on the benefits of computer technology and, from the beginning, invited hospital libraries to play a substantial role in its development. UCMP, beginning with products in print and later in microfiche, helped to create a new resource sharing environment. Today, UCMP continues to capitalize on new technology by providing access via the Internet and an Oracle-based search system providing subscribers with the benefits of: a database that contains serial holdings information on an issue specific level, a database that can be updated in real time, a system that provides multi-type searching and allows users to define how the results will be sorted, and an ordering function that can more precisely target libraries that have a specific issue of a medical journal. Current development of a Web-based system will ensure that UCMP continues to provide cost effective and efficient resource sharing in future years.
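The issue-level holdings data described above is what makes precise targeting possible: instead of sending an interlibrary-loan request to every subscriber of a title, the system can return only the libraries reporting the exact issue. A minimal sketch of that lookup (library names, title, and holdings are invented):

```python
def libraries_holding(holdings, journal, volume, issue):
    """holdings: {library: {journal: set of (volume, issue) tuples}}.
    Returns the libraries that report holding the exact issue, so a
    request can be routed only to them."""
    return sorted(lib for lib, titles in holdings.items()
                  if (volume, issue) in titles.get(journal, set()))

# Toy union catalog with issue-specific holdings for three libraries.
holdings = {
    "Hospital A": {"Bull Med Libr Assoc": {(87, 3), (87, 4)}},
    "Hospital B": {"Bull Med Libr Assoc": {(87, 4)}},
    "Hospital C": {"JAMA": {(281, 1)}},
}
print(libraries_holding(holdings, "Bull Med Libr Assoc", 87, 3))
```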
UCMP and the Internet help hospital libraries share resources.
Dempsey, R; Weinstein, L
1999-01-01
The Medical Library Center of New York (MLCNY), a medical library consortium founded in 1959, has specialized in supporting resource sharing and fostering technological advances. In 1961, MLCNY developed and continues to maintain the Union Catalog of Medical Periodicals (UCMP), a resource tool including detailed data about the collections of more than 720 medical library participants. UCMP was one of the first library tools to capitalize on the benefits of computer technology and, from the beginning, invited hospital libraries to play a substantial role in its development. UCMP, beginning with products in print and later in microfiche, helped to create a new resource sharing environment. Today, UCMP continues to capitalize on new technology by providing access via the Internet and an Oracle-based search system providing subscribers with the benefits of: a database that contains serial holdings information on an issue specific level, a database that can be updated in real time, a system that provides multi-type searching and allows users to define how the results will be sorted, and an ordering function that can more precisely target libraries that have a specific issue of a medical journal. Current development of a Web-based system will ensure that UCMP continues to provide cost effective and efficient resource sharing in future years. PMID:10427426
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blackwell, David D.; Chickering Pace, Cathy; Richards, Maria C.
The National Geothermal Data System (NGDS) is a Department of Energy funded effort to create a single cataloged source for a variety of geothermal information through a distributed network of databases made available via web services. The NGDS will help identify regions suitable for potential development and further scientific data collection and analysis of geothermal resources as a source for clean, renewable energy. A key NGDS repository or ‘node’ is located at Southern Methodist University, developed by a consortium made up of:
• SMU Geothermal Laboratory
• Siemens Corporate Technology, a division of Siemens Corporation
• Bureau of Economic Geology at the University of Texas at Austin
• Cornell Energy Institute, Cornell University
• Geothermal Resources Council
• MLKay Technologies
• Texas Tech University
• University of North Dakota
The focus of resources and research encompasses the United States with particular emphasis on the Gulf Coast (on and off shore), the Great Plains, and the Eastern U.S. The data collection includes the thermal, geological and geophysical characteristics of these area resources. Types of data include, but are not limited to, temperature, heat flow, thermal conductivity, radiogenic heat production, porosity, permeability, geological structure, core geophysical logs, well tests, estimated reservoir volume, in situ stress, oil and gas well fluid chemistry, oil and gas well information, and conventional and enhanced geothermal system related resources. Libraries of publications and reports are combined into a unified, accessible catalog with links for downloading non-copyrighted items. Field notes, individual temperature logs, site maps and related resources are included to increase data collection knowledge. Additional research based on legacy data to improve quality increases our understanding of the local and regional geology and geothermal characteristics.
The software to enable the integration, analysis, and dissemination of this team’s NGDS contributions was developed by Siemens Corporate Technology. The SMU Node interactive application is accessible at http://geothermal.smu.edu. Additionally, files may be downloaded from either http://geothermal.smu.edu:9000/geoserver/web/ or through http://geothermal.smu.edu/static/DownloadFilesButtonPage.htm. The Geothermal Resources Council Library is available at https://www.geothermal-library.org/.
A Boon for the Architect Engineer
NASA Technical Reports Server (NTRS)
1992-01-01
Langley Research Center's need for an improved construction specification system led to an automated system called SPECSINTACT. A catalog of specifications, the system enables designers to retrieve relevant sections from computer storage and modify them as needed. SPECSINTACT has also been adopted by government agencies. The system is an integral part of the Construction Criteria Base (CCB), a single disc containing design and construction information for 10 government agencies including the American Institute of Architects' MASTERSPEC. CCB employs CD-ROM technologies and is available from the National Institute of Building Sciences. Users report substantial savings in time and productivity.
2015-08-05
A laboratory-created "chemical garden" made of a combination of black iron sulfide and orange iron hydroxide/oxide is shown in this photo. "Chemical garden" is a nickname for the chimney-like structures that form at bubbling vents on the seafloor. Some researchers think that life may have originated at structures like these billions of years ago. JPL's research team is part of the Icy Worlds team of the NASA Astrobiology Institute, based at NASA's Ames Research Center in Moffett Field, California. JPL is managed by the California Institute of Technology in Pasadena for NASA. http://photojournal.jpl.nasa.gov/catalog/PIA19835
VizieR Online Data Catalog: Stellar mass of brightest cluster galaxies (Bellstedt+, 2016)
NASA Astrophysics Data System (ADS)
Bellstedt, S.; Lidman, C.; Muzzin, A.; Franx, M.; Guatelli, S.; Hill, A. R.; Hoekstra, H.; Kurinsky, N.; Labbe, I.; Marchesini, D.; Marsan, Z. C.; Safavi-Naeini, M.; Sifon, C.; Stefanon, M.; van de Sande, J.; van Dokkum, P.; Weigel, C.
2017-11-01
We utilize a sample of 98 newly imaged galaxy clusters from the RELICS (REd Lens Infrared Cluster Survey) survey within this study. The data were collected during six observing runs on three instruments over a period spanning from 2013 October to 2015 March. The instruments utilized were the SofI camera on the New Technology Telescope at the European Southern Observatory (ESO) La Silla Observatory in Chile, WHIRC on the WIYN telescope at the Kitt Peak National Observatory and LIRIS on the William Herschel Telescope (WHT) in La Palma, Spain. (2 data files).
Sirianni, Nicky M; Yuan, Huijun; Rice, John E; Kaufman, Ronit S; Deng, John; Fulton, Chandler; Wangh, Lawrence J
2016-11-01
Here, we present a new approach for increasing the rate and lowering the cost of identifying, cataloging, and monitoring global biodiversity. These advances, which we call Closed-Tube Barcoding, are one application of a suite of proven PCR-based technologies invented in our laboratory. Closed-Tube Barcoding builds on and aims to enhance the profoundly important efforts of the International Barcode of Life initiative. Closed-Tube Barcoding promises to be particularly useful when large numbers of small or rare specimens need to be screened and characterized at an affordable price. This approach is also well suited for automation and for use in portable devices.
High Data Rate Architecture (HiDRA)
NASA Technical Reports Server (NTRS)
Hylton, Alan; Raible, Daniel
2016-01-01
One of the greatest challenges in developing new space technology is in navigating the transition from ground-based laboratory demonstration at Technology Readiness Level 6 (TRL-6) to conducting a prototype demonstration in space (TRL-7). This challenge is compounded by the relatively low availability of new spacecraft missions when compared with aeronautical craft to bridge this gap, leading to the general adoption of a low-risk stance by mission management to accept new, unproven technologies into the system. Also in consideration of risk, the limited selection and availability of proven space-grade components imparts a severe limitation on achieving high performance systems by current terrestrial technology standards. Finally from a space communications point of view the long duration characteristic of most missions imparts a major constraint on the entire space and ground network architecture, since any new technologies introduced into the system would have to be compliant with the duration of the currently deployed operational technologies, and in some cases may be limited by surrounding legacy capabilities. Beyond ensuring that the new technology is verified to function correctly and validated to meet the needs of the end users the formidable challenge then grows to additionally include: carefully timing the maturity path of the new technology to coincide with a feasible and accepting future mission so it flies before its relevancy has passed, utilizing a limited catalog of available components to their maximum potential to create meaningful and unprecedented new capabilities, designing and ensuring interoperability with aging space and ground infrastructures while simultaneously providing a growth path to the future. The International Space Station (ISS) is approaching 20 years of age. To keep the ISS relevant, technology upgrades are continuously taking place.
Regarding communications, the state-of-the-art communication system upgrades underway include high-rate laser terminals. These must interface with the existing, aging data infrastructure. The High Data Rate Architecture (HiDRA) project is designed to provide networked store, carry, and forward capability to optimize data flow through both the existing radio frequency (RF) and new laser communications terminal. The networking capability is realized through the Delay Tolerant Networking (DTN) protocol, and is used for scheduling data movement as well as optimizing the performance of existing RF channels. HiDRA is realized as a distributed FPGA memory and interface controller that is itself controlled by a local computer running DTN software. Thus HiDRA is applicable to other arenas seeking to employ next-generation communications technologies, e.g. deep space. In this paper, we describe HiDRA and its far-reaching research implications.
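The store, carry, and forward capability that DTN provides to HiDRA can be sketched in a few lines. This toy node (all names are hypothetical and bear no relation to the real HiDRA or DTN implementations) buffers bundles while no contact is available and flushes them when a link opens:

```python
from collections import deque

class StoreAndForwardNode:
    """Toy store-carry-forward node: buffers bundles until a contact window opens."""
    def __init__(self):
        self.buffer = deque()   # bundles held while the link is down
        self.delivered = []     # bundles that reached the next hop
        self.link_up = False

    def send(self, bundle: str) -> None:
        if self.link_up:
            self.delivered.append(bundle)
        else:
            self.buffer.append(bundle)  # store and carry until the next contact

    def contact_opens(self) -> None:
        self.link_up = True
        while self.buffer:              # forward everything buffered, in order
            self.delivered.append(self.buffer.popleft())

node = StoreAndForwardNode()
node.send("telemetry-1")   # no link yet: buffered
node.send("telemetry-2")
node.contact_opens()       # link available: buffered bundles forwarded
node.send("telemetry-3")   # link up: delivered immediately
```

The real system adds scheduling of contact windows and channel-rate optimization on top of this basic buffering discipline, but the ordering guarantee shown here is the essence of delay-tolerant delivery.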
ISAIA: Interoperable Systems for Archival Information Access
NASA Technical Reports Server (NTRS)
Hanisch, Robert J.
2002-01-01
The ISAIA project was originally proposed in 1999 as a successor to the informal AstroBrowse project. AstroBrowse, which provided a data location service for astronomical archives and catalogs, was a first step toward data system integration and interoperability. The goals of ISAIA were ambitious: '...To develop an interdisciplinary data location and integration service for space science. Building upon existing data services and communications protocols, this service will allow users to transparently query hundreds or thousands of WWW-based resources (catalogs, data, computational resources, bibliographic references, etc.) from a single interface. The service will collect responses from various resources and integrate them in a seamless fashion for display and manipulation by the user.' Funding was approved only for a one-year pilot study, a decision that in retrospect was wise given the rapid changes in information technology in the past few years and the emergence of the Virtual Observatory initiatives in the US and worldwide. Indeed, the ISAIA pilot study was influential in shaping the science goals, system design, metadata standards, and technology choices for the virtual observatory. The ISAIA pilot project also helped to cement working relationships among the NASA data centers, US ground-based observatories, and international data centers. The ISAIA project was formed as a collaborative effort between thirteen institutions that provided data to astronomers, space physicists, and planetary scientists. Among the fruits we ultimately hoped would come from this project would be a central site on the Web that any space scientist could use to efficiently locate existing data relevant to a particular scientific question. Furthermore, we hoped that the needed technology would be general enough that smaller, more-focused communities within space science could use the same technologies and standards to provide more specialized services.
A major challenge to searching for data across a broad community is that information that describes some data products is either not relevant to other data or not applicable in the same way. Some previous metadata standard development efforts (e.g., in the earth science and library communities) have produced standards that are very large and difficult to support. To address this problem, we studied how a standard may be divided into separable pieces. Data providers that wish to participate in interoperable searches can support only those parts of the standard that are relevant to them. We prototyped a top-level metadata standard that was small and applicable to all space science data.
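The fan-out query the ISAIA pilot envisioned, sending one request to many resources and merging the responses for the user, can be sketched as follows. The stub resource functions are invented stand-ins for real archive services, which would be queried over HTTP:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub "resources"; a real service would issue network queries to archives.
def query_catalog(target):
    return [f"catalog record for {target}"]

def query_bibliography(target):
    return [f"paper mentioning {target}"]

resources = [query_catalog, query_bibliography]

def federated_search(target):
    """Query every registered resource in parallel and merge the responses."""
    with ThreadPoolExecutor() as pool:
        # pool.map preserves the order of `resources` in its results.
        result_lists = pool.map(lambda resource: resource(target), resources)
    merged = []
    for results in result_lists:
        merged.extend(results)
    return merged

hits = federated_search("M31")
```

Running the queries concurrently matters when "hundreds or thousands" of resources are involved: total latency is bounded by the slowest resource rather than the sum of all of them.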
Catalog of Research Abstracts, 1993: Partnership opportunities at Lawrence Berkeley Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1993-09-01
The 1993 edition of Lawrence Berkeley Laboratory's Catalog of Research Abstracts is a comprehensive listing of ongoing research projects in LBL's ten research divisions. Lawrence Berkeley Laboratory (LBL) is a major multi-program national laboratory managed by the University of California for the US Department of Energy (DOE). LBL has more than 3000 employees, including over 1000 scientists and engineers. With an annual budget of approximately $250 million, LBL conducts a wide range of research activities, many that address the long-term needs of American industry and have the potential for a positive impact on US competitiveness. LBL actively seeks to share its expertise with the private sector to increase US competitiveness in world markets. LBL has transferable expertise in conservation and renewable energy, environmental remediation, materials sciences, computing sciences, and biotechnology, which includes fundamental genetic research and nuclear medicine. This catalog gives an excellent overview of LBL's expertise, and is a good resource for those seeking partnerships with national laboratories. Such partnerships allow private enterprise access to the exceptional scientific and engineering capabilities of the federal laboratory systems. Such arrangements also leverage the research and development resources of the private partner. Most importantly, they are a means of accessing the cutting-edge technologies and innovations being discovered every day in our federal laboratories.
NASA Astrophysics Data System (ADS)
Pakuliak, L. K.; Andruk, V. M.; Golovnia, V. V.; Shatokhina, S. V.; Yizhakevych, O. M.; Ivanov, G. A.; Yatsenko, A. I.; Sergeeva, T. P.
The almost 40-year history of the FON project ended with the creation of a whole northern sky catalog of objects down to B ≤ 16.5m. The idea of 4-fold overlapping of the northern sky with 6 wide-field astrographs has not been realized in full. For historical reasons it was transformed into the 2-fold overlapping observational program of MAO NAS of Ukraine, resulting in three versions of the multimillion catalog of positions, proper motions, and B-magnitudes of stars. The first version of 1.2 million stars was finished before the 2000s and is based on the AC object list. The measurements of plates were made by the automatic measuring complex PARSEC, specially developed for massive photographic reviews. As the input list was limited to AC objects, most of the stars on the FON plates remained unmeasured. Principles of workflow organization from this work formed the basis for the further development of the project using the latest IT technologies. For the creation of the second and the third versions of the catalog, the list of objects was obtained as a result of total digitizing of plates and their image processing. The final third version contains 19.5 million stars and galaxies with the maximum accuracy possible for photographic astrometry. The collections of plates obtained at the other observatories participating in the project are partially preserved and can be used for the same astrometric tasks.
NASA Astrophysics Data System (ADS)
Petigura, Erik A.; Howard, Andrew W.; Marcy, Geoffrey W.; Johnson, John Asher; Isaacson, Howard; Cargile, Phillip A.; Hebb, Leslie; Fulton, Benjamin J.; Weiss, Lauren M.; Morton, Timothy D.; Winn, Joshua N.; Rogers, Leslie A.; Sinukoff, Evan; Hirsch, Lea A.; Crossfield, Ian J. M.
2017-09-01
The California-Kepler Survey (CKS) is an observational program developed to improve our knowledge of the properties of stars found to host transiting planets by NASA’s Kepler Mission. The improvement stems from new high-resolution optical spectra obtained using HIRES at the W. M. Keck Observatory. The CKS stellar sample comprises 1305 stars classified as Kepler objects of interest, hosting a total of 2075 transiting planets. The primary sample is magnitude-limited (Kp < 14.2) and contains 960 stars with 1385 planets. The sample was extended to include some fainter stars that host multiple planets, ultra-short period planets, or habitable zone planets. The spectroscopic parameters were determined with two different codes, one based on template matching and the other on direct spectral synthesis using radiative transfer. We demonstrate a precision of 60 K in T_eff, 0.10 dex in log g, 0.04 dex in [Fe/H], and 1.0 km/s in v sin i. In this paper, we describe the CKS project and present a uniform catalog of spectroscopic parameters. Subsequent papers in this series present catalogs of derived stellar properties such as mass, radius, and age; revised planet properties; and statistical explorations of the ensemble. CKS is the largest survey to determine the properties of Kepler stars using a uniform set of high-resolution, high signal-to-noise ratio spectra. The HIRES spectra are available to the community for independent analyses. Based on observations obtained at the W. M. Keck Observatory, which is operated jointly by the University of California and the California Institute of Technology. Keck time was granted for this project by the University of California, the California Institute of Technology, the University of Hawaii, and NASA.
The National Geological and Geophysical Data Preservation Program
NASA Astrophysics Data System (ADS)
Dickinson, T. L.; Steinmetz, J. C.; Gundersen, L. C.; Pierce, B. S.
2006-12-01
The ability to preserve and maintain geoscience data and collections has not kept pace with the growing need for accessible digital information and the technology to make it so. The Nation has lost valuable and unique geologic records and is in danger of losing much more. Many federal and state geological repositories are currently at their capacity for maintaining and storing data or samples. Some repositories are gaining additional, but temporary and substandard space, using transport containers or offsite warehouses where access is limited and storage conditions are poor. Over the past several years, there has been an increasing focus on the state of scientific collections in the United States. For example, the National Geological and Geophysical Data Preservation Program (NGGDPP) Act was passed as part of the Energy Policy Act of 2005, authorizing $30 million in funding for each of five years. The Act directs the U.S. Geological Survey to administer this program that includes a National Digital Catalog and Federal assistance to support our nation's repositories. Implementation of the Program awaits federal appropriations. The NGGDPP is envisioned as a national network of cooperating geoscience materials and data repositories that are operated independently yet guided by unified standards, procedures, and protocols for metadata. The holdings will be widely accessible through a common and mirrored Internet-based catalog (National Digital Catalog). The National Digital Catalog will tie the observations and analyses to the physical materials they come from. Our Nation's geological and geophysical data are invaluable and in some instances irreplaceable due to the destruction of outcrops, urbanization and restricted access. These data will enable the next generation of scientific research and education, enable more effective and efficient research, and may have future economic benefits through the discovery of new oil and gas accumulations, and mineral deposits.
Buchanan, Carrie C; Torstenson, Eric S; Bush, William S
2012-01-01
Background: Since publication of the human genome in 2003, geneticists have been interested in risk variant associations to resolve the etiology of traits and complex diseases. The International HapMap Consortium undertook an effort to catalog all common variation across the genome (variants with a minor allele frequency (MAF) of at least 5% in one or more ethnic groups). HapMap along with advances in genotyping technology led to genome-wide association studies which have identified common variants associated with many traits and diseases. In 2008 the 1000 Genomes Project aimed to sequence 2500 individuals and identify rare variants and 99% of variants with a MAF of <1%.
Methods: To determine whether the 1000 Genomes Project includes all the variants in HapMap, we examined the overlap between single nucleotide polymorphisms (SNPs) genotyped in the two resources using merged phase II/III HapMap data and low coverage pilot data from 1000 Genomes.
Results: Comparison of the two data sets showed that approximately 72% of HapMap SNPs were also found in 1000 Genomes Project pilot data. After filtering out HapMap variants with a MAF of <5% (separately for each population), 99% of HapMap SNPs were found in 1000 Genomes data.
Conclusions: Not all variants cataloged in HapMap are also cataloged in 1000 Genomes. This could affect decisions about which resource to use for SNP queries, rare variant validation, or imputation. Both the HapMap and 1000 Genomes Project databases are useful resources for human genetics, but it is important to understand the assumptions made and filtering strategies employed by these projects. PMID:22319179
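The overlap statistic described in the Methods and Results can be reproduced in miniature. The SNP identifiers and allele frequencies below are made up, but the filtering logic mirrors the comparison the study performs:

```python
# Toy version of the HapMap vs. 1000 Genomes comparison: fraction of HapMap
# SNPs present in 1000 Genomes, before and after dropping MAF < 5% variants.
hapmap = {            # rsID -> minor allele frequency in HapMap (invented)
    "rs1": 0.30, "rs2": 0.02, "rs3": 0.15, "rs4": 0.01, "rs5": 0.40,
}
kilo_genomes = {"rs1", "rs3", "rs5"}   # SNPs called in the 1000 Genomes pilot

def overlap_fraction(snps, other):
    """Fraction of `snps` that also appear in the set `other`."""
    return len(set(snps) & other) / len(snps)

raw = overlap_fraction(hapmap, kilo_genomes)          # all HapMap SNPs
common = {rs for rs, maf in hapmap.items() if maf >= 0.05}
filtered = overlap_fraction(common, kilo_genomes)     # MAF >= 5% only
```

In this toy data the raw overlap is 60% but rises to 100% after the MAF filter, the same qualitative pattern (72% → 99%) the study reports: the SNPs missing from the pilot data are disproportionately rare variants.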
Integrated Seismological Network of Brazil: Key developments in technology.
NASA Astrophysics Data System (ADS)
Pirchiner, Marlon; Assumpção, Marcelo; Ferreira, Joaquim; França, George
2010-05-01
The Integrated Seismological Network of Brazil - BRASIS - will integrate the main Brazilian seismology groups in an extensive permanent broadband network with a (near) real-time acquisition system and automatic preliminary processing of epicenters and magnitudes. About 60 stations will cover the whole country to continuously monitor the seismic activity. Most stations will be operating by the end of 2010. Data will be received from remote stations at each research group and redistributed to all other groups. In addition to issuing a national catalog of earthquakes, each institution will make its own analysis and issue their own bulletins taking into account local and regional lithospheric structure. We chose the SEED format, seedlink and SeisComP as standard data format, exchange protocol and software framework for the network management, respectively. All different existing equipment (e.g., Guralp/Scream, Geotech/CD1.1, RefTek/RTP, Quanterra/seedlink) will be integrated into the same system. We plan to provide: 1) improved station management through remote control, and indexes for quality control of acquired data, sending alerts to operators in critical cases. 2) automatic processing: picking, location with local and regional models and determination of magnitudes, issuing bulletins and alerts. 3) maintenance of an earthquake catalog and a waveform database. 4) query tools and access to metadata, catalogs and waveforms available to all researchers. In addition, the catalog of earthquakes and other seismological data will be available as layers in a Spatial Data Infrastructure with open source standards, providing new analysis capabilities to all geoscientists. Seiscomp3 has already been installed in two centers (UFRN and USP) with successful tests of Q330, Guralp, RefTek and Geotech plug-ins to the seedlink protocol. We will discuss the main difficulties of our project and the solutions adopted to improve the Brazilian seismological infrastructure.
The Solar Rotation in the 1930s from the Sunspot and Flocculi Catalogs of the Ebro Observatory
NASA Astrophysics Data System (ADS)
de Paula, V.; Curto, J. J.; Casas, R.
2016-10-01
The tables of sunspot and flocculi heliographic positions included in the catalogs published by the Ebro Observatory in the 1930s have recently been recovered and converted into digital format by using optical character recognition (OCR) technology. Here we analyzed these data by computing the angular velocity of several sunspot and flocculi groups. A difference was found in the rotational velocity for sunspots and flocculi groups at high latitudes, and we also detected an asymmetry between the northern and southern hemispheres, which is especially marked for the flocculi groups. The results were then fitted with a differential-rotation law [ω = a + b sin²B] to compare the data obtained with the results published by other authors. A dependence on the latitude that is consistent with former studies was found. Finally, we studied the possible relationship between the sunspot/flocculi group areas and their corresponding angular velocity. There are strong indications that the rotational velocity of a sunspot/flocculi group is reduced (in relation to the differential rotation law) when its maximum area is larger.
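Because the differential-rotation law ω = a + b sin²B is linear in the parameters a and b, the fit described above reduces to ordinary linear least squares. A sketch on synthetic data follows; the latitudes and coefficient values are illustrative placeholders, not the Ebro measurements:

```python
import numpy as np

# Synthetic (latitude, angular velocity) data obeying omega = a + b*sin^2(B).
B = np.radians(np.array([0.0, 10.0, 20.0, 30.0]))   # heliographic latitudes
a_true, b_true = 14.5, -2.8                         # deg/day, typical magnitudes
omega = a_true + b_true * np.sin(B) ** 2            # noise-free synthetic rates

# Design matrix [1, sin^2(B)]: omega is linear in the unknowns (a, b).
design = np.column_stack([np.ones_like(B), np.sin(B) ** 2])
(a_fit, b_fit), *_ = np.linalg.lstsq(design, omega, rcond=None)
```

With real catalog data each group contributes one (B, ω) pair, and the recovered b < 0 expresses that rotation slows toward higher latitudes.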
Department of Energy: Nuclear S&T workforce development programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingham, Michelle; Bala, Marsha; Beierschmitt, Kelly
The U.S. Department of Energy (DOE) national laboratories use their expertise in nuclear science and technology (S&T) to support a robust national nuclear S&T enterprise from the ground up. Traditional academic programs do not provide all the elements necessary to develop this expertise, so the DOE has initiated a number of supplemental programs to develop and support the nuclear S&T workforce pipeline. This document catalogs existing workforce development programs that are supported by a number of DOE offices (such as the Offices of Nuclear Energy, Science, Energy Efficiency, and Environmental Management), and by the National Nuclear Security Administration (NNSA) and the Naval Reactor Program. Workforce development programs in nuclear S&T administered through the Department of Homeland Security, the Nuclear Regulatory Commission, and the Department of Defense are also included. The information about these programs, which is cataloged below, is drawn from the program websites. Some programs, such as the Minority Serving Institutes Partnership Programs (MSIPPs), are available through more than one DOE office, so they appear in more than one section of this document.
Case Study for the ARRA-funded GSHP Demonstration at University at Albany
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Xiaobing; Malhotra, Mini; Xiong, Zeyu
High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This report highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects—a distributed GSHP system at a new 500-bed apartment-style student residence hall at the University at Albany. This case study is based on the analysis of detailed design documents, measured performance data, published catalog data of heat pump equipment, and actual construction costs. Simulations with a calibrated computer model are performed for both the demonstrated GSHP system and a baseline heating, ventilation, and air-conditioning (HVAC) system to determine the energy savings and other related benefits achieved by the GSHP system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GSHP system, as well as the pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the demonstrated GSHP system compared with the baseline HVAC system. This case study also identifies opportunities for improving the operational efficiency of the demonstrated GSHP system.
2014-09-04
This final rule changes the meaningful use stage timeline and the definition of certified electronic health record technology (CEHRT) to allow options in the use of CEHRT for the EHR reporting period in 2014. It also sets the requirements for reporting on meaningful use objectives and measures as well as clinical quality measure (CQM) reporting in 2014 for providers who use one of the CEHRT options finalized in this rule for their EHR reporting period in 2014. In addition, it finalizes revisions to the Medicare and Medicaid EHR Incentive Programs to adopt an alternate measure for the Stage 2 meaningful use objective for hospitals to provide structured electronic laboratory results to ambulatory providers; to correct the regulation text for the measures associated with the objective for hospitals to provide patients the ability to view online, download, and transmit information about a hospital admission; and to set a case number threshold exemption for CQM reporting applicable for eligible hospitals and critical access hospitals (CAHs) beginning with FY 2013. Finally, this rule finalizes the provisionally adopted replacement of the Data Element Catalog (DEC) and the Quality Reporting Document Architecture (QRDA) Category III standards with updated versions of these standards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valassi, A.; Clemencic, M.; Dykstra, D.
The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.
1983-08-01
[OCR residue from a standard report documentation page; legible fields: Document D-40; author Donald Hornbeck, EG&G Almond Instruments, Working Group Chairman; work conducted under a 1979 contract.]
Assessing user preferences for e-readers and tablets.
Le Ber, Jeanne M; Lombardo, Nancy T; Honisett, Amy; Jones, Peter Stevens; Weber, Alice
2013-01-01
Librarians purchased 12 e-readers and six tablets to provide patrons the opportunity to experiment with the latest mobile technologies. After several train-the-trainer sessions, librarians shared device information with the broader health sciences community. Devices were cataloged and made available for a two-week checkout. A limited number of books and applications (apps) were preloaded for all the devices, and patrons were allowed to download their own content. Each tablet has Google Books, iBooks, Kindle, and Nook apps available to allow choice in reading e-books. Upon return, patrons were asked to complete a ten-question survey to determine preferences for device use.
NASA Astrophysics Data System (ADS)
Hayes, Brian
1994-12-01
Gleaning further clues to the structure of the universe will require larger data samples. To that end, a major new survey of the skies, called the Sloan Digital Sky Survey (SDSS), is in preparation. It will catalog some 50 million galaxies and about 70 million stars. A new 2.5 meter telescope to be erected at Apache Point Observatory in New Mexico will be dedicated to the survey. The telescope, however, is not the key innovation that will make the survey possible. The crucial factor is the technology for digitally recording large numbers of images and spectra and for automating the analysis, recognition, and classification of those images and spectra. The methods to be used are discussed.
Distributed Information Search and Retrieval for Astronomical Resource Discovery and Data Mining
NASA Astrophysics Data System (ADS)
Murtagh, Fionn; Guillaume, Damien
Information search and retrieval has become by nature a distributed task. We look at tools and techniques which are of importance in this area. Current technological evolution can be summarized as the growing stability and cohesiveness of distributed architectures of searchable objects. The objects themselves are more often than not multimedia, including published articles or grey-literature reports, yellow-page services, image data, catalogs, presentation and online display materials, and "operations" information such as scheduling and publicly accessible proposal information. The evolution towards distributed architectures, protocols and formats, and the direction of our own work, are the focus of this paper.
Optical Counterpart to MAXI J1647-227
NASA Astrophysics Data System (ADS)
Garnavich, P.; Magno, K.; Applegate, A.
2012-06-01
We observed the field of the X-ray transient MAXI J1647-227 (Negoro et al., ATEL#4175) with the Vatican Advanced Technology Telescope (VATT) and VATT4K CCD imager beginning June 16.244 UT. R-band images reveal an optical source near the position of the Swift localization (Kennea et al., ATEL#4178) that is not visible on the Digitized Sky Survey. Based on USNO-B1.0 catalog stars in the field, we find the optical transient has a position of 16:48:12.32 -23:00:53.56 (error of 0.2 arcsec) which is within 2 arcsec of the Swift X-ray position.
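The quoted 2-arcsecond agreement between the optical and X-ray positions is a standard angular-separation check. A minimal Python sketch (only the transient coordinates come from the text; the comparison position below is hypothetical, offset by 2 arcseconds in declination for illustration):

```python
import math

def angular_separation(ra1_deg, dec1_deg, ra2_deg, dec2_deg):
    """Angular separation between two sky positions, in arcseconds.

    Uses the haversine formula, which is numerically stable for the
    small separations typical of counterpart matching.
    """
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1_deg, dec1_deg, ra2_deg, dec2_deg))
    sin_ddec = math.sin((dec2 - dec1) / 2.0)
    sin_dra = math.sin((ra2 - ra1) / 2.0)
    a = sin_ddec ** 2 + math.cos(dec1) * math.cos(dec2) * sin_dra ** 2
    return math.degrees(2.0 * math.asin(math.sqrt(a))) * 3600.0

def hms_to_deg(h, m, s):
    """Right ascension hours:minutes:seconds -> degrees."""
    return (h + m / 60.0 + s / 3600.0) * 15.0

def dms_to_deg(d, m, s):
    """Declination degrees:arcmin:arcsec -> degrees (d carries the sign)."""
    sign = -1.0 if d < 0 else 1.0
    return sign * (abs(d) + m / 60.0 + s / 3600.0)

# Optical transient position from the text; the comparison position is
# hypothetical, shifted 2" in declination.
ra_opt = hms_to_deg(16, 48, 12.32)
dec_opt = dms_to_deg(-23, 0, 53.56)
sep = angular_separation(ra_opt, dec_opt, ra_opt, dec_opt + 2.0 / 3600.0)
```

For pure-declination offsets the haversine reduces to the declination difference, so `sep` comes out as 2.0 arcseconds exactly.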
VizieR Online Data Catalog: Ionized gas in E/S0 galaxies with dust lanes (Finkelman+, 2010)
NASA Astrophysics Data System (ADS)
Finkelman, I.; Brosch, N.; Funes, J. G.; Kniazev, A. Y.; Vaisanen, P.
2011-01-01
We present broad-band U, B, V, R, I and Hα narrow-band observations of 30 galaxies acquired at the Mt. Graham International Observatory (MGIO), the South African Astronomical Observatory (SAAO) and the Wise Observatory (WO). Observations with the 1.8-m Vatican Advanced Technology Telescope (VATT) at the MGIO were performed in two runs in 2004 June and 2006 May using the Loral CCD at the aplanatic f/9 Gregorian focus. A second set of observations was performed at the SAAO 1.9-m telescope in Sutherland, South Africa, in 2008 November. (5 data files).
Integrating Top-down and Bottom-up Cybersecurity Guidance using XML
Lubell, Joshua
2016-01-01
This paper describes a markup-based approach for synthesizing disparate information sources and discusses a software implementation of the approach. The implementation makes it easier for people to use two complementary, but differently structured, guidance specifications together: the (top-down) Cybersecurity Framework and the (bottom-up) National Institute of Standards and Technology Special Publication 800-53 security control catalog. An example scenario demonstrates how the software implementation can help a security professional select the appropriate safeguards for restricting unauthorized access to an Industrial Control System. The implementation and example show the benefits of this approach and suggest its potential application to disciplines other than cybersecurity. PMID:27795810
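As a rough illustration of the markup-based approach, mappings between (top-down) Framework subcategories and (bottom-up) SP 800-53 controls can be expressed in XML and queried programmatically. The sketch below uses an invented toy schema; it does not follow the actual structures of the Cybersecurity Framework, SP 800-53, or the paper's implementation:

```python
import xml.etree.ElementTree as ET

# Toy mapping document: element and attribute names are invented for
# illustration and are not the real CSF or SP 800-53 schemas.
MAPPING_XML = """
<mapping>
  <subcategory id="PR.AC-4" name="Access permissions are managed">
    <control id="AC-3"/>
    <control id="AC-6"/>
  </subcategory>
  <subcategory id="PR.PT-3" name="Least functionality">
    <control id="CM-7"/>
  </subcategory>
</mapping>
"""

def controls_for(subcategory_id, xml_text=MAPPING_XML):
    """Return the control IDs mapped to a given Framework subcategory."""
    root = ET.fromstring(xml_text)
    for sub in root.findall("subcategory"):
        if sub.get("id") == subcategory_id:
            return [c.get("id") for c in sub.findall("control")]
    return []
```

A security professional could then start from a Framework subcategory and retrieve candidate safeguards, e.g. `controls_for("PR.AC-4")`.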
Software to Manage the Unmanageable
NASA Technical Reports Server (NTRS)
2005-01-01
In 1995, NASA's Jet Propulsion Laboratory (JPL) contracted Redmond, Washington-based Lucidoc Corporation to design a technology infrastructure automating the intersection between policy management and operations management, with advanced software that automates document workflow, document status, and uniformity of document layout. JPL had very specific parameters for the software: it expected to store and catalog over 8,000 technical and procedural documents integrated with hundreds of processes. The project ended in 2000, but NASA still uses the resulting highly secure document management system, and Lucidoc has gone on to help other organizations, large and small, integrate document flow and operations management to ensure a compliance-ready culture.
Mars 2020 MOXIE Laboratory and Principal Investigator
2016-07-15
One investigation on NASA's Mars 2020 rover will extract oxygen from the Martian atmosphere. It is called MOXIE, for Mars Oxygen In-Situ Resource Utilization Experiment. In this image, MOXIE Principal Investigator Michael Hecht, of the Massachusetts Institute of Technology, Cambridge, is in the MOXIE development laboratory at NASA's Jet Propulsion Laboratory, Pasadena, California. Mars' atmosphere is mostly carbon dioxide. Demonstration of the capability for extracting oxygen from it, under Martian environmental conditions, will be a pioneering step toward how humans on Mars will use the Red Planet's natural resources. Oxygen can be used in the rocket http://photojournal.jpl.nasa.gov/catalog/PIA20761
The Marine Physical Laboratory Multi-Disciplinary Ocean Science and Technology Program
1989-10-01
[OCR-damaged text and data table. Recoverable details: two specific objectives were planned for the next year, the first being the updating and completion of a world-wide catalog of fish sounds (the second is illegible); a description noting that the walls of Cauldron Crater consist of nearly vertical cliffs separated by younger ponds of cream-colored calcareous ooze; and a major-oxide geochemistry table (SiO2, TiO2, Al2O3, ...).]
OpenSearch technology for geospatial resources discovery
NASA Astrophysics Data System (ADS)
Papeschi, Fabrizio; Boldrini, Enrico; Mazzetti, Paolo
2010-05-01
In 2005, the term Web 2.0 was coined by Tim O'Reilly to describe a quickly growing set of Web-based applications that share a common philosophy of "mutually maximizing collective intelligence and added value for each participant by formalized and dynamic information sharing". Around the same period, OpenSearch, a new Web 2.0 technology, was developed. More properly, OpenSearch is a collection of technologies that allow publishing of search results in a format suitable for syndication and aggregation; it is a way for websites and search engines to publish search results in a standard, accessible format. Due to its strong impact on the way the Web is perceived by users, and also due to its relevance for businesses, Web 2.0 has attracted the attention of both mass media and the scientific community. The explosive growth in popularity of Web 2.0 technologies like OpenSearch, together with practical applications of Service Oriented Architecture (SOA), has resulted in an increased interest in the similarities, convergence, and potential synergy of these two concepts. SOA can be considered the philosophy of encapsulating application logic in services with a uniformly defined interface and making these publicly available via discovery mechanisms. Service consumers may then retrieve these services, compose them, and use them according to their current needs. The great degree of similarity between SOA and Web 2.0 may be leading to a convergence of the two paradigms. They also expose divergent elements, such as Web 2.0's support for human interaction as opposed to the typical SOA machine-to-machine interaction. Following these considerations, the Geospatial Information (GI) domain is also taking its first steps towards a new approach to data publishing and discovery, in particular by taking advantage of the OpenSearch technology.
A specific GI niche is represented by the OGC Catalog Service for the Web (CSW), part of the OGC Web Services (OWS) specification suite, which provides a set of services for discovery, access, and processing of geospatial resources in a SOA framework. GI-cat is a distributed CSW framework implementation developed by the ESSI Lab of the Italian National Research Council (CNR-IMAA) and the University of Florence. It provides brokering and mediation functionalities towards heterogeneous resources and inventories, exposing several standard interfaces for query distribution. This work focuses on a new GI-cat interface which allows the catalog to be queried according to the OpenSearch syntax specification, thus filling the gap between the SOA architectural design of the CSW and Web 2.0. At the moment there is no OGC standard specification on this topic, but an official change request has been proposed in order to enable OGC catalogs to support OpenSearch queries. In this change request, an OpenSearch extension is proposed, providing a standard mechanism to query a resource based on temporal and geographic extents. Two new catalog operations are also proposed, in order to publish a suitable OpenSearch interface. This extended interface is implemented by the modular GI-cat architecture by adding a new profiling module called "OpenSearch profiler". Since GI-cat also acts as a clearinghouse catalog, another component called "OpenSearch accessor" is added in order to access OpenSearch-compliant services. An important role in the GI-cat extension is played by the adopted mapping strategy. Two different kinds of mapping are required: query mapping and response-element mapping. Query mapping is provided in order to fit the simple OpenSearch query syntax to the complex CSW query expressed in the OGC Filter syntax.
The GI-cat internal data model is based on the ISO 19115 profile, which is more complex than the simple XML syndication formats, such as RSS 2.0 and Atom 1.0, suggested by OpenSearch. Once response elements are available, they must be translated from the GI-cat internal data model to the above-mentioned syndication formats in order to be presented; the mapping processing is bidirectional. When GI-cat is used to access OpenSearch-compliant services, the CSW query must be mapped to the OpenSearch query, and the response elements must be translated according to the GI-cat internal data model. As a result of these extensions, GI-cat provides a user-friendly facade to the complex CSW interface, enabling it to be queried, for example, using a browser toolbar.
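To make the interface concrete: an OpenSearch description document advertises a URL template whose placeholders a client fills in and URL-encodes, including the {geo:box}, {time:start} and {time:end} parameters of the Geo and Time extensions mentioned in the change request. A minimal sketch, with a hypothetical endpoint and parameter values:

```python
from urllib.parse import quote

# Hypothetical OpenSearch URL template in the style of the Geo and Time
# extensions; the endpoint and query values are invented for illustration.
TEMPLATE = ("http://example.org/csw/opensearch?"
            "q={searchTerms}&bbox={geo:box}&start={time:start}&end={time:end}")

def fill_template(template, params):
    """Substitute OpenSearch template parameters with URL-encoded values."""
    url = template
    for key, value in params.items():
        url = url.replace("{" + key + "}", quote(str(value), safe=""))
    return url

query_url = fill_template(TEMPLATE, {
    "searchTerms": "sea surface temperature",
    "geo:box": "-10,35,5,45",        # west,south,east,north in degrees
    "time:start": "2009-01-01",
    "time:end": "2009-12-31",
})
```

On the server side, a profiler module like the one described would translate these simple key-value pairs into the equivalent OGC Filter expression for the CSW query.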
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Adams, David L.; Trinidad, P. Paul
1997-01-01
NASA Langley Technical Library has been involved in developing systems for full-text information delivery of NACA/NASA technical reports since 1991. This paper describes the two prototypes it has developed and the present production system configuration. The prototype systems are a NACA CD-ROM of thirty-three classic paper NACA reports and a network-based Full-text Electronic Reports Documents System (FEDS) constructed from both paper and electronic formats of NACA and NASA reports. The production system is the DigiDoc System (DIGItal Documents), presently being developed based on the experiences gained from the two prototypes. The DigiDoc configuration integrates the online catalog database, a World Wide Web interface, and PDF technology to provide a powerful and flexible search and retrieval system. The paper describes in detail significant achievements and lessons learned in terms of data conversion, storage technologies, full-text searching and retrieval, and image databases. Conclusions from the experiences with digitization and full-text access, and future plans for DigiDoc system implementation, are discussed.
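For readers unfamiliar with full-text retrieval, the core mechanism can be sketched as an inverted index mapping terms to documents. This toy example is illustrative only; it is not the DigiDoc implementation, and the document IDs and text are invented:

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, *terms):
    """Documents containing every query term (AND semantics)."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*sets) if sets else set()

# Hypothetical report corpus for demonstration.
docs = {
    "naca-1135": "compressible flow tables and charts",
    "nasa-tm-100": "wind tunnel flow measurements",
}
index = build_index(docs)
```

A production system would add stemming, stop-word removal, and relevance ranking on top of this basic structure.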
Structures and Materials Experimental Facilities and Capabilities Catalog
NASA Technical Reports Server (NTRS)
Horta, Lucas G. (Compiler); Kurtz-Husch, Jeanette D. (Compiler)
2000-01-01
The NASA Center of Excellence for Structures and Materials at Langley Research Center is responsible for conducting research and developing usable technology in the areas of advanced materials and processing technologies, durability, damage tolerance, structural concepts, advanced sensors, intelligent systems, aircraft ground operations, reliability, prediction tools, performance validation, aeroelastic response, and structural dynamics behavior for aerospace vehicles. Supporting the research activities is a complementary set of facilities and capabilities documented in this report. Because of the volume of information, the entries were restricted in most cases to one page. Specific questions from potential customers or partners should be directed to the points of contact provided with the various capabilities. Equipment is grouped by location rather than by function. Geographical information on the various buildings housing the equipment is also provided. Since this is the first time such an inventory has been collected at Langley, it is by no means complete. It is estimated that over 90 percent of the equipment capabilities at hand are included, but equipment is continuously being updated and will be reported in the future.
Deep Space 1 Using its Ion Engine Artist Concept
2003-07-02
NASA's New Millennium Deep Space 1 spacecraft approaching the comet 19P/Borrelly. With its primary mission to serve as a technology demonstrator--testing ion propulsion and 11 other advanced technologies--successfully completed in September 1999, Deep Space 1 is now headed for a risky, exciting rendezvous with Comet Borrelly. NASA extended the mission, taking advantage of the ion propulsion and other systems to target the daring encounter with the comet in September 2001. Once a sci-fi dream, the ion propulsion engine has powered the spacecraft for over 12,000 hours. Another onboard experiment includes software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. The first flight in NASA's New Millennium Program, Deep Space 1 was launched October 24, 1998 aboard a Boeing Delta 7326 rocket from Cape Canaveral Air Station, FL. Deep Space 1 successfully completed and exceeded its mission objectives in July 1999 and flew by a near-Earth asteroid, Braille (1992 KD), in September 1999. http://photojournal.jpl.nasa.gov/catalog/PIA04604
2016-08-09
This image shows the bare bones of the first prototype starshade by NASA's Jet Propulsion Laboratory, Pasadena, California. The prototype was shown in technology partner Astro Aerospace/Northrop Grumman's facility in Santa Barbara, California in 2013. In order for the petals of the starshade to diffract starlight away from the camera of a space telescope, they must be deployed with accuracy once the starshade reaches space. The four petals pictured in the image are being measured for this positional accuracy with a laser. As shown by this 66-foot (20-meter) model, starshades can come in many shapes and sizes. This design shows petals that are more extreme in shape, which properly diffract starlight for smaller telescopes. http://photojournal.jpl.nasa.gov/catalog/PIA20903
NASA Astrophysics Data System (ADS)
Titov, O.; Pursimo, T.; Johnston, Helen M.; Stanford, Laura M.; Hunstead, Richard W.; Jauncey, David L.; Zenere, Katrina A.
2017-04-01
In extending our spectroscopic program, which targets sources drawn from the International Celestial Reference Frame (ICRF) Catalog, we have obtained spectra for ˜160 compact, flat-spectrum radio sources and determined redshifts for 112 quasars and radio galaxies. A further 14 sources with featureless spectra have been classified as BL Lac objects. Spectra were obtained at three telescopes: the 3.58 m European Southern Observatory New Technology Telescope, and the two 8.2 m Gemini telescopes in Hawaii and Chile. While most of the sources are powerful quasars, a significant fraction of radio galaxies is also included from the list of non-defining ICRF radio sources.
Quality control and assurance for validation of DOS/I measurements
NASA Astrophysics Data System (ADS)
Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.
2010-02-01
Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement the following QA/QC procedures:
1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency-domain and spectral calibration procedures, using tissue-simulating phantoms and reflectance standards, respectively.)
2. Standardize and validate data acquisition, processing and visualization (optimize the instrument software, EZDOS; centralize data processing).
3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology).
4. Standardize and coordinate trial data entry (from individual sites) into a centralized database.
5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants, ensuring "calibration".
This manuscript describes our ongoing efforts, successes and challenges in implementing these strategies.
NASA Astrophysics Data System (ADS)
Genet, Russell M.; Fulton, B. J.; Bianco, Federica B.; Martinez, John; Baxter, John; Brewer, Mark; Carro, Joseph; Collins, Sarah; Estrada, Chris; Johnson, Jolyon; Salam, Akash; Wallen, Vera; Warren, Naomi; Smith, Thomas C.; Armstrong, James D.; McGaughey, Steve; Pye, John; Mohanan, Kakkala; Church, Rebecca
2012-05-01
Double stars have been systematically observed since William Herschel initiated his program in 1779. In 1803 he reported that, to his surprise, many of the systems he had been observing for a quarter century were gravitationally bound binary stars. In 1830 the first binary orbital solution was obtained, leading eventually to the determination of stellar masses. Double star observations have been a prolific field, with observations and discoveries - often made by students and amateurs - routinely published in a number of specialized journals such as the Journal of Double Star Observations. All published double star observations from Herschel's to the present have been incorporated in the Washington Double Star Catalog. In addition to reviewing the history of visual double stars, we discuss four observational technologies and illustrate these with our own observational results from both California and Hawaii on telescopes ranging from small SCTs to the 2-meter Faulkes Telescope North on Haleakala. Two of these technologies are visual observations aimed primarily at published "hands-on" student science education, and CCD observations of both bright and very faint doubles. The other two are recent technologies that have launched a double star renaissance. These are lucky imaging and speckle interferometry, both of which can use electron-multiplying CCD cameras to allow short (30 ms or less) exposures that are read out at high speed with very low noise. Analysis of thousands of high speed exposures allows normal seeing limitations to be overcome so very close doubles can be accurately measured.
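Of the technologies surveyed above, lucky imaging is the easiest to sketch: rank the short exposures by a sharpness metric and keep only the best for stacking, so that frames taken during moments of frozen seeing dominate. A toy illustration (the metric, frame data, and keep fraction are invented for demonstration):

```python
def sharpness(frame):
    """Peak pixel over total flux: higher when the seeing froze well."""
    total = sum(sum(row) for row in frame)
    peak = max(max(row) for row in frame)
    return peak / total

def select_lucky_frames(frames, keep_fraction=0.1):
    """Indices of the sharpest frames, best first."""
    n_keep = max(1, int(len(frames) * keep_fraction))
    ranked = sorted(range(len(frames)),
                    key=lambda i: sharpness(frames[i]), reverse=True)
    return ranked[:n_keep]

# Synthetic demo: 21 flat 8x8 frames, one of which has a bright sharp peak.
frames = [[[1.0] * 8 for _ in range(8)] for _ in range(21)]
frames[7][4][4] = 50.0   # the "lucky" exposure
best = select_lucky_frames(frames, keep_fraction=0.05)
```

A real pipeline would follow the selection with registration (shift-and-add) of the winning frames; speckle interferometry instead works on the Fourier power spectrum of the full frame set.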
IRIS Earthquake Browser with Integration to the GEON IDV for 3-D Visualization of Hypocenters.
NASA Astrophysics Data System (ADS)
Weertman, B. R.
2007-12-01
We present a new generation of web-based earthquake query tool: the IRIS Earthquake Browser (IEB). The IEB combines the DMC's large set of earthquake catalogs (provided by USGS/NEIC, ISC and the ANF) with the popular Google Maps web interface. With the IEB, users can quickly and easily find earthquakes in any region of the globe. Using Google's detailed satellite images, earthquakes can be easily co-located with natural geographic features such as volcanoes as well as man-made features such as commercial mines. A set of controls allows earthquakes to be filtered by time, magnitude, and depth range, as well as by catalog name, contributor name and magnitude type. Displayed events can be easily exported in NetCDF format into the GEON Integrated Data Viewer (IDV), where hypocenters may be visualized in three dimensions. Looking "under the hood", the IEB is based on AJAX technology and utilizes REST-style web services hosted at the IRIS DMC. The IEB is part of a broader effort at the DMC aimed at making our data holdings available via web services. The IEB is useful both educationally and as a research tool.
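The IEB's filter controls amount to predicate filtering over catalog records, applied server-side via the DMC's web services. A minimal in-memory sketch; the records and field names below are hypothetical, not the actual service schema:

```python
# Hypothetical event records standing in for catalog entries.
CATALOG = [
    {"id": 1, "mag": 4.2, "depth_km": 10.0,  "year": 2005, "contributor": "NEIC"},
    {"id": 2, "mag": 6.8, "depth_km": 35.0,  "year": 2006, "contributor": "ISC"},
    {"id": 3, "mag": 5.1, "depth_km": 600.0, "year": 2007, "contributor": "ANF"},
]

def filter_events(catalog, min_mag=None, max_depth_km=None, year_range=None):
    """Mimic the IEB controls: filter by magnitude, depth, and time range."""
    out = []
    for ev in catalog:
        if min_mag is not None and ev["mag"] < min_mag:
            continue
        if max_depth_km is not None and ev["depth_km"] > max_depth_km:
            continue
        if year_range is not None and not (year_range[0] <= ev["year"] <= year_range[1]):
            continue
        out.append(ev)
    return out
```

The surviving records would then be plotted as map markers, or serialized (e.g. to NetCDF) for 3-D hypocenter display in the IDV.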
NASA Astrophysics Data System (ADS)
Cristóbal-Hornillos, D.; Varela, J.; Ederoclite, A.; Vázquez Ramió, H.; López-Sainz, A.; Hernández-Fuertes, J.; Civera, T.; Muniesa, D.; Moles, M.; Cenarro, A. J.; Marín-Franch, A.; Yanes-Díaz, A.
2015-05-01
The Observatorio Astrofísico de Javalambre consists of two main telescopes: JST/T250, a 2.5 m telescope with a FoV of 3 deg, and JAST/T80, an 83 cm telescope with a 2 deg FoV. JST/T250 will be devoted to completing the Javalambre-PAU Astronomical Survey (J-PAS), a photometric survey with a system of 54 narrow-band plus 3 broad-band filters covering an area of 8500 square degrees. JAST/T80 will perform the J-PLUS survey, covering the same area in a system of 12 filters. This contribution presents the software and hardware architecture designed to store and process the data. The processing pipeline runs daily and is devoted to correcting the instrumental signature on the science images, performing astrometric and photometric calibration, and computing individual image catalogs. In a second stage, the pipeline performs the combination of the tile mosaics and the computation of final catalogs. The catalogs are ingested into a scientific database to be provided to the community. The processing software is connected with a management database that stores persistent information about the pipeline operations performed on each frame. The processing pipeline is executed on a computing cluster under a batch queuing system. Regarding the storage system, it will combine disk and tape technologies: the disk storage system will have capacity to store the data that is accessed by the pipeline, while the tape library will store and archive the raw data and earlier data releases with lower access frequency.
Exploiting Untapped Information Resources in Earth Science
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Fox, P. A.; Kempler, S.; Maskey, M.
2015-12-01
One of the continuing challenges in any Earth science investigation is the amount of time and effort required for data preparation before analysis can begin. Current Earth science data and information systems have their own shortcomings. For example, the current data search systems are designed with the assumption that researchers find data primarily by metadata searches on instrument or geophysical keywords, assuming that users have sufficient knowledge of the domain vocabulary to be able to effectively utilize the search catalogs. These systems lack support for new or interdisciplinary researchers who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. There is clearly a need to innovate and evolve current data and information systems in order to improve data discovery and exploration capabilities to substantially reduce the data preparation time and effort. We assert that Earth science metadata assets are dark resources: information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. The challenge for any organization is to recognize, identify and effectively utilize the dark data stores in their institutional repositories to better serve their stakeholders. NASA Earth science metadata catalogs contain dark resources consisting of structured information, free-form descriptions of data and pre-generated images. With the addition of emerging semantic technologies, such catalogs can be fully utilized beyond their original design intent of supporting current search functionality. In this presentation, we will describe our approach of exploiting these information resources to provide novel data discovery and exploration pathways to the science and education communities.
Repeating ice-earthquakes beneath David Glacier from the 2012-2015 TAMNNET array
NASA Astrophysics Data System (ADS)
Walter, J. I.; Peng, Z.; Hansen, S. E.
2017-12-01
The continent of Antarctica has approximately the same surface area as the continental United States, though we know significantly less about its underlying geology and seismic activity. In recent years, improvements in seismic instrumentation, battery technology, and field deployment practices have allowed for continuous broadband stations throughout the dark Antarctic winter. We utilize broadband seismic data from a recent experiment (TAMNNET), which was originally proposed as a structural seismology experiment, for seismic event detection. Our target is to address fundamental questions about regional-scale crustal and environmental seismicity in the study region that comprises the Transantarctic Mountain area of Victoria and Oates Land. We identify most seismicity emanating from David Glacier, upstream of the Drygalski Ice Tongue, which has been documented by several other studies. In order to improve the catalog completeness for the David Glacier area, we utilize a matched-filter technique to identify potential missing earthquakes that may not have been originally detected. This technique utilizes existing cataloged waveforms as templates to scan through continuous data and to identify repeating or nearby earthquakes. With a more robust catalog, we evaluate relative changes in icequake positions, recurrence intervals, and other first-order information. In addition, we attempt to further refine locations of other regional seismicity using a variety of methods including body and surface wave polarization, beamforming, surface wave dispersion, and other seismological methods. This project highlights the usefulness of archiving raw datasets (i.e., passive seismic continuous data), so that researchers may apply new algorithms or techniques to test hypotheses not originally or specifically targeted by the original experimental design.
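A matched filter of the kind described slides a template event across the continuous record and flags lags where the normalized cross-correlation exceeds a detection threshold. A minimal sketch on synthetic, noise-free data (real detections run on band-passed seismograms with station stacking):

```python
import math

def normalized_cross_correlation(template, data):
    """Normalized cross-correlation coefficient of a template at each lag."""
    n = len(template)
    t_mean = sum(template) / n
    t = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(x * x for x in t))
    out = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        w_mean = sum(w) / n
        wd = [x - w_mean for x in w]
        w_norm = math.sqrt(sum(x * x for x in wd))
        if w_norm == 0 or t_norm == 0:
            out.append(0.0)      # flat window: no correlation defined
            continue
        out.append(sum(a * b for a, b in zip(t, wd)) / (t_norm * w_norm))
    return out

def detect(template, data, threshold=0.8):
    """Lags where the correlation exceeds the detection threshold."""
    cc = normalized_cross_correlation(template, data)
    return [i for i, c in enumerate(cc) if c >= threshold]

# Synthetic demo: a template waveform repeated twice in a quiet trace.
template = [0.0, 1.0, -1.0, 0.5, 0.0]
data = [0.0] * 3 + template + [0.0] * 4 + template + [0.0] * 3
hits = detect(template, data, threshold=0.99)
```

Each hit marks a repeating (or nearby) event resembling the cataloged template, which is how the technique fills in events the original catalog missed.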
Library services for people with disabilities: results of a survey.
Nelson, P P
1996-01-01
The Americans with Disabilities Act (ADA), enacted in 1990, has had a significant impact on the way many institutions, including libraries, do business. The Association of Research Libraries surveyed its members in 1991 to determine the effect of this legislation, and the author conducted a similar survey in 1995 to learn what progress academic health sciences libraries have made in serving the needs of people with disabilities. A questionnaire was mailed to 131 members of the Association of Academic Health Sciences Library Directors. Nearly three-quarters of respondents reported elimination of physical barriers. The most common services provided are retrieval of materials from the stacks and photocopy assistance. Much less attention has been paid to the use of adaptive technology that allows disabled users to search a library's online catalog and databases; special technology is often provided by another unit on campus, but there seems to be little coordination with library services. Few libraries have assigned responsibility for disability services to a specific staff member, and even fewer have done a formal assessment of the need for special services. The issues identified by the survey should challenge academic health sciences libraries to examine their status regarding compliance with ADA legislation. PMID:8883988
Heliophysics Data Environment: What's next? (Invited)
NASA Astrophysics Data System (ADS)
Martens, P.
2010-12-01
In the last two decades the Heliophysics community has witnessed the societal recognition of the importance of space weather and space climate for our technology and ecology, resulting in a renewed priority for, and investment in, Heliophysics. As a result of that, and of the explosive development of information technology, Heliophysics has experienced an exponential growth in the amount and variety of data acquired, as well as the easy electronic storage and distribution of these data. The Heliophysics community has responded well to these challenges. The first, most obvious, and most needed response was the development of Virtual Heliophysics Observatories. While the VxOs of Heliophysics still need a lot of work with respect to the expansion of search options and interoperability, I believe the basic structures and functionalities have been established, and that they meet the needs of the community. In the future we'll see a refinement, completion, and integration of VxOs, not a fundamentally different approach -- in my opinion. The challenge posed by the huge increase in the amount of data is not met by VxOs alone. No individual scientist or group, even with the assistance of tons of graduate students, can analyze the torrent of data currently coming down from the fleet of heliospheric observatories. Once more, information technology provides an opportunity: automated feature recognition of solar imagery is feasible, has been implemented in a number of instances, and is strongly supported by NASA. For example, the SDO Feature Finding Team is developing a suite of 16 feature recognition modules for SDO imagery that operates in near-real time, produces space-weather warnings, and populates on-line event catalogs.
Automated feature recognition -- "computer vision" -- not only saves enormous amounts of time in the analysis of events, it also allows for a shift from the analysis of single events to that of sets of features and events -- the latter being by far the most important implication of computer vision. Consider some specific examples of the possibilities here. From the on-line SDO metadata, a user can produce with a few IDL commands information that previously would have taken years to compile, e.g.:
- Draw a butterfly diagram for Active Regions,
- Find all filaments that coincide with sigmoids and correlate the automatically detected sigmoid handedness with filament chirality,
- Correlate EUV jets with small-scale flux emergence in coronal holes only,
- Draw PIL maps with regions of high shear and large magnetic field gradients overlaid, to pinpoint potential flaring regions, then correlate with actual flare occurrence.
I emphasize that access to those metadata will be provided by VxOs, and that the interplay between computer vision codes and data will be facilitated by VxOs. My vision for the near and medium term for the VxOs is then to provide a simple and seamless interface between data, cataloged metadata, and computer vision software, either existing or newly developed by the user. Heliospheric virtual observatories and computer vision systems will work together to constantly monitor the Sun, provide space weather warnings, populate catalogs of metadata, analyze trends, and produce real-time on-line imagery of current events.
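Catalog queries of the kind listed above reduce to a few lines over structured metadata once the VxOs expose it. A Python sketch with hypothetical records and field names (real queries would draw on the on-line event catalogs populated by the feature-recognition modules):

```python
# Hypothetical active-region metadata records, standing in for entries a
# feature-recognition pipeline might write to an event catalog.
REGIONS = [
    {"noaa": 11158, "year": 2011.1, "latitude": -20.0},
    {"noaa": 11261, "year": 2011.6, "latitude": 17.0},
    {"noaa": 11520, "year": 2012.5, "latitude": -16.0},
]

def butterfly_points(regions):
    """Time-ordered (time, latitude) pairs for a butterfly diagram."""
    return sorted((r["year"], r["latitude"]) for r in regions)

def hemisphere_counts(regions):
    """Count region emergences per solar hemisphere."""
    north = sum(1 for r in regions if r["latitude"] >= 0)
    return {"north": north, "south": len(regions) - north}
```

The point of the example is the shift of effort: the hard work (feature detection) happens once, upstream, and every downstream statistical question becomes a cheap query.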
Interplanetary CubeSat for Technology Demonstration at Mars Artist Concept
2015-06-12
NASA's two MarCO CubeSats will be flying past Mars in September 2016 just as NASA's next Mars lander, InSight, is descending through the Martian atmosphere and landing on the surface. MarCO, for Mars Cube One, will provide an experimental communications relay to inform Earth quickly about the landing. This illustration depicts a moment during the lander's descent when it is transmitting data in the UHF radio band, and the twin MarCO craft are receiving those transmissions while simultaneously relaying the data to Earth in a different radio band. Each of the MarCO twins carries two solar panels for power, and both UHF-band and X-band radio antennas. As a technology demonstration, MarCO could lead to other "bring-your-own-relay" mission designs and also to use of miniature spacecraft for a wide diversity of interplanetary missions. MarCO is the first interplanetary use of CubeSat technologies for small spacecraft. CubeSats are a class of spacecraft based on a standardized small size and modular use of off-the-shelf technologies to streamline development. Many have been made by university students, and dozens have been launched into Earth orbit using extra payload mass available on launches of larger spacecraft. The two briefcase-size MarCO CubeSats will ride along with InSight on an Atlas V launch vehicle lifting off in March 2016 from Vandenberg Air Force Base, California. MarCO is a technology demonstration aspect of the InSight mission and not needed for that mission's success. InSight, an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport, will investigate the deep interior of Mars to advance understanding of how rocky planets, including Earth, formed and evolved. After launch, the MarCO twins and InSight will be navigated separately to Mars. 
Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission. The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. http://photojournal.jpl.nasa.gov/catalog/PIA19388
Enabling the development of Community Extensions to GI-cat - the SIB-ESS-C case study
NASA Astrophysics Data System (ADS)
Bigagli, L.; Meier, N.; Boldrini, E.; Gerlach, R.
2009-04-01
GI-cat is a Java software package that implements discovery and access services for disparate geospatial resources. An instance of GI-cat provides a single point of service for querying and accessing remote, as well as local, heterogeneous sources of geospatial information, either through standard interfaces or by taking advantage of GI-cat advanced features, such as incremental responses, query feedback, etc. GI-cat supports a number of de jure and de facto standards, but can also be extended to additional community catalog/inventory services by defining appropriate mediation components. The GI-cat and the SIB-ESS-C development teams collaborated in the development of a mediator to the Siberian Earth Science System Cluster (SIB-ESS-C), a web-based infrastructure to support the communities of environmental and Earth System research in Siberia. This activity resulted in the identification of appropriate technologies and internal mechanisms supporting the development of GI-cat extensions, which are the object of this work. GI-cat is built as a modular framework of SOA components that can be variously arranged to fit the needs of a community of users. For example, a particular GI-cat instance may be configured to provide discovery functionalities onto an OGC WMS; or to adapt a THREDDS catalog to the standard OGC CSW interface; or to merge a number of CDI repositories into a single, more efficient catalog. The flexibility of the GI-cat framework is achieved thanks to its design, which follows the Tree of Responsibility (ToR) pattern and the Uniform Pipe and Filter architectural style. This approach allows the building of software blocks that can be flexibly reused and composed in multiple ways. In fact, the components that make up any GI-cat configuration all implement two common interfaces (i.e. IChainNode and ICatalogService) that support chaining one component to another.
Hence, it suffices to implement those interfaces (plus an appropriate factory class, the mechanism used to create GI-cat components) to support a custom community catalog/inventory service in GI-cat. In general, all the terminal nodes of a GI-cat configuration chain are in charge of mediating between the GI-cat common interfaces and a backend, so we implemented a default behavior in an abstract class, termed Accessor, to make subclassing easier. Moreover, we identified several typical backend scenarios and provided specialized Accessor subclasses that are even simpler to implement. For example, in the case of a coarse-grained backend service that returns its data all at once, a specialized Accessor can retrieve the whole content the first time and subsequently browse/query the local copy of the data. This was the approach followed for the development of SibesscAccessor. The SIB-ESS-C case study is also notable because it requires mediating between the relational and the semi-structured data models. In fact, SIB-ESS-C data are stored in a relational database, to provide fast access even to huge amounts of data. The SibesscAccessor is in charge of establishing a JDBC connection to the database, reading the data by means of SQL statements, creating Java objects according to the ISO 19115 data model, and marshalling the resulting information to an XML document. During the implementation of the SibesscAccessor, the mix of technologies and deployment environments and the geographical distribution of the development teams turned out to be important issues. To address them, we relied on technologies and tools for collaborative software development: the Maven build system, the SVN version control system, the XPlanner project planning and tracking tool, and of course VoIP tools.
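The coarse-grained Accessor behavior described above -- fetch the whole backend content once, then browse/query the local copy -- can be sketched as follows. GI-cat and the ADK are Java; this Python sketch only illustrates the pattern, and all class and function names here are assumptions, not the real ADK API:

```python
from abc import ABC, abstractmethod

class Accessor(ABC):
    """Terminal chain node mediating between a common query interface and a backend."""
    @abstractmethod
    def query(self, predicate):
        ...

class CoarseGrainedAccessor(Accessor):
    """For backends that return all of their data at once."""
    def __init__(self, fetch_all):
        self._fetch_all = fetch_all  # callable wrapping the remote service
        self._cache = None

    def query(self, predicate):
        if self._cache is None:  # first call: retrieve the whole content
            self._cache = list(self._fetch_all())
        # subsequent calls browse/query the local copy
        return [record for record in self._cache if predicate(record)]

calls = []
def backend():
    calls.append(1)
    return [{"id": 1, "title": "soil moisture"}, {"id": 2, "title": "permafrost"}]

acc = CoarseGrainedAccessor(backend)
acc.query(lambda r: "perma" in r["title"])
acc.query(lambda r: True)
print(len(calls))  # → 1: the backend was contacted only once
```

The design choice is the same as in the case study: paying the full retrieval cost once keeps every later query local and fast.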
Moreover, we shipped the Accessor Development Kit (ADK), a Java library containing the classes needed for extending GI-cat to custom community catalog/inventory services, along with other supporting material (documentation, best practices, examples). The ADK is distributed as a Maven artifact, to simplify dependency management and ease the common tasks of testing, packaging, etc. The SibesscAccessor was the first custom addition to the set of GI-cat accessors. Later, the so-called Standard Accessors library was also refactored onto the ADK. The SIB-ESS-C case study also gave us the opportunity to refine our policies for collaborative software development, and several improvements were made to the overall GI-cat data model and framework. Finally, the SIB-ESS-C development team developed a GI-cat web client by means of Web 2.0 technologies (JavaScript, XML, HTML, CSS, etc.). The client can easily be integrated in any HTML context on any web page. The web GUI allows the user to define requests to GI-cat by entering parameter strings and/or selecting an area of interest on a map. The client sends its request to GI-cat via SOAP over HTTP POST, parses GI-cat SOAP responses, and presents user-friendly information on a web page.
First Starshade Prototype at JPL
2016-08-09
The first prototype starshade developed by NASA's Jet Propulsion Laboratory, shown in technology partner Astro Aerospace/Northrop Grumman's facility in Santa Barbara, California, in 2013. As shown by this 66-foot (20-meter) model, starshades can come in many shapes and sizes. This design features petals that are more extreme in shape, which properly diffract starlight for smaller telescopes. Each petal is covered in a high-performance plastic film that resembles gold foil. On a starshade ready for launch, the thermal gold foil will only cover the side of the petals facing away from the telescope, with black on the other, so as not to reflect other light sources such as the Earth into its camera. http://photojournal.jpl.nasa.gov/catalog/PIA20906
Interoperability And Value Added To Earth Observation Data
NASA Astrophysics Data System (ADS)
Gasperi, J.
2012-04-01
Geospatial web services technology has provided a new means for geospatial data interoperability. Open Geospatial Consortium (OGC) services such as Web Map Service (WMS) to request maps on the Internet, Web Feature Service (WFS) to exchange vectors or Catalog Service for the Web (CSW) to search for geospatialized data have been widely adopted in the Geosciences community in general and in the remote sensing community in particular. These services make Earth Observation data available to a wider range of public users than ever before. The mapshup web client offers an innovative and efficient user interface that takes advantage of the power of interoperability. This presentation will demonstrate how mapshup can be effectively used in the context of natural disasters management.
LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valassi, A.; /CERN; Clemencic, M.
2012-04-19
The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.
NASA SBIR product catalog, 1990
NASA Technical Reports Server (NTRS)
Schwenk, F. Carl; Gilman, J. A.
1990-01-01
Since 1983 the NASA Small Business Innovation Research (SBIR) program has benefited both the agency and the high-technology small business community. By making it possible for more small businesses to participate in NASA's research and development, SBIR also provides opportunities for these entrepreneurs to develop products that may also have significant commercial markets. Structured in three phases, the SBIR program uses Phase 1 to assess the technical feasibility of novel ideas proposed by small companies and Phase 2 to conduct research and development on the best concepts. Phase 3, not funded by SBIR, is the utilization and/or commercialization phase. A partial list of products of NASA SBIR projects that have advanced to some degree into Phase 3 is provided, with a brief description of each.
LISA Pathfinder Spacecraft Artist Concept
2015-12-03
This artist's concept shows ESA's LISA Pathfinder spacecraft, which launched on Dec. 3, 2015, from Kourou, French Guiana, and will help pave the way for a mission to detect gravitational waves. LISA Pathfinder, led by the European Space Agency (ESA), is designed to test technologies that could one day detect gravitational waves. Gravitational waves, predicted by Einstein's theory of general relativity, are ripples in spacetime produced by any accelerating body. But the waves are so weak that Earth- or space-based observatories would likely only be able to directly detect such signals coming from massive astronomical systems, such as binary black holes or exploding stars. Detecting gravitational waves would be an important piece in the puzzle of how our universe began. http://photojournal.jpl.nasa.gov/catalog/PIA20196
A catalogue of polymorphisms related to xenobiotic metabolism and cancer susceptibility.
Gemignani, Federica; Landi, Stefano; Vivant, Franck; Zienolddiny, Shanbeh; Brennan, Paul; Canzian, Federico
2002-08-01
High-throughput genotyping of multiple genes based on large samples of cases and controls is likely to be important in identifying common genes that have a moderate effect on the development of specific diseases. We present here a comprehensive list of 313 known, experimentally confirmed polymorphisms in 54 genes that are particularly relevant for the metabolism of drugs, alcohol, tobacco, and other potential carcinogens. We have compiled a catalog with a standardized format that summarizes the genetic and biochemical properties of the selected polymorphisms. We have also confirmed or redesigned experimental conditions for simplex or multiplex PCR amplification of a subset of 168 SNPs of particular interest, which will provide the basis for the design of assays compatible with high-throughput genotyping.
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.; Southall, J. W.; Kawaguchi, A. S.; Redhed, D. D.
1973-01-01
Reports on the design process, support of the design process, IPAD System design catalog of IPAD technical program elements, IPAD System development and operation, and IPAD benefits and impact are concisely reviewed. The approach used to define the design is described. Major activities performed during the product development cycle are identified. The computer system requirements necessary to support the design process are given as computational requirements of the host system, technical program elements and system features. The IPAD computer system design is presented as concepts, a functional description and an organizational diagram of its major components. The cost and schedules and a three phase plan for IPAD implementation are presented. The benefits and impact of IPAD technology are discussed.
NASA Astrophysics Data System (ADS)
Frisch, D.; Melia, F.
1983-09-01
The SAO Catalog of about 260,000 stars was studied to arrive at a sample of 'sibling', sun-like G-stars whose possible planetary systems' intelligent beings might feel drawn to single out each other for directional listening and broadcasting. A set of mostly untabulated sibling candidate stars can be defined, given a direction and a small solid angle that are mutually interesting to members of that set, so that overlapping broadcast/receiving cones can be selected on the basis of commonality. It is suggested that the double cone about the direction of the galactic center, whose half angle is 1/137 radian, is an almost inevitable choice in which sending and receiving with current technology can reach to about 1 kpc, yielding an estimated 1000 G-star sibling candidates.
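As a quick consistency check (our arithmetic, not part of the abstract): for a small half-angle θ = 1/137 rad, the double cone covers only a tiny fraction of the sky,

```latex
\frac{\Omega}{4\pi} = \frac{2 \cdot 2\pi(1-\cos\theta)}{4\pi}
\approx \frac{\theta^{2}}{2}
= \frac{1}{2 \cdot 137^{2}} \approx 2.7\times10^{-5},
```

which is what makes agreeing in advance on a mutually interesting direction, such as the galactic center, essential.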
The Breakthrough Listen Search for Intelligent Life: Target Selection of Nearby Stars and Galaxies
NASA Astrophysics Data System (ADS)
Isaacson, Howard; Siemion, Andrew P. V.; Marcy, Geoffrey W.; Lebofsky, Matt; Price, Danny C.; MacMahon, David; Croft, Steve; DeBoer, David; Hickish, Jack; Werthimer, Dan; Sheikh, Sofia; Hellbourg, Greg; Enriquez, J. Emilio
2017-05-01
We present the target selection for the Breakthrough Listen search for extraterrestrial intelligence during the first year of observations at the Green Bank Telescope, Parkes Telescope, and Automated Planet Finder. On the way to observing 1,000,000 nearby stars in search of technological signals, we present three main sets of objects we plan to observe in addition to a smaller sample of exotica. We chose the 60 nearest stars, all within 5.1 pc from the Sun. Such nearby stars offer the potential to observe faint radio signals from transmitters that have a power similar to those on Earth. We add a list of 1649 stars drawn from the Hipparcos catalog that span the Hertzsprung-Russell diagram, including all spectral types along the main sequence, subgiants, and giant stars. This sample offers diversity and inclusion of all stellar types, but with thoughtful limits and due attention to main sequence stars. Our targets also include 123 nearby galaxies composed of a “morphological-type-complete” sample of the nearest spirals, ellipticals, dwarf spheroidals, and irregulars. While their great distances hamper the detection of technological electromagnetic radiation, galaxies offer the opportunity to observe billions of stars simultaneously and to sample the bright end of the technological luminosity function. We will also use the Green Bank and Parkes telescopes to survey the plane and central bulge of the Milky Way. Finally, the complete target list includes several classes of exotica, including white dwarfs, brown dwarfs, black holes, neutron stars, and asteroids in our solar system.
The 'Wow' Signal, Drake Equation and Exoplanet Considerations
NASA Astrophysics Data System (ADS)
Wheeler, E.
It has been 38 years since the most likely artificial transmission ever recorded from a possible extraterrestrial source was received [1, 2]. Using greatly improved technology, subsequent efforts by the Search for Extraterrestrial Intelligence (SETI) have continued, yet silence from space prevails [3]. This article examines whether the transmission was an artificial signal, and if so why it matters, including the possibility that the modest technology used by the "Big Ear" receiver could have been accommodated by the source. The transmission and the ensuing long silence may be intended. This paper reconsiders the Drake equation, an estimate for the number of civilizations in our galaxy that may possess technology for interstellar signaling [4, 5], and shows that the currently alleged best estimate of two civilizations is not supported [6]. An alternate and original method suggests ~100 civilizations. It relies importantly on experience and detectable events, including recent astronomical evidence about exoplanets as cataloged by the European Exoplanet program and by the National Aeronautics and Space Administration (NASA) Exoplanet Science Institute [7, 8]. In addition it addresses major geological and astronomical occurrences that profoundly affected the development of life on Earth and might apply similarly for Extraterrestrial Intelligence (ETI). The alternate approach is not intended to compute ETI precisely but to examine the possibility that, though vastly spread, it likely exists. The discussion anticipates difficulties in communication with an alien civilization, hardly an exercise in science fiction, and explores how international groups can participate in a specific future response. One response might be to monitor the electromagnetic radiation spectral line of an element to be determined by consensus.
Expanding Omics Resources for Improvement of Soybean Seed Composition Traits
Chaudhary, Juhi; Patil, Gunvant B.; Sonah, Humira; Deshmukh, Rupesh K.; Vuong, Tri D.; Valliyodan, Babu; Nguyen, Henry T.
2015-01-01
Food resources of the modern world are strained due to the increasing population. There is an urgent need for innovative methods and approaches to augment food production. Legume seeds are major resources of human food and animal feed with their unique nutrient compositions including oil, protein, carbohydrates, and other beneficial nutrients. Recent advances in next-generation sequencing (NGS) together with “omics” technologies have considerably strengthened soybean research. The availability of well annotated soybean genome sequence along with hundreds of identified quantitative trait loci (QTL) associated with different seed traits can be used for gene discovery and molecular marker development for breeding applications. Despite the remarkable progress in these technologies, the analysis and mining of existing seed genomics data are still challenging due to the complexity of genetic inheritance, metabolic partitioning, and developmental regulations. Integration of “omics tools” is an effective strategy to discover key regulators of various seed traits. In this review, recent advances in “omics” approaches and their use in soybean seed trait investigations are presented along with the available databases and technological platforms and their applicability in the improvement of soybean. This article also highlights the use of modern breeding approaches, such as genome-wide association studies (GWAS), genomic selection (GS), and marker-assisted recurrent selection (MARS) for developing superior cultivars. A catalog of available important resources for major seed composition traits, such as seed oil, protein, carbohydrates, and yield traits are provided to improve the knowledge base and future utilization of this information in the soybean crop improvement programs. PMID:26635846
Big Data Discovery and Access Services through NOAA OneStop
NASA Astrophysics Data System (ADS)
Casey, K. S.; Neufeld, D.; Ritchey, N. A.; Relph, J.; Fischman, D.; Baldwin, R.
2017-12-01
The NOAA OneStop Project was created as a pathfinder effort to improve the discovery of, access to, and usability of NOAA's vast and diverse collection of big data. OneStop is led by the NOAA/NESDIS National Centers for Environmental Information (NCEI) and is seen as a key NESDIS contribution to NOAA's open data and data stewardship efforts. OneStop consists of an entire framework of services, from storage and interoperable access services at the base, through metadata and catalog services in the middle, to a modern user interface experience at the top. Importantly, it is an open framework where external tools and services can connect at whichever level is most appropriate. Since the beta release of the OneStop user interface at the 2016 Fall AGU meeting, significant progress has been made improving and modernizing many NOAA data collections to optimize their use within the framework. In addition, OneStop has made progress implementing robust metadata management and catalog systems at the collection and granule level and improving the user experience with the web interface. This progress will be summarized; the results of extensive user testing, including professional usability studies, will be reviewed; key big data technologies supporting the framework will be presented; and community input will be sought on the future directions of the OneStop Project.
Pham, Nikki T.; Wei, Tong; Schackwitz, Wendy S.; Lipzen, Anna M.; Duong, Phat Q.; Jones, Kyle C.; Ruan, Deling; Bauer, Diane; Peng, Yi; Schmutz, Jeremy
2017-01-01
The availability of a whole-genome sequenced mutant population and the cataloging of mutations of each line at a single-nucleotide resolution facilitate functional genomic analysis. To this end, we generated and sequenced a fast-neutron-induced mutant population in the model rice cultivar Kitaake (Oryza sativa ssp japonica), which completes its life cycle in 9 weeks. We sequenced 1504 mutant lines at 45-fold coverage and identified 91,513 mutations affecting 32,307 genes, i.e., 58% of all rice genes. We detected an average of 61 mutations per line. Mutation types include single-base substitutions, deletions, insertions, inversions, translocations, and tandem duplications. We observed a high proportion of loss-of-function mutations. We identified an inversion affecting a single gene as the causative mutation for the short-grain phenotype in one mutant line. This result reveals the usefulness of the resource for efficient, cost-effective identification of genes conferring specific phenotypes. To facilitate public access to this genetic resource, we established an open access database called KitBase that provides access to sequence data and seed stocks. This population complements other available mutant collections and gene-editing technologies. This work demonstrates how inexpensive next-generation sequencing can be applied to generate a high-density catalog of mutations. PMID:28576844
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Guotian; Jain, Rashmi; Chern, Mawsheng
The availability of a whole-genome sequenced mutant population and the cataloging of mutations of each line at a single-nucleotide resolution facilitate functional genomic analysis. To this end, we generated and sequenced a fast-neutron-induced mutant population in the model rice cultivar Kitaake (Oryza sativa ssp japonica), which completes its life cycle in 9 weeks. We sequenced 1504 mutant lines at 45-fold coverage and identified 91,513 mutations affecting 32,307 genes, i.e., 58% of all rice genes. We detected an average of 61 mutations per line. Mutation types include single-base substitutions, deletions, insertions, inversions, translocations, and tandem duplications. We observed a high proportion of loss-of-function mutations. We identified an inversion affecting a single gene as the causative mutation for the short-grain phenotype in one mutant line. This result reveals the usefulness of the resource for efficient, cost-effective identification of genes conferring specific phenotypes. To facilitate public access to this genetic resource, we established an open access database called KitBase that provides access to sequence data and seed stocks. This population complements other available mutant collections and gene-editing technologies. In conclusion, this work demonstrates how inexpensive next-generation sequencing can be applied to generate a high-density catalog of mutations.
NASA Astrophysics Data System (ADS)
Race, Margaret; Farmer, Jack
A 2009 report by the National Research Council (NRC) reviewed a previous study on Mars Sample Return (1997) and provided updated recommendations for future sample return missions based on our current understanding about Mars and its biological potential, as well as advances in technology and analytical capabilities. The committee* made 12 specific recommendations that fall into three general categories: one related to current scientific understanding, ten based on changes in the technical and/or policy environment, and one aimed at public communication. Substantive changes from the 1997 report relate mainly to protocols and methods, technology and infrastructure, and general oversight. This presentation provides an overview of the 2009 report and its recommendations and analyzes how they may impact mission designs and plans. The full report, Assessment of Planetary Protection Requirements for Mars Sample Return Missions, is available online at: http://www.nap.edu/catalog.php?record_id=12576 * Study participants: Jack D. Farmer, Arizona State University (chair) James F. Bell III, Cornell University Kathleen C. Benison, Central Michigan University William V. Boynton, University of Arizona Sherry L. Cady, Portland State University F. Grant Ferris, University of Toronto Duncan MacPherson, Jet Propulsion Laboratory Margaret S. Race, SETI Institute Mark H. Thiemens, University of California, San Diego Meenakshi Wadhwa, Arizona State University
Dollar, Daniel M; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-04-01
To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database ("electronic shelf reading"), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes.
Dollar, Daniel M.; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-01-01
Objective: To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. Methods: The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Results: Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database (“electronic shelf reading”), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. Conclusions: The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes. PMID:17443247
Earth Science Data Grid System
NASA Astrophysics Data System (ADS)
Chi, Y.; Yang, R.; Kafatos, M.
2004-05-01
The Earth Science Data Grid System (ESDGS) is a software system in support of earth science data storage and access. It is built upon the Storage Resource Broker (SRB) data grid technology. We have developed a complete data grid system consisting of an SRB server, providing users uniform access to diverse storage resources in a heterogeneous computing environment, and a metadata catalog server (MCAT), managing the metadata associated with data sets, users, and resources. We also developed earth science application metadata; geospatial, temporal, and content-based indexing; and other supporting tools. In this paper, we describe the software architecture and components of the data grid system, and use a practical example in support of storage and access of rainfall data from the Tropical Rainfall Measuring Mission (TRMM) to illustrate its functionality and features.
NASA Technical Reports Server (NTRS)
1997-01-01
Cogent Software, Inc. was formed in January 1995 by David Atkinson and Irene Woerner, both former employees of the Jet Propulsion Laboratory (JPL). Several other Cogent employees also worked at JPL. Atkinson headed JPL's Information Systems Technology section and Woerner led the Advanced User Interfaces Group. Cogent's mission is to help companies organize and manage their online content by developing advanced software for the next generation of online directories and information catalogs. The company offers a complete range of Internet solutions, including Internet access, Web site design, local and wide-area networks, and custom software for online commerce applications. Cogent also offers DesignSphere Online, an electronic community for the communications arts industry. Customers range from small offices to manufacturers with thousands of employees, including Chemi-Con, one of the largest manufacturers of capacitors in the world.
VizieR Online Data Catalog: Observed red supergiants in the inner Galaxy (Messineo+, 2016)
NASA Astrophysics Data System (ADS)
Messineo, M.; Zhu, Q.; Menten, K. M.; Ivanov, V. D.; Figer, D. F.; Kudritzki, R.-P.; Chen, C.-H. Rosie
2018-02-01
Spectroscopic observations were carried out with the Son of ISAAC (SofI; Moorwood et al. 1998, Msngr, 91, 9) spectrograph on the 3.58 m ESO New Technology Telescope (NTT) at the La Silla Observatory, on three nights from UT 2015 June 16 to 19 (program ID 095.D-0252(A)). Spectra were obtained for 94 targets with the low-resolution red grism and the 0.6" wide slit, delivering resolution R~980 over the wavelength range λ=1.53-2.52 μm. For each target a minimum of four exposures, nodded along the slit, were taken in an ABBA sequence. Typical integration times per frame ranged from 2 to 100 s (DITs×NDITs). (1 data file).
CpG Island Methylation in Colorectal Cancer: Past, Present and Future
Curtin, Karen; Slattery, Martha L.; Samowitz, Wade S.
2011-01-01
The concept of a CpG island methylator phenotype, or CIMP, quickly became the focus of several colorectal cancer studies describing its clinical and pathological features after its introduction in 1999 by Toyota and colleagues. Further characterization of CIMP in tumors led to widespread acceptance of the concept, as expressed by Shen and Issa in their 2005 editorial, "CIMP, at last." Since that time, extensive research efforts have brought great insights into the epidemiology and prognosis of CIMP+ tumors and other epigenetic mechanisms underlying tumorigenesis. With advances in technology and the subsequent cataloging of the human methylome in cancer and normal tissue, new directions in research to understand CIMP and its role in complex biological systems yield hope for future epigenetically based diagnostics and treatments. PMID:21559209
GRACE-FO Spacecraft Artist Rendering
2017-05-04
This artist's rendering shows the twin spacecraft of the Gravity Recovery and Climate Experiment Follow-On (GRACE-FO) mission, a partnership between NASA and the German Research Centre for Geosciences (GFZ). GRACE-FO is a successor to the original GRACE mission, which began orbiting Earth on March 17, 2002. GRACE-FO will carry on the extremely successful work of its predecessor while testing a new technology designed to dramatically improve the already remarkable precision of its measurement system. The GRACE missions measure variations in gravity over Earth's surface, producing a new map of the gravity field every 30 days. Thus, GRACE shows how the planet's gravity differs not only from one location to another, but also from one period of time to another. https://photojournal.jpl.nasa.gov/catalog/PIA21607
Component Data Base for Space Station Resistojet Auxiliary Propulsion
NASA Technical Reports Server (NTRS)
Bader, Clayton H.
1988-01-01
The resistojet was baselined for Space Station auxiliary propulsion because of its operational versatility, efficiency, and durability. This report was conceived as a guide for designers and planners of the Space Station auxiliary propulsion system. It is directed to the low-thrust resistojet concept, though it should have application to other station concepts or systems such as the Environmental Control and Life Support System (ECLSS), Manufacturing and Technology Laboratory (MTL), and the Waste Fluid Management System (WFMS). The information will likely be quite useful in the same capacity for other non-Space Station systems, including satellites, free flyers, explorers, and maneuvering vehicles. The report is a catalog of the most useful information for the most significant feed system components and is organized for the greatest convenience of the user.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This large document provides a catalog of the locations of a large number of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Experiments (ACHRE) and is arranged as a series of appendices. Titles of the appendices are: Appendix A, Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B, Brief Descriptions of Records Accessions in the ACHRE Research Document Collection; Appendix C, Bibliography of Secondary Sources Used by ACHRE; Appendix D, Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E, Documents Cited in the ACHRE Final Report and Other Separately Described Materials from the ACHRE Document Collection; Appendix F, Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G, Technology Note.
The DEIMOS 10K Spectroscopic Survey Catalog of the COSMOS Field
NASA Astrophysics Data System (ADS)
Hasinger, G.; Capak, P.; Salvato, M.; Barger, A. J.; Cowie, L. L.; Faisst, A.; Hemmati, S.; Kakazu, Y.; Kartaltepe, J.; Masters, D.; Mobasher, B.; Nayyeri, H.; Sanders, D.; Scoville, N. Z.; Suh, H.; Steinhardt, C.; Yang, Fengwei
2018-05-01
We present a catalog of 10,718 objects in the COSMOS field, observed through multi-slit spectroscopy with the Deep Imaging Multi-Object Spectrograph (DEIMOS) on the Keck II telescope in the wavelength range ∼5500-9800 Å. The catalog contains 6617 objects with high-quality spectra (two or more spectral features) and 1798 objects with a single spectroscopic feature confirmed by the photometric redshift. For 2024 typically faint objects, we could not obtain reliable redshifts. The objects have been selected from a variety of input catalogs based on multi-wavelength observations in the field and thus have a diverse selection function, which enables the study of the diversity in the galaxy population. The magnitude distribution of our objects peaks at I_AB ∼ 23 and K_AB ∼ 21, with a secondary peak at K_AB ∼ 24. We sample a broad redshift distribution in the range 0 < z < 6, with one peak at z ∼ 1 and another around z ∼ 4. We have identified 13 redshift spikes at z > 0.65 with chance probabilities < 4 × 10^-4, some of which are clearly related to protocluster structures of sizes > 10 Mpc. An object-to-object comparison with a multitude of other spectroscopic samples in the same field shows that our DEIMOS sample is among the best in terms of fraction of spectroscopic failures and relative redshift accuracy. We have determined the fraction of spectroscopic blends to be about 0.8% in our sample. This is likely a lower limit and at any rate well below the most pessimistic expectations. Interestingly, we find evidence for strong lensing of Lyα background emitters within the slits of 12 of our target galaxies, increasing their apparent density by about a factor of 4. The data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration.
The Observatory was made possible by the generous financial support of the W. M. Keck Foundation.
Nestler-Parr, Sandra; Korchagina, Daria; Toumi, Mondher; Pashos, Chris L; Blanchette, Christopher; Molsen, Elizabeth; Morel, Thomas; Simoens, Steven; Kaló, Zoltán; Gatermann, Ruediger; Redekop, William
2018-05-01
Successful development of new treatments for rare diseases (RDs) and their sustainable patient access require overcoming a series of challenges related to research and health technology assessment (HTA). These impediments, which may be unique to RDs or also apply to common diseases but are particularly pertinent in RDs, are diverse and interrelated. To develop for the first time a catalog of primary impediments to RD research and HTA, and to describe the cause and effect of individual challenges. Challenges were identified by an international 22-person expert working group and qualitative outreach to colleagues with relevant expertise. A broad range of stakeholder perspectives is represented. Draft results were presented at annual European and North American International Society for Pharmacoeconomics and Outcomes Research (ISPOR) congresses, and written comments were received by the 385-strong ISPOR Rare Disease Review Group from two rounds of review. Findings were refined and confirmed via targeted literature search. Research-related challenges linked to the low prevalence of RDs were categorized into those pertaining to disease recognition and diagnosis, evaluation of treatment effect, and patient recruitment for clinical research. HTA-related challenges were classified into issues relating to the lack of a tailored HTA method for RD treatments and uncertainty for HTA agencies and health care payers. Identifying and highlighting diverse, but interrelated, key challenges in RD research and HTA is an essential first step toward developing implementable and sustainable solutions. A collaborative multistakeholder effort is required to enable faster and less costly development of safe, efficacious, and appropriate new RD therapies that offer value for money. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
A Science Rationale for Mobility in Planetary Environments
NASA Technical Reports Server (NTRS)
1999-01-01
For the last several decades, the Committee on Planetary and Lunar Exploration (COMPLEX) has advocated a systematic approach to exploration of the solar system; that is, the information and understanding resulting from one mission provide the scientific foundations that motivate subsequent, more elaborate investigations. COMPLEX's 1994 report, An Integrated Strategy for the Planetary Sciences: 1995-2010, advocated an approach to planetary studies emphasizing "hypothesizing and comprehending" rather than "cataloging and categorizing." More recently, NASA reports, including The Space Science Enterprise Strategic Plan and, in particular, Mission to the Solar System: Exploration and Discovery - A Mission and Technology Roadmap, have outlined comprehensive plans for planetary exploration during the next several decades. The missions outlined in these plans are both generally consistent with the priorities outlined in the Integrated Strategy and other NRC reports, and are replete with examples of devices embodying some degree of mobility in the form of rovers, robotic arms, and the like. Because the change in focus of planetary studies called for in the Integrated Strategy appears to require an evolutionary change in the technical means by which solar system exploration missions are conducted, the Space Studies Board charged COMPLEX to review the science that can be uniquely addressed by mobility in planetary environments. In particular, COMPLEX was asked to address the following questions: (1) What are the practical methods for achieving mobility? (2) For surface missions, what are the associated needs for sample acquisition? (3) What is the state of technology for planetary mobility in the United States and elsewhere, and what are the key requirements for technology development? (4) What terrestrial field demonstrations are required prior to spaceflight missions?
Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora
2017-12-01
Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. We observed heterogeneity in the reporting of metrics as well as misclassification and misuse of metrics within measures. Although measures and metrics for feasibility, preference, and patient experience were poorly defined, the acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for the monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization.
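The two categories and four subcategories described in this record can be encoded as a small lookup structure for classifying reported metrics; this is an illustrative sketch only, and the metric-to-subcategory assignments follow the examples given in the abstract while the exact names are assumptions:

```python
# Hedged sketch of the feasibility framework's taxonomy: two measure
# categories, four subcategories, and example metrics from the abstract.
FRAMEWORK = {
    "implementation-centered": {
        "feasibility": ["consent rate", "completion rate", "new infection rate",
                        "linkage rate", "turnaround time"],
    },
    "patient-centered": {
        "acceptability": ["willingness to test"],
        "preference": ["preferred test type"],
        "patient experience": ["convenience", "comfort", "pain", "satisfaction"],
    },
}

def classify(metric):
    """Return (category, subcategory) for a reported metric, or None if
    the metric is not part of the illustrative taxonomy."""
    for category, subcats in FRAMEWORK.items():
        for subcat, metrics in subcats.items():
            if metric in metrics:
                return category, subcat
    return None
```

A consistent mapping of this kind is what would prevent the misclassification the review observed, e.g. turnaround time being reported as a patient-experience metric rather than a feasibility metric.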
Pant Pai, Nitika; Chiavegatti, Tiago; Vijh, Rohit; Karatzas, Nicolaos; Daher, Jana; Smallwood, Megan; Wong, Tom; Engel, Nora
2017-01-01
Objective: Pilot (feasibility) studies form a vast majority of diagnostic studies with point-of-care technologies but often lack use of clear measures/metrics and a consistent framework for reporting and evaluation. To fill this gap, we systematically reviewed data to (a) catalog feasibility measures/metrics and (b) propose a framework. Methods: For the period January 2000 to March 2014, 2 reviewers searched 4 databases (MEDLINE, EMBASE, CINAHL, Scopus), retrieved 1441 citations, and abstracted data from 81 studies. We observed 2 major categories of measures, that is, implementation centered and patient centered, and 4 subcategories of measures, that is, feasibility, acceptability, preference, and patient experience. We defined and delineated metrics and measures for a feasibility framework. We documented impact measures for a comparison. Findings: We observed heterogeneity in the reporting of metrics as well as misclassification and misuse of metrics within measures. Although measures and metrics for feasibility, preference, and patient experience were poorly defined, the acceptability measure was the best defined. For example, within feasibility, metrics such as consent, completion, new infection, linkage rates, and turnaround times were misclassified and reported. Similarly, patient experience was variously reported as test convenience, comfort, pain, and/or satisfaction. In contrast, within impact measures, all the metrics were well documented, thus serving as a good baseline comparator. With our framework, we classified, delineated, and defined quantitative measures and metrics for feasibility. Conclusions: Our framework, with its defined measures/metrics, could reduce misclassification and improve the overall quality of reporting for the monitoring and evaluation of rapid point-of-care technology strategies and their context-driven optimization. PMID:29333105
A Scientific Rationale for Mobility in Planetary Environments
NASA Astrophysics Data System (ADS)
1999-01-01
For the last several decades, the Committee on Planetary and Lunar Exploration (COMPLEX) has advocated a systematic approach to exploration of the solar system; that is, the information and understanding resulting from one mission provide the scientific foundations that motivate subsequent, more elaborate investigations. COMPLEX's 1994 report, An Integrated Strategy for the Planetary Sciences: 1995-2010, advocated an approach to planetary studies emphasizing "hypothesizing and comprehending" rather than "cataloging and categorizing." More recently, NASA reports, including The Space Science Enterprise Strategic Plan and, in particular, Mission to the Solar System: Exploration and Discovery - A Mission and Technology Roadmap, have outlined comprehensive plans for planetary exploration during the next several decades. The missions outlined in these plans are both generally consistent with the priorities outlined in the Integrated Strategy and other NRC reports, and are replete with examples of devices embodying some degree of mobility in the form of rovers, robotic arms, and the like. Because the change in focus of planetary studies called for in the Integrated Strategy appears to require an evolutionary change in the technical means by which solar system exploration missions are conducted, the Space Studies Board charged COMPLEX to review the science that can be uniquely addressed by mobility in planetary environments. In particular, COMPLEX was asked to address the following questions: 1. What are the practical methods for achieving mobility? 2. For surface missions, what are the associated needs for sample acquisition? 3. What is the state of technology for planetary mobility in the United States and elsewhere, and what are the key requirements for technology development? 4. What terrestrial field demonstrations are required prior to spaceflight missions?
Search and Graph Database Technologies for Biomedical Semantic Indexing: Experimental Analysis.
Segura Bedmar, Isabel; Martínez, Paloma; Carruana Martín, Adrián
2017-12-01
Biomedical semantic indexing is a very useful support tool for human curators in their efforts to index and catalog the biomedical literature. The aim of this study was to describe a system that automatically assigns Medical Subject Headings (MeSH) to biomedical articles from MEDLINE. Our approach relies on the assumption that similar documents should be classified by similar MeSH terms. Although previous work has already exploited document similarity by using a k-nearest neighbors algorithm, we represent documents as document vectors by search engine indexing and then compute the similarity between documents using cosine similarity. Once the most similar documents for a given input document are retrieved, we rank their MeSH terms to choose the most suitable set for the input document. To do this, we define a scoring function that takes into account the frequency of the term in the set of retrieved documents and the similarity between the input document and each retrieved document. In addition, we implement guidelines proposed by human curators to annotate MEDLINE articles; in particular, the heuristic that if 3 MeSH terms are proposed to classify an article and they share the same ancestor, they should be replaced by this ancestor. The representation of the MeSH thesaurus as a graph database allows us to employ graph search algorithms to quickly and easily capture hierarchical relationships such as the lowest common ancestor between terms. Our experiments show promising results, with an F1 of 69% on the test dataset. To the best of our knowledge, this is the first work that combines search and graph database technologies for the task of biomedical semantic indexing. Owing to its horizontal scalability, Elasticsearch is a practical solution for indexing large collections of documents (such as the bibliographic database MEDLINE).
Moreover, the use of graph search algorithms for accessing MeSH information could provide a support tool for cataloging MEDLINE abstracts in real time. ©Isabel Segura Bedmar, Paloma Martínez, Adrián Carruana Martín. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 01.12.2017.
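The similarity-weighted term ranking this record describes can be sketched as follows. The abstract does not give the exact scoring function, so combining a term's frequency among retrieved documents with each document's cosine similarity by simple summation is an assumption, and all names are illustrative:

```python
# Hedged sketch (not the authors' exact formula): score each candidate MeSH
# term by summing the cosine similarities of the retrieved documents that
# carry it. A term appearing in many highly similar documents scores highest,
# which captures both term frequency and document similarity at once.
from collections import defaultdict

def rank_mesh_terms(neighbors):
    """neighbors: list of (cosine_similarity, mesh_terms) pairs for the
    documents most similar to the input article."""
    scores = defaultdict(float)
    for similarity, terms in neighbors:
        for term in terms:
            scores[term] += similarity  # frequency weighted by similarity
    return sorted(scores, key=scores.get, reverse=True)
```

For example, with retrieved neighbors `[(0.9, ["Humans", "Neoplasms"]), (0.7, ["Humans"]), (0.4, ["Mice"])]`, "Humans" (score 1.6) outranks "Neoplasms" (0.9) and "Mice" (0.4). The ancestor-replacement heuristic would then run as a post-processing step over the top-ranked terms using the MeSH graph.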
Semantics Enabled Queries in EuroGEOSS: a Discovery Augmentation Approach
NASA Astrophysics Data System (ADS)
Santoro, M.; Mazzetti, P.; Fugazza, C.; Nativi, S.; Craglia, M.
2010-12-01
One of the main challenges in Earth Science Informatics is to build interoperability frameworks that allow users to discover, evaluate, and use information from different scientific domains. This requires addressing multidisciplinary interoperability challenges concerning both technological and scientific aspects. From the technological point of view, it is necessary to provide a set of special interoperability arrangements in order to develop flexible frameworks that allow a variety of loosely coupled services to interact with each other. From a scientific point of view, it is necessary to document clearly the theoretical and methodological assumptions underpinning applications in different scientific domains and to develop cross-domain ontologies that facilitate interdisciplinary dialogue and understanding. In this presentation we discuss a brokering approach that extends the traditional Service Oriented Architecture (SOA) adopted by most Spatial Data Infrastructures (SDIs) to provide the necessary special interoperability arrangements. In the EC-funded EuroGEOSS (A European Approach to GEOSS) project, we distinguish among three possible functional brokering components: discovery, access, and semantics brokers. This presentation focuses on the semantics broker, the Discovery Augmentation Component (DAC), which was specifically developed to address the three thematic areas covered by the EuroGEOSS project: biodiversity, forestry, and drought. The EuroGEOSS DAC federates both semantic services (e.g., SKOS repositories) and ISO-compliant geospatial catalog services. The DAC can be queried using common geospatial constraints (i.e., what, where, when, etc.). Two different augmented discovery styles are supported: (a) automatic query expansion and (b) user-assisted query expansion. In the first case, the main discovery steps are: i. the query keywords (the "what" constraint) are "expanded" with related concepts/terms retrieved from the set of federated semantic services.
By default, the expansion includes multilingual relationships; ii. the resulting queries are submitted to the federated catalog services; iii. the DAC performs a "smart" aggregation of the query results and provides them back to the client. In the second case, the main discovery steps are: i. the user browses the federated semantic repositories and selects the concepts/terms of interest; ii. the DAC creates the set of geospatial queries based on the selected concepts/terms and submits them to the federated catalog services; iii. the DAC performs a "smart" aggregation of the query results and provides them back to the client. A Graphical User Interface (GUI) was also developed for testing and interacting with the DAC. The entire brokering framework is deployed in the context of the EuroGEOSS infrastructure and is used in two GEOSS AIP-3 use scenarios: the "e-Habitat Use Scenario" for the Biodiversity and Climate Change topic, and the "Comprehensive Drought Index Use Scenario" for the Water/Drought topic.
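The automatic query expansion workflow of the DAC can be sketched as follows; this is a minimal illustration of the expand-submit-aggregate pattern, where the `related_terms` and `query` interfaces and the deduplication-by-identifier aggregation are assumptions rather than the actual DAC API:

```python
# Hedged sketch of DAC-style automatic query expansion:
# 1. expand the "what" keywords with related concepts from federated
#    semantic services, 2. submit one query per expanded term to every
#    federated catalog, 3. aggregate results, removing duplicates.
def expand_and_search(keywords, semantic_services, catalogs):
    # Step 1: expand each keyword with related concepts/terms.
    expanded = set(keywords)
    for service in semantic_services:
        for keyword in keywords:
            expanded.update(service.related_terms(keyword))
    # Steps 2-3: query every federated catalog and merge the results.
    results, seen = [], set()
    for term in sorted(expanded):
        for catalog in catalogs:
            for record in catalog.query(what=term):
                if record["id"] not in seen:  # "smart" aggregation: dedupe
                    seen.add(record["id"])
                    results.append(record)
    return results
```

The user-assisted style would skip step 1 and instead take the expanded term set directly from the user's selections in the federated semantic repositories.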
OrthoDB v8: update of the hierarchical catalog of orthologs and the underlying free software.
Kriventseva, Evgenia V; Tegenfeldt, Fredrik; Petty, Tom J; Waterhouse, Robert M; Simão, Felipe A; Pozdnyakov, Igor A; Ioannidis, Panagiotis; Zdobnov, Evgeny M
2015-01-01
Orthology, refining the concept of homology, is the cornerstone of evolutionary comparative studies. With the ever-increasing availability of genomic data, inference of orthology has become instrumental for generating hypotheses about gene functions crucial to many studies. This update of the OrthoDB hierarchical catalog of orthologs (http://www.orthodb.org) covers 3027 complete genomes, including the most comprehensive set of 87 arthropods, 61 vertebrates, 227 fungi and 2627 bacteria (sampling the most complete and representative genomes from over 11,000 available). In addition to the most extensive integration of functional annotations from UniProt, InterPro, GO, OMIM, model organism phenotypes and COG functional categories, OrthoDB uniquely provides evolutionary annotations including rates of ortholog sequence divergence, copy-number profiles, sibling groups and gene architectures. We re-designed the entirety of the OrthoDB website from the underlying technology to the user interface, enabling the user to specify species of interest and to select the relevant orthology level by the NCBI taxonomy. The text searches allow use of complex logic with various identifiers of genes, proteins, domains, ontologies or annotation keywords and phrases. Gene copy-number profiles can also be queried. This release comes with the freely available underlying ortholog clustering pipeline (http://www.orthodb.org/software). © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
Li, Guotian; Jain, Rashmi; Chern, Mawsheng; Pham, Nikki T; Martin, Joel A; Wei, Tong; Schackwitz, Wendy S; Lipzen, Anna M; Duong, Phat Q; Jones, Kyle C; Jiang, Liangrong; Ruan, Deling; Bauer, Diane; Peng, Yi; Barry, Kerrie W; Schmutz, Jeremy; Ronald, Pamela C
2017-06-01
The availability of a whole-genome sequenced mutant population and the cataloging of mutations of each line at a single-nucleotide resolution facilitate functional genomic analysis. To this end, we generated and sequenced a fast-neutron-induced mutant population in the model rice cultivar Kitaake (Oryza sativa ssp. japonica), which completes its life cycle in 9 weeks. We sequenced 1504 mutant lines at 45-fold coverage and identified 91,513 mutations affecting 32,307 genes, i.e., 58% of all rice genes. We detected an average of 61 mutations per line. Mutation types include single-base substitutions, deletions, insertions, inversions, translocations, and tandem duplications. We observed a high proportion of loss-of-function mutations. We identified an inversion affecting a single gene as the causative mutation for the short-grain phenotype in one mutant line. This result reveals the usefulness of the resource for efficient, cost-effective identification of genes conferring specific phenotypes. To facilitate public access to this genetic resource, we established an open access database called KitBase that provides access to sequence data and seed stocks. This population complements other available mutant collections and gene-editing technologies. This work demonstrates how inexpensive next-generation sequencing can be applied to generate a high-density catalog of mutations. © 2017 American Society of Plant Biologists. All rights reserved.
Li, Guotian; Jain, Rashmi; Chern, Mawsheng; ...
2017-06-02
The availability of a whole-genome sequenced mutant population and the cataloging of mutations of each line at a single-nucleotide resolution facilitate functional genomic analysis. To this end, we generated and sequenced a fast-neutron-induced mutant population in the model rice cultivar Kitaake (Oryza sativa ssp. japonica), which completes its life cycle in 9 weeks. We sequenced 1504 mutant lines at 45-fold coverage and identified 91,513 mutations affecting 32,307 genes, i.e., 58% of all rice genes. We detected an average of 61 mutations per line. Mutation types include single-base substitutions, deletions, insertions, inversions, translocations, and tandem duplications. We observed a high proportion of loss-of-function mutations. We identified an inversion affecting a single gene as the causative mutation for the short-grain phenotype in one mutant line. This result reveals the usefulness of the resource for efficient, cost-effective identification of genes conferring specific phenotypes. To facilitate public access to this genetic resource, we established an open access database called KitBase that provides access to sequence data and seed stocks. This population complements other available mutant collections and gene-editing technologies. In conclusion, this work demonstrates how inexpensive next-generation sequencing can be applied to generate a high-density catalog of mutations.
Trade Space Assessment for Human Exploration Mission Design
NASA Technical Reports Server (NTRS)
Joosten, B. Kent
2006-01-01
Many human space exploration mission architecture assessments have been performed over the years by diverse organizations and individuals. Direct comparison of metrics among these studies is extremely difficult due to widely varying assumptions involving projected technology readiness, mission goals, acceptable risk criteria, and socio-political environments. Constant over the years, however, have been the physical laws of celestial dynamics and rocket propulsion. A diverse yet finite architecture trade space should therefore exist that captures methods of human exploration - particularly of the Moon and Mars - by delineating technical trades and cataloging the physically realizable options of each. A particular architectural approach should then have a traceable path through this "trade tree". It should be pointed out that not every permutation of paths will result in a physically realizable mission approach, but cataloging options that have been examined by past studies should help guide future analysis. This effort was undertaken in two phases by multi-center NASA working groups in the spring and summer of 2004, using more than thirty years of past studies to "flesh out" the Moon-Mars human exploration trade space. The results are presented not as a "trade tree", which would be unwieldy, but as a "menu" of potential technical options as a function of mission phase. This is envisioned as a tool to aid future mission designers by offering guidance to relevant past analyses.
Earth Science Data Grid System
NASA Astrophysics Data System (ADS)
Chi, Y.; Yang, R.; Kafatos, M.
2004-12-01
The Earth Science Data Grid System (ESDGS) is a software system supporting earth science data storage and access. It is built upon the Storage Resource Broker (SRB) data grid technology. We have developed a complete data grid system consisting of an SRB server, which provides users uniform access to diverse storage resources in a heterogeneous computing environment, and a metadata catalog server (MCAT), which manages the metadata associated with data sets, users, and resources. We are also developing additional services for 1) metadata management, 2) geospatial, temporal, and content-based indexing, and 3) near/on-site data processing, in response to the unique needs of Earth science applications. In this paper, we describe the software architecture and components of the system and use a practical example, the storage and access of rainfall data from the Tropical Rainfall Measuring Mission (TRMM), to illustrate its functionality and features.
NASA CORE: Central Operation of Resources for Educators-Educational Materials Catalog
NASA Technical Reports Server (NTRS)
1998-01-01
The NASA Central Operation of Resources for Educators (CORE), established in cooperation with the Lorain County Joint Vocational School, serves as the worldwide distribution center for NASA-produced educational materials. For a minimal charge, CORE provides a valuable service to educators unable to visit one of the NASA Educator Resource Centers by making NASA educational audiovisual materials available through its mail order service. Through CORE's distribution network, the public has access to more than 200 videocassette, slide, and CD-ROM programs chronicling NASA's state-of-the-art research and technology. Through the use of these curriculum supplement materials, teachers can provide their students with the latest aerospace information. NASA's educational materials on aeronautics and space provide a springboard for classroom discussion of life science, physical science, astronomy, energy, Earth resources, environment, mathematics, and career education.
Secrets of Hidden Valley on Mars
2015-10-08
An image taken at the Hidden Valley site, en route to Mount Sharp, by NASA's Curiosity rover. A variety of mudstone strata in the area indicate a lakebed deposit, with river- and stream-related deposits nearby. Decoding the history of how these sedimentary rocks were formed, and over what period of time, was a key component in confirming the role of water and sedimentation in the formation of the floor of Gale Crater and Mount Sharp. This image was taken by the Mast Camera (Mastcam) on Curiosity on the 703rd Martian day, or sol, of the mission. Malin Space Science Systems, San Diego, built and operates Curiosity's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, built the rover and manages the project for NASA's Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA19840
NASA Technical Reports Server (NTRS)
Tarter, Jill C.
1993-01-01
The final report for the period 15 Mar. 1986 to 31 Mar. 1993 for the Cooperative Agreement is presented. The purpose of this Cooperative Agreement was to collaborate with NASA civil servant and contractor personnel, and other Institute personnel, in a project to use all available cataloged astronomical infrared data to construct a detailed three-dimensional model of the infrared sky. Areas of research included: IRAS colors of normal stars and the infrared excesses in Be stars; galactic structure; how to use the observed IRAS source counts as a function of position to deduce the physical structure of the galaxy; IRAS properties of metal-poor stars; IRAS database studies; and solar system exploration, including projects such as the Space Station Gas-Grain Simulator and the Mars Rover/Sample Return Mission.
Multilevel library instruction for emerging nursing roles.
Francis, B W; Fisher, C C
1995-10-01
As new nursing roles emerge that involve greater decision making than in the past, added responsibility for outcomes and cost control, and increased emphasis on primary care, the information-seeking skills needed by nurses change. A search of library and nursing literature indicates that there is little comprehensive library instruction covering all levels of nursing programs: undergraduate, returning registered nurses, and graduate students. The University of Florida is one of the few places that has such a multilevel, course-integrated curriculum in place for all entrants into the nursing program. Objectives have been developed for each stage of learning. The courses include instruction in the use of the online public access catalog, printed resources, and electronic databases. A library classroom equipped with the latest technology enables student interaction with electronic databases. This paper discusses the program and several methods used to evaluate it.
Navigation for the new millennium: Autonomous navigation for Deep Space 1
NASA Technical Reports Server (NTRS)
Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.;
1997-01-01
The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high-quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.
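The core of the cruise orbit-determination task, fixing the spacecraft position from line-of-sight images of asteroids with known ephemerides, can be illustrated with a toy least-squares triangulation. This is not the actual DS1 filter; the beacon positions and geometry below are invented for illustration.

```python
import numpy as np

def triangulate_position(beacons, directions):
    """Estimate an observer position from unit line-of-sight vectors to
    beacons with known positions (illustrative least-squares fix).
    Each sight line constrains the position via the projector that is
    orthogonal to the line of sight."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, u in zip(beacons, directions):
        u = u / np.linalg.norm(u)
        P = np.eye(3) - np.outer(u, u)  # projector orthogonal to sight line
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Two hypothetical beacon asteroids and a true observer position.
truth = np.array([1.0, 2.0, 0.5])
beacons = [np.array([10.0, 0.0, 0.0]), np.array([0.0, 12.0, 3.0])]
dirs = [p - truth for p in beacons]  # observer-to-beacon directions
est = triangulate_position(beacons, dirs)
```

With noise-free directions from two non-parallel sight lines, the normal matrix is full rank and the solve recovers the observer position exactly; a real system would fold many noisy observations and dynamics into a filter.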
The future is now: single-cell genomics of bacteria and archaea
Blainey, Paul C.
2013-01-01
Interest in the expanding catalog of uncultivated microorganisms, increasing recognition of heterogeneity among seemingly similar cells, and technological advances in whole-genome amplification and single-cell manipulation are driving considerable progress in single-cell genomics. Here, the spectrum of applications for single-cell genomics, key advances in the development of the field, and emerging methodology for single-cell genome sequencing are reviewed by example with attention to the diversity of approaches and their unique characteristics. Experimental strategies transcending specific methodologies are identified and organized as a road map for future studies in single-cell genomics of environmental microorganisms. Over the next decade, increasingly powerful tools for single-cell genome sequencing and analysis will play key roles in accessing the genomes of uncultivated organisms, determining the basis of microbial community functions, and fundamental aspects of microbial population biology. PMID:23298390
Taboada, María; Martínez, Diego; Pilo, Belén; Jiménez-Escrig, Adriano; Robinson, Peter N; Sobrido, María J
2012-07-31
Semantic Web technology can considerably catalyze translational genetics and genomics research in medicine, where the interchange of information between basic research and clinical levels becomes crucial. This exchange involves mapping abstract phenotype descriptions from research resources, such as knowledge databases and catalogs, to unstructured datasets produced through experimental methods and clinical practice. This is especially true for the construction of mutation databases. This paper presents a way of harmonizing abstract phenotype descriptions with patient data from clinical practice, and querying this dataset about relationships between phenotypes and genetic variants, at different levels of abstraction. Due to the current availability of ontological and terminological resources that have already reached some consensus in biomedicine, a reuse-based ontology engineering approach was followed. The proposed approach uses the Web Ontology Language (OWL) to represent the phenotype ontology and the patient model, the Semantic Web Rule Language (SWRL) to bridge the gap between phenotype descriptions and clinical data, and the Semantic Query-Enhanced Web Rule Language (SQWRL) to query relevant phenotype-genotype bidirectional relationships. The work tests the use of semantic web technology in the biomedical research domain of cerebrotendinous xanthomatosis (CTX), using a real dataset and ontologies. A framework to query relevant phenotype-genotype bidirectional relationships is provided. Phenotype descriptions and patient data were harmonized by defining 28 Horn-like rules in terms of the OWL concepts. In total, 24 patterns of SQWRL queries were designed following the initial list of competency questions. As the approach is based on OWL, the semantics of the framework adopt the standard open-world assumption.
This work demonstrates how semantic web technologies can be used to support the flexible representation and computational inference mechanisms required to query patient datasets at different levels of abstraction. The open-world assumption is especially well suited to describing only partially known phenotype-genotype relationships in a way that is easily extensible. In the future, this type of approach could offer researchers a valuable resource to infer new data from patient data for statistical analysis in translational research. In conclusion, phenotype description formalization and mapping to clinical data are two key elements for interchanging knowledge between basic and clinical research.
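The Horn-like rules that bridge clinical findings and abstract phenotype terms can be sketched as a tiny forward-chaining loop. The finding and phenotype identifiers below are invented for illustration and are not drawn from the actual CTX ontology or rule set.

```python
# Each rule is (body, head): if every fact in the body holds, assert the head.
# Identifiers are hypothetical, in the spirit of mapping clinical findings
# onto phenotype ontology terms with SWRL-style rules.
rules = [
    ({"cataract_onset_juvenile"}, "HP:juvenile_cataract"),
    ({"tendon_mass_achilles"}, "HP:tendon_xanthoma"),
    ({"HP:juvenile_cataract", "HP:tendon_xanthoma"}, "phenotype:CTX_suspected"),
]

def infer(facts, rules):
    """Apply Horn-like rules to a fact set until a fixed point is reached."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= facts and head not in facts:  # body satisfied, head new
                facts.add(head)
                changed = True
    return facts

derived = infer({"cataract_onset_juvenile", "tendon_mass_achilles"}, rules)
```

Note that, consistent with the open-world assumption discussed above, the absence of a fact here licenses no negative conclusion; the loop only ever adds derived facts.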
Optimal Fragmentation and Dispersion of Hazardous Near-Earth Objects
NASA Technical Reports Server (NTRS)
Wie, Bong
2012-01-01
The complex problem of protecting the Earth from the possibility of a catastrophic impact by a hazardous near-Earth object (NEO) has been recently reassessed in [1]. In a letter on NEOs from the White House Office of Science and Technology Policy (OSTP) to the U.S. Senate and Congress in 2010, the White House OSTP strongly recommended that NASA take the lead in conducting research activities for NEO detection, characterization, and deflection technologies. Furthermore, President Obama's new National Space Policy specifically directs NASA to "pursue capabilities, in cooperation with other departments, agencies, and commercial partners, to detect, track, catalog, and characterize NEOs to reduce the risk of harm to humans from an unexpected impact on our planet." The Planetary Defense Task Force of the NASA Advisory Council also recommended that the NASA Office of the Chief Technologist (OCT) begin efforts to investigate asteroid deflection techniques. With national interest growing in the United States, the NEO threat detection and mitigation problem was recently identified as one of NASA's Space Technology Grand Challenges. An innovative solution to NASA's NEO Impact Threat Mitigation Grand Challenge problem was developed through a NIAC Phase I study (9/16/11 - 9/15/12), and it will be further investigated for a NIAC Phase II study (9/10/12 - 9/9/14). Various NEO deflection technologies, including nuclear explosions, kinetic impactors, and slow-pull gravity tractors, have been proposed and examined during the past two decades. Still, there is no consensus on how to reliably deflect or disrupt hazardous NEOs in a timely manner. It is expected that the most probable mission scenarios will have a mission lead time much shorter than 10 years, so the use of nuclear explosives becomes the most feasible method for planetary defense. 
Direct intercept missions with a short warning time will result in arrival closing velocities of 10-30 kilometers per second with respect to the target asteroid. Given such a large arrival delta V requirement, a rendezvous mission to the target asteroid is infeasible with existing launch vehicles. Furthermore, state-of-the-art penetrating subsurface nuclear explosion technology limits the penetrator's impact velocity to less than approximately 300 meters per second because higher impact velocities prematurely destroy the nuclear fuzing mechanisms. Therefore, significant advances in hypervelocity nuclear interceptor/ penetrator technology must be achieved to enable a last-minute nuclear disruption mission with intercept velocities as high as 30 kilometers per second. Consequently, a HAIV (Hypervelocity Asteroid Intercept Vehicle) mission architecture (Figure 1.1), which blends a hypervelocity kinetic impactor with a subsurface nuclear explosion for optimal fragmentation and dispersion of hazardous NEOs, has been developed through a Phase I study, and it will be further developed and validated through a Phase II study.
Adapting biomarker technologies to adverse outcome ...
Adverse outcome pathways (AOP) research is a relatively new concept in human systems biology for assessing the molecular-level linkage from an initiating (chemical) event that could lead to a disease state. Although most implementations of AOPs are based on liquid-phase analyses, new technologies are now being considered, derived from the broad field of breath research, especially in applications of gas-phase analysis and instrumentation. The ultimate goal is to discover disease progressions in human or animal systems, identify them at the molecular or cellular level, and then choose analytes that can distinctly define the presence of a particular path (Ankley et al. 2010, Villeneuve et al. 2014). Once such in vivo pathways are identified, then in vitro assays can be developed for streamlined testing of chemical effects without additional human or animal based studies (Pleil et al. 2012). Recent work has focused on discovery analysis in breath, or other biological media, wherein as many compounds as possible are cataloged and then linked to their biochemical source as exogenous (external), endogenous (internal) or from the microbiome (Pleil et al. 2013a, de Lacy Costello 2014, Pleil et al. 2013b, Trefz et al. 2013, Pleil et al. 2014). Such research lays the groundwork for identifying compounds from systems biology that might be relevant for developing AOPs for in vitro research. The National Exposure Research Laboratory's (NERL's) Human Exposure and Atm
NASA Astrophysics Data System (ADS)
Sanz, D.; Ruiz, M.; Castro, R.; Vega, J.; Afif, M.; Monroe, M.; Simrock, S.; Debelle, T.; Marawar, R.; Glass, B.
2016-04-01
To aid in assessing the functional performance of ITER, fission chambers (FC) deliver timestamped measurements of neutron source strength and fusion power. To demonstrate the Plant System Instrumentation & Control (I&C) required for such a system, the ITER Organization (IO) has developed a neutron diagnostics use case that fully complies with the guidelines presented in the Plant Control Design Handbook (PCDH). The implementation presented in this paper has been developed on the PXI Express (PXIe) platform using products from the ITER catalog of standard I&C hardware for fast controllers. Using FlexRIO technology, detector signals are acquired at 125 MS/s, while filtering, decimation, and three methods of neutron counting are performed in real time on the onboard Field Programmable Gate Array (FPGA). Measurement results are reported every 1 ms through Experimental Physics and Industrial Control System (EPICS) Channel Access (CA), with real-time timestamps derived from the ITER Timing Communication Network (TCN) based on IEEE 1588-2008. Furthermore, in accordance with ITER specifications for CODAC Core System (CCS) application development, the software responsible for the management, configuration, and monitoring of system devices has been developed using a new EPICS module called Nominal Device Support (NDS) and the RIO/FlexRIO design methodology.
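One of the counting modes performed on the FPGA can be illustrated in miniature as rising-edge threshold counting on a digitized detector trace. This is a simplified host-side sketch with synthetic data, not the FlexRIO implementation; the pulse shapes, noise level, and threshold are invented.

```python
import numpy as np

def count_pulses(signal, threshold):
    """Count rising-edge threshold crossings: a pulse is registered where
    the signal rises from below to above the threshold."""
    above = signal > threshold
    rising = above[1:] & ~above[:-1]
    return int(rising.sum())

# Synthetic trace: three Gaussian-shaped pulses on a small noisy baseline.
rng = np.random.default_rng(0)
t = np.arange(2000)
trace = 0.01 * rng.standard_normal(t.size)
for t0 in (300, 900, 1500):
    trace += np.exp(-0.5 * ((t - t0) / 5.0) ** 2)
n = count_pulses(trace, threshold=0.5)
```

A hardware implementation would typically add dead-time handling and hysteresis around the threshold so that noise near the trip point cannot double-count a single pulse.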
Transfer of heavy metals through terrestrial food webs: a review.
Gall, Jillian E; Boyd, Robert S; Rajakaruna, Nishanta
2015-04-01
Heavy metals are released into the environment by both anthropogenic and natural sources. Highly reactive and often toxic at low concentrations, they may enter soils and groundwater, bioaccumulate in food webs, and adversely affect biota. Heavy metals also may remain in the environment for years, posing long-term risks to life well after point sources of heavy metal pollution have been removed. In this review, we compile studies of the community-level effects of heavy metal pollution, including heavy metal transfer from soils to plants, microbes, invertebrates, and to both small and large mammals (including humans). Many factors contribute to heavy metal accumulation in animals including behavior, physiology, and diet. Biotic effects of heavy metals are often quite different for essential and non-essential heavy metals, and vary depending on the specific metal involved. They also differ for adapted organisms, including metallophyte plants and heavy metal-tolerant insects, which occur in naturally high-metal habitats (such as serpentine soils) and have adaptations that allow them to tolerate exposure to relatively high concentrations of some heavy metals. Some metallophyte plants are hyperaccumulators of certain heavy metals and new technologies using them to clean metal-contaminated soil (phytoextraction) may offer economically attractive solutions to some metal pollution challenges. These new technologies provide incentive to catalog and protect the unique biodiversity of habitats that have naturally high levels of heavy metals.
ISFET sensor evaluation and modification for seawater pH measurement
NASA Astrophysics Data System (ADS)
Martz, T. R.; Johnson, K. S.; Jannasch, H.; Coletti, L.; Barry, J.; Lovera, C.
2008-12-01
In the future, short-term cycles (daily to subannual) and long-term trends (annual and greater) in the carbonate system will be observed by autonomous sensors operating from a variety of platforms (e.g., moorings, profiling floats, AUVs, etc.). Of the four carbonate parameters, pH measurement has the longest history of development - yet robust autonomous sensing techniques remain elusive due to a catalog of technical challenges. Existing commercial sensor technologies generally do not meet the stringent demands of accuracy, long-term stability, low power, pressure tolerance, resistance to biofouling, and ease of use required by the oceanographic community. We report here on some recent advances in Ion Sensitive Field Effect Transistor (ISFET) technology that may open the door for more widespread autonomous seawater pH measurements. Much of our work has focused on applications of the Honeywell Durafet pH sensor, a product designed for industrial process control. Initial results from laboratory testing and deployments in the MBARI test tank and near shore moorings will be presented. Sensor calibration techniques will be addressed. Applications of now-available off-the-shelf sensors including shipboard underway measurement, shallow water mooring deployment, and a gas controlled seawater aquarium for pH perturbation experiments will be discussed. We hope that an ongoing collaboration between MBARI and Honeywell will result in a commercially available product, designed specifically for oceanographic applications, within the next several years.
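The Durafet-style measurement rests on a near-Nernstian voltage response of the ISFET, so a single-point calibration can be sketched as below. The reference readings are invented, and real seawater calibrations must also account for temperature, salinity, and the reference-electrode chemistry; this is only an idealized sketch.

```python
# Ideal Nernstian slope is ln(10)*R*T/F, about 59.16 mV per pH unit at 25 C.
R = 8.314462618   # gas constant, J / (mol K)
F = 96485.33212   # Faraday constant, C / mol
LN10 = 2.302585093

def nernst_slope_mv(temp_c):
    """Ideal electrode slope in mV per pH unit at the given temperature."""
    return 1000.0 * LN10 * R * (temp_c + 273.15) / F

def ph_from_voltage(e_mv, e_ref_mv, ph_ref, temp_c=25.0):
    """Convert a cell voltage (mV) to pH via one calibration point.
    The sign convention depends on the cell wiring; assumed here is that
    voltage increases with pH."""
    return ph_ref + (e_mv - e_ref_mv) / nernst_slope_mv(temp_c)

# Hypothetical calibration: 0 mV at pH 8.05; sensor now reads -5.9 mV.
ph = ph_from_voltage(-5.9, 0.0, 8.05)
```

The long-term stability problems mentioned in the abstract show up in this picture as drift of the reference point `e_ref_mv`, which is why autonomous deployments need periodic in-situ recalibration against discrete samples.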
VOClient: Application Integration in the Virtual Observatory
NASA Astrophysics Data System (ADS)
Fitzpatrick, Michael J.; Tody, D.
2007-12-01
We present VOClient, a new software package that provides a high-level, easy-to-use, programmable interface between desktop applications and the distributed VO framework, providing access to remote VO data and services, as well as reference implementations for VO data providers and end-user applications. Applications have traditionally been written to deal directly with local images, catalogs, or spectra; VOClient allows these applications to use remote VO data and services without requiring a developer to know the details of the underlying and evolving VO technologies. The programmable interface provides equivalent functionality for a wide variety of both legacy and modern development languages and environments and can be easily extended to add new functionality. The server component of the project provides a reference implementation and toolkit which can be used to build VO data services, and the command-line tools provide ready-to-use applications to access VO data and services from the desktop or scripting environment. The use of VOClient to integrate VO technologies with legacy systems such as IRAF is examined as a case study, and the use of these techniques in other environments, especially their applicability to legacy code and systems, is also discussed. VOClient is meant both for the astronomer wishing to revive an old and trusted task with new VO capabilities, as well as the institutional project providing data or services to the Virtual Observatory.
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan K.; Murphy, Charles; Moore, James; Wetzel, Melanie; Knight, David; Ruscher, Paul; Mullen, Steve; Desouza, Russel; Hawk, Denise S.; Fulker, David
1995-12-01
This report summarizes discussions that took place during a Unidata Cooperative Program for Operational Meteorology, Education, and Training (COMET) workshop on Mesoscale Meteorology Instruction in the Age of the Modernized Weather Service. The workshop was held 13-17 June 1994 in Boulder, Colorado, and it was organized by the Unidata Users Committee, with help from Unidata, COMET, and the National Center for Atmospheric Research staff. The principal objective of the workshop was to assess the need for and to initiate those changes at universities that will be required if students are to learn mesoscale and synoptic meteorology more effectively in this era of rapid technological advances. Seventy-one participants took part in the workshop, which included invited lectures, breakout roundtable discussions on focused topics, electronic poster sessions, and a forum for discussing recommendations and findings in a plenary session. Leading scientists and university faculty in the area of synoptic and mesoscale meteorology were invited to share their ideas for integrating data from new observing systems, research and operational weather prediction models, and interactive computer technologies into the classroom. As a result, many useful ideas for incorporating mesoscale datasets and analysis tools into the classroom emerged. Also, recommendations for future coordinated activities to create, catalog, and distribute case study datasets were made by the attendees.
TARPs: Tracked Active Region Patches from SoHO/MDI
NASA Astrophysics Data System (ADS)
Turmon, M.; Hoeksema, J. T.; Bobra, M.
2013-12-01
We describe progress toward creating a retrospective MDI data product consisting of tracked magnetic features on the scale of solar active regions, abbreviated TARPs (Tracked Active Region Patches). The TARPs are being developed as a backward-looking extension (covering approximately 3500 regions spanning 1996-2010) to the HARP (HMI Active Region Patch) data product that has already been released for HMI (2010-present). Like the HARPs, the MDI TARP data set is designed to be a catalog of active regions (ARs), indexed by a region ID number, analogous to a NOAA AR number, and time. TARPs from MDI are computed based on the 96-minute synoptic magnetograms and pseudo-continuum intensitygrams. As with the related HARP data product, the approximate threshold for significance is 100 G. Use of both image types together allows faculae and sunspots to be separated out as sub-classes of activity, in addition to identifying the overall active region that the faculae/sunspots are part of. After being identified in single images, the magnetically-active patches are grouped and tracked from image to image. Merges among growing active regions, as well as faint active regions hovering at the threshold of detection, are handled automatically. Regions are tracked from their inception until they decay within view, or transit off the visible disk. The final data product is indexed by a nominal AR number and time. For each active region and for each time, a bitmap image is stored containing the precise outline of the active region. Additionally, metadata such as areas and integrated fluxes are stored for each AR and for each time. Because there is a calibration between the HMI and MDI magnetograms (Liu, Hoeksema et al. 2012), it is straightforward to use the same classification and tracking rules for the HARPs (from HMI) and the MDI TARPs. We anticipate that this will allow a consistent catalog spanning both instruments.
We envision several uses for the TARP data product, which will be available in the MDI resident archive (RA). The catalog, indexed by AR number and time, eases data subsetting, which is useful to focus computationally expensive studies on just the active parts of the Sun. The catalog will enable per-AR studies, such as the relation between AR structure and energetic events like flares, in a way that can readily consider AR age and geometry. The TARP catalog, combined with the HARP catalog, could enable extended studies, such as solar irradiance, across cycles 23 and 24, and allow analyses that had been confined to just a handful of ARs to be extended to a larger set. A portion of this research was performed at the Jet Propulsion Laboratory, California Institute of Technology. All Rights Reserved. [Figure caption: A tracked AR as described here (compare NOAA 10095). Center panel: selected appearances of the AR. Right and left panels: snapshots at times T1 and T2. At starred times, the AR contains multiple unconnected pieces.]
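The single-image identification step, thresholding the magnetogram at roughly 100 G and grouping contiguous pixels into patches, can be sketched with a small connected-component labeler. This toy version ignores the intensitygram, limb effects, and the image-to-image tracking; the example field values are invented.

```python
import numpy as np

def label_patches(magnetogram, threshold=100.0):
    """Label 4-connected pixels whose |B| exceeds the threshold (gauss).
    Returns a label image and the number of patches found."""
    active = np.abs(magnetogram) > threshold
    labels = np.zeros(active.shape, dtype=int)
    n = 0
    for seed in zip(*np.nonzero(active)):
        if labels[seed]:
            continue                      # already part of an earlier patch
        n += 1
        stack = [seed]                    # flood fill from this seed
        while stack:
            i, j = stack.pop()
            if not active[i, j] or labels[i, j]:
                continue
            labels[i, j] = n
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < active.shape[0] and 0 <= nj < active.shape[1]:
                    stack.append((ni, nj))
    return labels, n

# Two separated patches of opposite polarity on a quiet background.
B = np.zeros((8, 8))
B[1:3, 1:3] = 500.0
B[5:7, 5:7] = -400.0
labels, n = label_patches(B)
```

Taking the absolute value before thresholding is what lets a single patch span both magnetic polarities, matching the idea that a TARP outlines the whole active region rather than one polarity.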
Heimdall System for MSSS Sensor Tasking
NASA Astrophysics Data System (ADS)
Herz, A.; Jones, B.; Herz, E.; George, D.; Axelrad, P.; Gehly, S.
In Norse Mythology, Heimdall uses his foreknowledge and keen eyesight to keep watch for disaster from his home near the Rainbow Bridge. Orbit Logic and the Colorado Center for Astrodynamics Research (CCAR) at the University of Colorado (CU) have developed the Heimdall System to schedule observations of known and uncharacterized objects and search for new objects from the Maui Space Surveillance Site. Heimdall addresses the current need for automated and optimized SSA sensor tasking driven by factors associated with improved space object catalog maintenance. Orbit Logic and CU developed an initial baseline prototype SSA sensor tasking capability for select sensors at the Maui Space Surveillance Site (MSSS) using STK and STK Scheduler, and then added a new Track Prioritization Component for FISST-inspired computations of predicted Information Gain and Probability of Detection, and a new SSA-specific Figure of Merit (FOM) for optimized SSA sensor tasking. While the baseline prototype addresses automation and some of the multi-sensor tasking optimization, the SSA-improved prototype addresses all of the key elements required for improved tasking leading to enhanced object catalog maintenance. The Heimdall proof of concept was demonstrated for MSSS SSA sensor tasking over a 24-hour period to attempt observations of all operational satellites in the unclassified NORAD catalog, observe a small set of high-priority GEO targets every 30 minutes, make a sky survey of the GEO belt region accessible to MSSS sensors, and observe particular GEO regions that have a high probability of yielding new objects with any excess sensor time. This Heimdall prototype software paves the way for further R&D that will integrate this technology into the MSSS systems for operational scheduling, improve the software's scalability, and further tune and enhance schedule optimization.
The Heimdall software for SSA sensor tasking provides greatly improved performance over manual tasking, improved coordinated sensor usage, and tasking schedules driven by catalog improvement goals (reduced overall covariance, etc.). The improved performance also enables more responsive sensor tasking to address external events, newly detected objects, newly detected object activity, and sensor anomalies. Instead of having to wait until the next day's scheduling phase, events can be addressed with new tasking schedules immediately (within seconds or minutes). Perhaps the most important benefit is improved SSA based on an overall improvement to the quality of the space catalog. By driving sensor tasking and scheduling based on predicted Information Gain and other relevant factors, better decisions are made in the application of available sensor resources, leading to an improved catalog and better information about the objects of most interest. The Heimdall software solution provides a configurable, automated system to improve sensor tasking efficiency and responsiveness for SSA applications. The FISST algorithms for Track Prioritization, SSA specific task and resource attributes, Scheduler algorithms, and configurable SSA-specific Figure-of-Merit together provide optimized and tunable scheduling for the Maui Space Surveillance Site and possibly other sites and organizations across the U.S. military and for allies around the world.
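The core idea of FOM-driven tasking, ranking candidate observations by predicted value and assigning sensor time accordingly, can be sketched as a greedy single-sensor scheduler. This is a toy illustration, not the Heimdall algorithm: the figure of merit here is simply the product of (hypothetical) information gain and detection probability, and the target data are invented.

```python
def schedule(candidates, n_slots):
    """Greedy single-sensor tasking: in each time slot, pick the visible
    target with the highest figure of merit (info_gain * p_detect)."""
    plan = {}
    for slot in range(n_slots):
        visible = [c for c in candidates if slot in c["slots"]]
        if visible:
            best = max(visible, key=lambda c: c["info_gain"] * c["p_detect"])
            plan[slot] = best["name"]
    return plan

# Hypothetical targets: visibility windows (slot indices), predicted
# information gain, and probability of detection.
candidates = [
    {"name": "GEO-1", "slots": {0, 1}, "info_gain": 3.0, "p_detect": 0.90},
    {"name": "LEO-7", "slots": {0},    "info_gain": 5.0, "p_detect": 0.40},
    {"name": "HEO-2", "slots": {1, 2}, "info_gain": 2.0, "p_detect": 0.95},
]
plan = schedule(candidates, 3)
```

A real tasking system would update each target's information gain after every scheduled observation (so covariance shrinks and priorities shift), handle multiple sensors, and re-plan within seconds when new objects or anomalies appear, which is exactly the responsiveness the abstract emphasizes.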
Facilitating Science Discoveries from NED Today and in the 2020s
NASA Astrophysics Data System (ADS)
Mazzarella, Joseph M.; NED Team
2018-06-01
I will review recent developments, work in progress, and major challenges that lie ahead as we enhance the capabilities of the NASA/IPAC Extragalactic Database (NED) to facilitate and accelerate multi-wavelength research on objects beyond our Milky Way galaxy. The recent fusion of data for over 470 million sources from the 2MASS Point Source Catalog and approximately 750 million sources from the AllWISE Source Catalog (next up) with redshifts from the SDSS and other data in NED is increasing the holdings to over a billion distinct objects with cross-identifications, providing a rich resource for multi-wavelength research. Combining data across such large surveys, as well as integrating data from over 110,000 smaller but scientifically important catalogs and journal articles, presents many challenges, including the need to update the computing infrastructure and re-tool production and operations on a regular basis. Integration of the Firefly toolkit into the new user interface is ushering in a new phase of interactive data visualization in NED, with features and capabilities familiar to users of IRSA and the emerging LSST science user interface. Graphical characterizations of NED content and estimates of completeness in different sky and spectral regions are also being developed. A newly implemented service that follows the Table Access Protocol (TAP) enables astronomers to issue queries to the NED object directory using the Astronomical Data Query Language (ADQL), a standard shared in common with the NASA mission archives and other virtual observatories around the world. A brief review will be given of new science capabilities under development and planned for 2019-2020, as well as initiatives underway involving deployment of a parallel database, cloud technologies, machine learning, and first steps in bringing analysis capabilities close to the database in collaboration with IRSA.
I will close with some questions for the community to consider in helping us plan future science capabilities and directions for NED in the 2020s.
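A synchronous TAP query against a service like NED's can be assembled with the standard IVOA parameters (REQUEST, LANG, QUERY). The endpoint URL, table name, and column names below are illustrative assumptions; consult the NED TAP documentation for the actual schema.

```python
from urllib.parse import urlencode

# Hypothetical sync TAP endpoint for the NED object directory.
TAP_SYNC = "https://ned.ipac.caltech.edu/tap/sync"

# ADQL cone search around (202.48, 47.23) ICRS with a 0.1-degree radius;
# table and column names are assumptions for illustration.
adql = (
    "SELECT TOP 10 prefname, ra, dec "
    "FROM objdir "
    "WHERE CONTAINS(POINT('ICRS', ra, dec), "
    "CIRCLE('ICRS', 202.48, 47.23, 0.1)) = 1"
)

params = urlencode({
    "REQUEST": "doQuery",  # required by the IVOA TAP standard
    "LANG": "ADQL",
    "FORMAT": "votable",
    "QUERY": adql,
})
url = f"{TAP_SYNC}?{params}"
# The query can then be issued with any HTTP client, e.g.:
#   import urllib.request
#   votable = urllib.request.urlopen(url).read()
```

Because TAP and ADQL are shared standards, the same request shape works against other archives by swapping the endpoint and table names, which is what makes the protocol useful for cross-archive research.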
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monteleone, S.
This three-volume report contains 90 papers out of the 102 that were presented at the Twenty-First Water Reactor Safety Information Meeting held at the Bethesda Marriott Hotel, Bethesda, Maryland, during the week of October 25-27, 1993. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Germany, Japan, Russia, Switzerland, Taiwan, and the United Kingdom. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. Individual papers have been cataloged separately. This document, Volume 1, covers the following topics: Advanced Reactor Research; Advanced Instrumentation and Control Hardware; Advanced Control System Technology; Human Factors Research; Probabilistic Risk Assessment Topics; Thermal Hydraulics; and Thermal Hydraulic Research for Advanced Passive Light Water Reactors.
Proteome Studies of Filamentous Fungi
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Scott E.; Panisko, Ellen A.
2011-04-20
The continued fast pace of fungal genome sequence generation has enabled proteomic analysis of organisms spanning the breadth of the Kingdom Fungi. There is some phylogenetic bias to the current catalog of fungi with reasonable DNA sequence databases (genomic or EST) that could be analyzed at a global proteomic level. However, the rapid development of next-generation sequencing platforms has lowered the cost of genome sequencing such that in the near future, having a genome sequence will no longer be a time or cost bottleneck for downstream proteomic (and transcriptomic) analyses. High-throughput, non-gel-based proteomics offers a snapshot of proteins present in a given sample at a single point in time. There are a number of different variations on the general method and technologies for identifying peptides in a given sample. We present a method that can serve as a "baseline" for proteomic studies of fungi.
A catalog of automated analysis methods for enterprise models.
Florez, Hector; Sánchez, Mario; Villalobos, Jorge
2016-01-01
Enterprise models are created for documenting and communicating the structure and state of the business and information technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated, making omissions or miscalculations very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
Harnessing Electricity from Chemical Gardens
2015-08-05
This photo simulation shows a laboratory-created "chemical garden," which is a chimney-like structure found at bubbling vents on the seafloor. Some researchers think life on Earth might have got its start at structures like these billions of years ago, partly due to their ability to transfer electrical currents -- an essential trait of life as we know it. The battery-like property of these chemical gardens was demonstrated by linking several together in series to light an LED (light-emitting diode) bulb. In this photo simulation, the bulb is not really attached to the chimney. The chimney membranes are made of iron sulfides and iron hydroxides, geologic materials that conduct electrons. JPL's research team is part of the Icy Worlds team of the NASA Astrobiology Institute, based at NASA's Ames Research Center in Moffett Field, California. JPL is managed by the California Institute of Technology in Pasadena for NASA. http://photojournal.jpl.nasa.gov/catalog/PIA19834
Clean room survey and assessment, volume 5, appendix H
NASA Technical Reports Server (NTRS)
1991-01-01
The scope of this task is to perform a comparative analysis of the various Environmental Control Life Support System (ECLSS) options for different growth scenarios. The Space Station Freedom ECLSS design and existing ground-based clean room facilities are used as a baseline for comparison. Specifically addressed here are the ground based clean room facilities at the Marshall Space Flight Center (MSFC). Given here is an evaluation of the facilities, equipment, technologies, and procedures used to maintain specified environments in typical aerospace industrial areas. Twenty-five specific clean rooms are evaluated. The objectives were to collect, compare, and catalog data for each specified facility in the areas of engineering and design, construction materials, work stations, contamination control, particulate elimination, entry systems, and instrumentation, and to make recommendations concerning enhancements required to assure an efficient and orderly evolution of MSFC clean room environmental control facilities.
NASA Technical Reports Server (NTRS)
1981-01-01
Preparation for the Apollo-Soyuz mission entailed a large-scale informational exchange that was accomplished by a computerized translation system. Based on this commercial machine-translation technology, a system known as SYSTRAN II was developed by LATSEC, Inc. and the World Translation Company of Canada. This system increases the output of a human translator by five to eight times, affording cost savings by allowing a large increase in document production without hiring additional people. Extra savings accrue from automatic production of camera-ready copy. Applications include translation of service manuals, proposals and tenders, planning studies, catalogs, lists of parts and prices, textbooks, technical reports, and education/training materials. The system is operational for six language pairs. SYSTRAN users include Xerox Corporation, General Motors of Canada, Bell Northern Research of Canada, the U.S. Air Force, and the European Commission. The company responsible for the production of SYSTRAN II has changed its name to SYSTRAN.
McClure, L W
1998-01-01
From the vantage point of her personal experience, the author examines milestones since the 1960s which have changed the medical library profession and helped shape the Medical Library Association. The advent of automation, including cataloging with OCLC and online literature searching through the SUNY Biomedical Communication Network, was a dramatic event that transformed the work and priorities of librarians, fulfilling the dreams of earlier visionaries. The application of technology in libraries led to an increased demand for education and training for librarians. The Medical Library Association responded with continuing education programs, and a series of important reports influenced how the association filled its role in professional development. Legislation providing federal funding, such as the Medical Library Assistance Act, resulted in a period of expansion for libraries and their services. The Medical Library Association has developed a legislative agenda to influence action in areas such as copyright. In the future, health sciences librarians must take a leadership role. PMID:9578947
Ferderer, David A.
2001-01-01
Documented, reliable, and accessible data and information are essential building blocks supporting scientific research and applications that enhance society's knowledge base (fig. 1). The U.S. Geological Survey (USGS), a leading provider of science data, information, and knowledge, is uniquely positioned to integrate science and natural resource information to address societal needs. The USGS Central Energy Resources Team (USGS-CERT) provides critical information and knowledge on the quantity, quality, and distribution of the Nation's and the world's oil, gas, and coal resources. By using a life-cycle model, the USGS-CERT Data Management Project is developing an integrated data management system to (1) promote access to energy data and information, (2) increase data documentation, and (3) streamline product delivery to the public, scientists, and decision makers. The project incorporates web-based technology, data cataloging systems, data processing routines, and metadata documentation tools to improve data access, enhance data consistency, and increase office efficiency.
Note: Sub-Kelvin refrigeration with dry-coolers on a rotating system.
Oguri, S; Ishitsuka, H; Choi, J; Kawai, M; Tajima, O
2014-08-01
We developed a cryogenic system on a rotating table that achieves sub-Kelvin conditions. The cryogenic system consists of a helium sorption cooler and a pulse tube cooler in a cryostat mounted on a rotating table. Two rotary-joint connectors for electricity and helium gas circulation enable the coolers to be operated and maintained with ease. We performed cool-down tests under continuous rotation at 20 rpm. We obtained a temperature of 0.23 K with a holding time of more than 24 h, complying with catalog specifications. We monitored the system's performance for four weeks: two weeks with rotation and two without. A difference of a few percent was observed between the two states, slight enough that most applications can tolerate it. The technology developed is useful for various scientific applications requiring sub-Kelvin conditions on rotating platforms.
Formal Methods for Autonomic and Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Swarms of intelligent rovers and spacecraft are being considered for a number of future NASA missions. These missions would give NASA scientists and explorers greater flexibility and the chance to gather more science than traditional single-spacecraft missions. Such swarms of spacecraft are intended to operate for long periods of time without contact with the Earth. To do this, they must be highly autonomous, have autonomic properties, and utilize sophisticated artificial intelligence. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm-type missions NASA is considering. This mission will explore the asteroid belt using an insect colony analogy, cataloging the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. Verifying such a system would be a huge task. This paper discusses ongoing work to develop a formal method for verifying swarm and autonomic systems.
Developing Interoperable Air Quality Community Portals
NASA Astrophysics Data System (ADS)
Falke, S. R.; Husar, R. B.; Yang, C. P.; Robinson, E. M.; Fialkowski, W. E.
2009-04-01
Web portals are intended to provide consolidated discovery, filtering and aggregation of content from multiple, distributed web sources targeted at particular user communities. This paper presents a standards-based information architectural approach to developing portals aimed at air quality community collaboration in data access and analysis. An important characteristic of the approach is to advance beyond the present stand-alone design of most portals to achieve interoperability with other portals and information sources. We show how using metadata standards, web services, RSS feeds and other Web 2.0 technologies, such as Yahoo! Pipes and del.icio.us, helps increase interoperability among portals. The approach is illustrated within the context of the GEOSS Architecture Implementation Pilot where an air quality community portal is being developed to provide a user interface between the portals and clearinghouse of the GEOSS Common Infrastructure and the air quality community catalog of metadata and data services.
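The RSS-based interoperability described above amounts to merging item streams from several portals into one consolidated, chronologically ordered view (the pattern Yahoo! Pipes popularized). The following is a hypothetical standard-library sketch of that pattern, not the GEOSS portal's actual code; the function names are illustrative.

```python
from email.utils import parsedate_to_datetime
import xml.etree.ElementTree as ET

def parse_rss_items(rss_xml):
    """Extract (pubDate, title) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        pubdate = parsedate_to_datetime(item.findtext("pubDate"))
        items.append((pubdate, title))
    return items

def aggregate_feeds(feeds):
    """Merge items from several feeds into one newest-first stream."""
    merged = []
    for xml_doc in feeds:
        merged.extend(parse_rss_items(xml_doc))
    return [title for _, title in sorted(merged, reverse=True)]
```

A portal front end would render the merged list, regardless of which community source each item came from.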
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
VizieR Online Data Catalog: Transition probabilities in TeII + TeIII spectra (Zhang+, 2013)
NASA Astrophysics Data System (ADS)
Zhang, W.; Palmeri, P.; Quinet, P.; Biemont, E.
2013-02-01
Computed weighted oscillator strengths (loggf) and transition probabilities (gA) for Te II (Table 8) and Te III (Table 9). Only transitions with wavelengths <1um, loggf>-1 and CF>0.05 are quoted. Air wavelengths are given above 200 nm. In Table 8 the levels are taken from Kramida et al. (Kramida, A., Ralchenko, Yu., Reader, J., and NIST ASD Team (2012). NIST Atomic Spectra Database (ver. 5.0), [Online]. Available: http://physics.nist.gov/asd [2012, September 20]. National Institute of Standards and Technology, Gaithersburg, MD.). In Table 9 the levels are those given in Tauheed & Naz (Tauheed, A., Naz, A. 2011, Journal of the Korean Physical Society 59, 2910), with the exception of the 5p6p levels, which were taken from Kramida et al. The wavelengths were computed from the experimental levels of Kramida et al. and Tauheed & Naz. (2 data files).
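For readers converting between the two tabulated quantities, weighted oscillator strength gf and weighted transition probability gA are linked through the wavelength by the standard relation gf = 1.499e-16 * lambda^2[Angstrom] * gA (constant as commonly quoted by NIST). A small sketch with hypothetical function names:

```python
import math

# Standard relation between weighted oscillator strength and weighted
# transition probability: gf = 1.499e-16 * lambda_A**2 * gA, with gA in
# s^-1 and the wavelength in Angstroms.

def gA_from_loggf(loggf, wavelength_angstrom):
    """Weighted transition probability gA (s^-1) from log gf."""
    gf = 10.0 ** loggf
    return gf / (1.499e-16 * wavelength_angstrom ** 2)

def loggf_from_gA(gA, wavelength_angstrom):
    """log gf from weighted transition probability gA (s^-1)."""
    return math.log10(1.499e-16 * wavelength_angstrom ** 2 * gA)
```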
VizieR Online Data Catalog: Photometry and spectra of SN 2017dio (Kuncarayakti+, 2018)
NASA Astrophysics Data System (ADS)
Kuncarayakti, H.; Maeda, K.; Ashall, C. J.; Prentice, S. J.; Mattila, S.; Kankare, E.; Fransson, C.; Lundqvist, P.; Pastorello, A.; Leloudas, G.; Anderson, J. P.; Benetti, S.; Bersten, M. C.; Cappellaro, E.; Cartier, R.; Denneau, L.; Della Valle, M.; Elias-Rosa, N.; Folatelli, G.; Fraser, M.; Galbany, L.; Gall, C.; Gal-Yam, A.; Gutierrez, C. P.; Hamanowicz, A.; Heinze, A.; Inserra, C.; Kangas, T.; Mazzali, P.; Melandri, A.; Pignata, G.; Rest, A.; Reynolds, T.; Roy, R.; Smartt, S. J.; Smith, K. W.; Sollerman, J.; Somero, A.; Stalder, B.; Stritzinger, M.; Taddia, F.; Tomasella, L.; Tonry, J.; Weiland, H.; Young, D. R.
2018-02-01
ugriz + JHK photometry and optical spectra of SN 2017dio. Optical photometry was obtained in ugriz filters using ALFOSC at the Nordic Optical Telescope (NOT), IO:O at the Liverpool Telescope (LT), and Spectral at the 2 m telescopes of Las Cumbres Observatory. JHK photometry was obtained using NOT/NOTCam at one epoch, 2017 July 27. Optical slit spectroscopy was obtained using ALFOSC at the NOT, EFOSC2 at the ESO New Technology Telescope (NTT, through the ePESSTO program), and SPRAT 45 at the LT. ALFOSC spectra cover 3300-9500Å with a resolution of 16Å. With EFOSC2, grism #13, the coverage was 3500-9300Å at a 21Å resolution. Additionally, in two epochs the EFOSC2 spectra were taken using grism #11 (16Å resolution). SPRAT covers 4000-8000Å with an 18Å resolution. (10 data files).
Proteome studies of filamentous fungi.
Baker, Scott E; Panisko, Ellen A
2011-01-01
The continued fast pace of fungal genome sequence generation has enabled proteomic analysis of a wide variety of organisms that span the breadth of the Kingdom Fungi. There is some phylogenetic bias to the current catalog of fungi with reasonable DNA sequence databases (genomic or EST) that could be analyzed at a global proteomic level. However, the rapid development of next generation sequencing platforms has lowered the cost of genome sequencing such that in the near future, having a genome sequence will no longer be a time or cost bottleneck for downstream proteomic (and transcriptomic) analyses. High throughput, nongel-based proteomics offers a snapshot of proteins present in a given sample at a single point in time. There are a number of variations on the general methods and technologies for identifying peptides in a given sample. We present a method that can serve as a "baseline" for proteomic studies of fungi.
HSI-Find: A Visualization and Search Service for Terascale Spectral Image Catalogs
NASA Astrophysics Data System (ADS)
Thompson, D. R.; Smith, A. T.; Castano, R.; Palmer, E. E.; Xing, Z.
2013-12-01
Imaging spectrometers are remote sensing instruments commonly deployed on aircraft and spacecraft. They provide surface reflectance in hundreds of wavelength channels, creating data cubes known as hyperspectral images. The rich compositional information they provide makes them powerful tools for planetary and terrestrial science. These data products can be challenging to interpret because they contain datapoints numbering in the thousands (Dawn VIR) or millions (AVIRIS-C). Cross-image studies or exploratory searches involving more than one scene are rare; data volumes are often tens of GB per image, and typical consumer-grade computers cannot hold more than a handful of images in RAM. Visualizing the information in a single scene is challenging, since the human eye can only distinguish three color channels out of the hundreds available. To date, analysis has been performed mostly on single images using purpose-built software tools that require extensive training and commercial licenses. The HSIFind software suite provides a scalable, distributed solution to the problem of visualizing and searching large catalogs of spectral image data. It consists of a RESTful web service that communicates with a javascript-based browser client. The software provides basic visualization through an intuitive visual interface, allowing users with minimal training to explore the images or view selected spectra. Users can accumulate a library of spectra from one or more images and use these to search for similar materials. The result appears as an intensity map showing the extent of a spectral feature in a scene. Continuum removal can isolate diagnostic absorption features. The server-side mapping algorithm uses an efficient matched filter algorithm that can process a megapixel image cube in just a few seconds. This enables real-time interaction, leading to a new way of interacting with the data: the user can launch a search with a single mouse click and see the resulting map in seconds.
This allows the user to quickly explore each image, ascertain the main units of surface material, localize outliers, and develop an understanding of the various materials' spectral characteristics. The HSIFind software suite is currently in beta testing at the Planetary Science Institute and a process is underway to release it under an open source license to the broader community. We believe it will benefit instrument operations during remote planetary exploration, where tactical mission decisions demand rapid analysis of each new dataset. The approach also holds potential for public spectral catalogs where its shallow learning curve and portability can make these datasets accessible to a much wider range of researchers. Acknowledgements: The HSIFind project acknowledges the NASA Advanced MultiMission Operating System (AMMOS) and the Multimission Ground Support Services (MGSS). E. Palmer is with the Planetary Science Institute, Tucson, AZ. Other authors are with the Jet Propulsion Laboratory, Pasadena, CA. This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology under a contract with the National Aeronautics and Space Administration. Copyright 2013, California Institute of Technology.
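The record above attributes HSIFind's speed to an efficient matched filter. As an illustration only (not the HSIFind code itself; the function name and numpy-based formulation are assumptions), a classical covariance-whitened matched filter over a flattened image cube looks like this:

```python
import numpy as np

def matched_filter_map(cube, target):
    """Classical matched filter score per pixel.

    cube:   (n_pixels, n_bands) array of spectra
    target: (n_bands,) reference spectrum selected by the user
    Returns one score per pixel, normalized so a pixel exactly
    matching the target scores 1.0.
    """
    mu = cube.mean(axis=0)                  # background mean spectrum
    centered = cube - mu
    cov = np.cov(centered, rowvar=False)    # band-to-band covariance
    cov += 1e-6 * np.eye(cov.shape[0])      # regularize for stability
    w = np.linalg.solve(cov, target - mu)   # whitened target direction
    w /= (target - mu) @ w                  # normalize: score 1 at target
    return centered @ w
```

The heavy work is one linear solve per search plus a matrix-vector product over the cube, which is why a megapixel cube can be scored in seconds.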
A design for a ground-based data management system
NASA Technical Reports Server (NTRS)
Lambird, Barbara A.; Lavine, David
1988-01-01
An initial design for a ground-based data management system which includes intelligent data abstraction and cataloging is described. The large quantity of data on some current and future NASA missions leads to significant problems in providing scientists with quick access to relevant data. Human screening of data for potential relevance to a particular study is time-consuming and costly. Intelligent databases can provide automatic screening when given relevant scientific parameters and constraints. The data management system would provide, at a minimum, information on the availability and range of data, the types available, the specific time periods covered together with data quality information, and related sources of data. The system would inform the user about the primary types of screening, analysis, and methods of presentation available, and would then aid the user in performing the desired tasks in such a way that the user need only specify the scientific parameters and objectives, without worrying about the specific details of running a particular program. The design contains modules for data abstraction, catalog plan abstraction, a user-friendly interface, and expert systems for data handling, data evaluation, and application analysis. The emphasis is on developing general facilities for data representation, description, analysis, and presentation that will be easily used by scientists directly, thus bypassing the knowledge acquisition bottleneck. Expert system technology is used for many aspects of the data management system, including the direct user interface, the interface to the data analysis routines, and the analysis of instrument status.
Next Generation Search Interfaces
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
2015-09-01
Astronomers are constantly looking for easier ways to access multiple data sets. While much effort is spent on the VO, little thought is given to the types of user interfaces needed to effectively search this sort of data. For instance, an astronomer might need to search the Spitzer, WISE, and 2MASS catalogs and images, then see the results presented together in one UI. Moving seamlessly between data sets is key to presenting integrated results. Results need to be viewed using first-class, web-based, integrated FITS viewers, XY plots, and advanced table display tools. These components should be able to handle very large datasets. Making a powerful web-based UI that can manage and present multiple searches to the user requires taking advantage of many HTML5 features. AJAX is used to start searches and present results. Push notifications (Server-Sent Events) monitor background jobs. Canvas is required for advanced result displays. Lesser-known CSS3 technologies make it all flow seamlessly together. At IPAC, we have been developing our Firefly toolkit for several years. We are now using it to solve this multiple-data-set, multiple-query, integrated-presentation problem and to create a powerful research experience. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). Firefly is the core for applications serving many project archives, including Spitzer, Planck, WISE, PTF, LSST and others. It is also used in IRSA's new Finder Chart and its catalog and image displays.
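The push-notification mechanism mentioned in the record above, Server-Sent Events, has a very simple wire format: each frame is a block of `event:`/`data:` lines terminated by a blank line. A minimal, hypothetical sketch (not Firefly's actual code) of producing such frames for a background search job:

```python
def sse_format(data, event=None):
    """Serialize one Server-Sent Event frame (text/event-stream)."""
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"   # blank line terminates the frame

def job_progress_stream(job_states):
    """Yield SSE frames for a background search job, one per state change."""
    for state in job_states:
        yield sse_format(state, event="job-status")
    yield sse_format("done", event="job-status")
```

On the browser side, an `EventSource` subscribed to this stream would receive one callback per frame, which is how long-running searches can update the UI without polling.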
NASA Astrophysics Data System (ADS)
Basoglu, Burak; Halicioglu, Kerem; Albayrak, Muge; Ulug, Rasit; Tevfik Ozludemir, M.; Deniz, Rasim
2017-04-01
In the last decade, the Turkish National Geodesy Commission has emphasized the importance of high-precision geoid determination at the local and national level and has placed the modernization of Turkey's national height system on the agenda. Several related projects have been realized in recent years. For the metropolitan area of Istanbul, a GNSS/levelling geoid was defined in 2005 with an accuracy of ±3.5 cm. In order to achieve better accuracy in this area, the project "Local Geoid Determination with Integration of GNSS/Levelling and Astro-Geodetic Data" has been conducted at Istanbul Technical University and Bogazici University KOERI since January 2016, funded by The Scientific and Technological Research Council of Turkey. Within the scope of the project, modernization studies of the Digital Zenith Camera System are being carried out in terms of hardware components and software development. Particular attention is given to the star catalogues and to the centroiding algorithm used to identify stars in the zenithal star field. During test observations of the Digital Zenith Camera System performed between 2013 and 2016, final results were calculated using the PSF method for star centroiding and the second USNO CCD Astrograph Catalogue (UCAC2) for the reference star positions. This study aims to investigate the positional accuracy of the star images by comparing different centroiding algorithms and available star catalogs used in astro-geodetic observations conducted with the digital zenith camera system.
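As a point of reference for the centroiding comparison described above, the simplest alternative to PSF fitting is a first-moment (intensity-weighted center-of-mass) centroid. The sketch below is illustrative only and assumes numpy; it is not the project's actual algorithm:

```python
import numpy as np

def moment_centroid(stamp, background=0.0):
    """Intensity-weighted center-of-mass centroid of a star image.

    stamp: 2-D array of pixel values around one star.
    Returns (row, col) in pixel coordinates. PSF fitting would
    typically refine this first-moment estimate, which is biased
    by noise and neighboring sources.
    """
    flux = np.clip(stamp - background, 0.0, None)  # reject negative residuals
    total = flux.sum()
    rows, cols = np.indices(stamp.shape)
    return (rows * flux).sum() / total, (cols * flux).sum() / total
```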
Architecture for the Interdisciplinary Earth Data Alliance
NASA Astrophysics Data System (ADS)
Richard, S. M.
2016-12-01
The Interdisciplinary Earth Data Alliance (IEDA) is leading an EarthCube (EC) Integrative Activity to develop a governance structure and technology framework that enables partner data systems to share technology, infrastructure, and practice for documenting, curating, and accessing heterogeneous geoscience data. The IEDA data facility provides capabilities in an extensible framework that enables domain-specific requirements for each partner system in the Alliance to be integrated into standardized cross-domain workflows. The shared technology infrastructure includes a data submission hub, a domain-agnostic file-based repository, an integrated Alliance catalog and a Data Browser for data discovery across all partner holdings, as well as services for registering identifiers for datasets (DOI) and samples (IGSN). The submission hub will be a platform that facilitates acquisition of cross-domain resource documentation and channels users into domain and resource-specific workflows tailored for each partner community. We are exploring an event-based message bus architecture with a standardized plug-in interface for adding capabilities. This architecture builds on the EC CINERGI metadata pipeline as well as the message-based architecture of the SEAD project. Plug-in components for file introspection to match entities to a data type registry (extending EC Digital Crust and Research Data Alliance work), extract standardized keywords (using CINERGI components), location, cruise, personnel and other metadata linkage information (building on GeoLink and existing IEDA partner components). The submission hub will feed submissions to appropriate partner repositories and service endpoints targeted by domain and resource type for distribution. The Alliance governance will adopt patterns (vocabularies, operations, resource types) for self-describing data services using standard HTTP protocol for simplified data access (building on EC GeoWS and other `RESTful' approaches). 
Exposure of resource descriptions (datasets and service distributions) for harvesting by commercial search engines as well as geoscience-data focused crawlers (like EC B-Cube crawler) will increase discoverability of IEDA resources with minimal effort by curators.
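The event-based message bus with a standardized plug-in interface described above can be illustrated with a minimal sketch. The event names, handler functions, and DOI prefix below are hypothetical placeholders, not IEDA's actual interfaces:

```python
from collections import defaultdict

class MessageBus:
    """Minimal event bus with a plug-in interface: components register
    handlers for named events (e.g. 'dataset.submitted') and the hub
    fans each published event out to every subscriber."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        return [handler(payload) for handler in self._handlers[event_type]]

# Hypothetical plug-ins in the spirit of the pipeline described above.
def extract_keywords(submission):
    """Stand-in for a CINERGI-style keyword extraction component."""
    return {"keywords": sorted(set(submission["title"].lower().split()))}

def register_doi(submission):
    """Stand-in for a DOI registration step (placeholder prefix)."""
    return {"doi": f"10.0000/{submission['id']}"}
```

New capabilities are added by subscribing another handler, which is the appeal of the plug-in architecture: the submission hub never needs to know which partner components are downstream.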
ROADNET: A Real-time Data Aware System for Earth, Oceanographic, and Environmental Applications
NASA Astrophysics Data System (ADS)
Vernon, F.; Hansen, T.; Lindquist, K.; Ludascher, B.; Orcutt, J.; Rajasekar, A.
2003-12-01
The Real-time Observatories, Application, and Data management Network (ROADNet) Program aims to develop an integrated, seamless, and transparent environmental information network that will deliver geophysical, oceanographic, hydrological, ecological, and physical data to a variety of users in real time. ROADNet is a multidisciplinary, multinational partnership of researchers, policymakers, natural resource managers, educators, and students who aim to use the data to advance our understanding and management of coastal, ocean, riparian, and terrestrial Earth systems in Southern California, Mexico, and well offshore. To date, project activity and funding have focused on the design and deployment of network linkages and on the exploratory development of the real-time data management system. We are currently adapting powerful "Data Grid" technologies to the unique challenges associated with the management and manipulation of real-time data. Current "Grid" projects deal with static data files, and significant technical innovation is required to address fundamental problems of real-time data processing, integration, and distribution. The technologies developed through this research will create a system that dynamically adapts downstream processing, cataloging, and data access interfaces when sensors are added to or removed from the system; provides for real-time processing and monitoring of data streams, detecting events and triggering computations, sensor and logger modifications, and other actions; integrates heterogeneous data from multiple (signal) domains; and provides for large-scale archival and querying of "consolidated" data. The software tools that must be developed do not exist, although limited prototype systems are available.
This research has implications for the success of large-scale NSF initiatives in the Earth sciences (EarthScope), ocean sciences (OOI- Ocean Observatories Initiative), biological sciences (NEON - National Ecological Observatory Network) and civil engineering (NEES - Network for Earthquake Engineering Simulation). Each of these large scale initiatives aims to collect real-time data from thousands of sensors, and each will require new technologies to process, manage, and communicate real-time multidisciplinary environmental data on regional, national, and global scales.
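As one concrete example of the real-time event detection and triggering the record above calls for, seismic real-time systems commonly use a short-term/long-term average (STA/LTA) ratio detector. The sketch below is a generic illustration of that technique, not ROADNet's implementation:

```python
def sta_lta_trigger(samples, n_sta, n_lta, threshold):
    """Classic short-term/long-term average detector.

    Returns indices where the STA/LTA ratio of the rectified signal
    first crosses `threshold` (a common real-time event trigger).
    """
    triggers = []
    armed = True
    for i in range(n_lta, len(samples)):
        energy = [abs(s) for s in samples[i - n_lta:i + 1]]
        sta = sum(energy[-n_sta:]) / n_sta       # short-term average
        lta = sum(energy) / len(energy)          # long-term average
        ratio = sta / lta if lta > 0 else 0.0
        if ratio >= threshold and armed:
            triggers.append(i)
            armed = False        # one trigger per threshold excursion
        elif ratio < threshold:
            armed = True
    return triggers
```

A production system would compute the averages recursively rather than re-summing each window, but the triggering logic is the same: when a burst of energy lifts the short-term average well above the long-term background, downstream computations are launched.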
The VERCE Science Gateway: Enabling User Friendly HPC Seismic Wave Simulations.
NASA Astrophysics Data System (ADS)
Casarotti, E.; Spinuso, A.; Matser, J.; Leong, S. H.; Magnoni, F.; Krause, A.; Garcia, C. R.; Muraleedharan, V.; Krischer, L.; Anthes, C.
2014-12-01
The EU-funded project VERCE (Virtual Earthquake and seismology Research Community in Europe) aims to deploy technologies that satisfy the HPC and data-intensive requirements of modern seismology. As a result of VERCE's official collaboration with the EU project SCI-BUS, access to computational resources, such as local clusters and international infrastructures (EGI and PRACE), is made homogeneous and integrated within a dedicated science gateway based on the gUSE framework. In this presentation we give a detailed overview of the progress achieved in developing the VERCE Science Gateway, following a use-case-driven implementation strategy. More specifically, we show how the computational technologies and data services have been integrated within a tool for Seismic Forward Modelling, whose objective is to offer simulations of seismic waves as a service to the seismological community. We will introduce the interactive components of the OGC map-based web interface and how it supports the user in setting up a simulation. We will go through the selection of input data, which are either fetched from federated seismological web services adopting community standards, or provided by the users themselves through their own document data store. The HPC scientific codes can be selected from a number of waveform simulators currently available to the seismological community as batch tools or with limited configuration capabilities in their interactive online versions. The results will be staged out via a secure GridFTP transfer to a VERCE data layer managed by iRODS. The provenance information of the simulation will be automatically cataloged by the data layer via NoSQL technologies. Finally, we will show an example of how the visualisation output of the gateway can be enhanced by connection with immersive projection technology at the Virtual Reality and Visualisation Centre of the Leibniz Supercomputing Centre (LRZ).
Overview of Human-Centric Space Situational Awareness (SSA) Science and Technology (S&T)
NASA Astrophysics Data System (ADS)
Ianni, J.; Aleva, D.; Ellis, S.
2012-09-01
A number of organizations within government, industry, and academia are researching ways to help humans understand and react to events in space. The problem is both helped and complicated by the fact that there are numerous data sources that need to be planned (i.e., tasked), collected, processed, analyzed, and disseminated. A large part of the research is in support of the Joint Space Operations Center (JSpOC), National Air and Space Intelligence Center (NASIC), and similar organizations. Much recent research has specifically targeted the JSpOC Mission System (JMS), which has provided a unifying software architecture. This paper first outlines areas of science and technology (S&T) related to human-centric space situational awareness (SSA) and space command and control (C2), including: 1. Object visualization - especially data fused from disparate sources, and satellite catalog visualizations that convey the physical relationships between space objects. 2. Data visualization - improved data trend analysis as in visual analytics and interactive visualization; e.g., satellite anomaly trends over time, space weather visualization, dynamic visualizations. 3. Workflow support - human-computer interfaces that encapsulate multiple computer services (i.e., algorithms, programs, applications) into a unified workflow. 4. Command and control - e.g., tools that support course of action (COA) development and selection, tasking for satellites and sensors, etc. 5. Collaboration - improving the ability of individuals or teams to work with others; e.g., video teleconferencing, shared virtual spaces, file sharing, virtual white-boards, chat, and knowledge search. 6. Hardware/facilities - e.g., optimal layouts for operations centers, ergonomic workstations, immersive displays, interaction technologies, and mobile computing. Second, we provide a survey of organizations working in these areas and suggest where more attention may be needed.
Although no detailed master plan exists for human-centric SSA and C2, we see little redundancy among the groups supporting SSA human factors at this point.
NASA Technical Reports Server (NTRS)
Maisel, James E.
1988-01-01
Addressed are some of the space electrical power system technologies that should be developed if the U.S. space program is to remain competitive in the 21st century. A brief historical overview of some U.S. manned and unmanned spacecraft power systems establishes that electrical systems are, and will continue to become, more sophisticated as power levels approach those on the ground. Adaptive/expert power systems that can function in an extraterrestrial environment will be required to take appropriate action during electrical faults so that their impact is minimal. Man-hours can be reduced significantly by relinquishing tedious, routine system component maintenance to the adaptive/expert system. By cataloging component signatures over time, such a system can flag a premature component failure and thus possibly avoid a major fault. High-frequency operation is important if electrical power system mass is to be cut significantly. High-power semiconductor or vacuum switching components will be required to meet future power demands. System mass tradeoffs have been investigated in terms of high-temperature operation, efficiency, voltage regulation, and system reliability. High-temperature semiconductors will be required: silicon carbide materials will operate at temperatures around 1000 K, and diamond materials at up to 1300 K. The driver for elevated-temperature operation is that radiated power grows with the fourth power of temperature, so the required radiator area, and hence radiator mass, falls as the inverse fourth power.
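The inverse-fourth-power scaling cited above follows directly from the Stefan-Boltzmann law. A minimal sketch of the sizing argument; the power level and emissivity are illustrative assumptions, not values from the original study:

```python
# Radiator area needed to reject waste heat at a given temperature,
# via the Stefan-Boltzmann law. Values below are illustrative only.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiator_area(power_w, temp_k, emissivity=0.85):
    """Area (m^2) required to radiate `power_w` watts at `temp_k` kelvin."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# Doubling the rejection temperature cuts the required area (and hence
# radiator mass, at fixed areal density) by a factor of 2^4 = 16.
a_300 = radiator_area(100e3, 300.0)  # 100 kW rejected at 300 K
a_600 = radiator_area(100e3, 600.0)  # same load at 600 K
```

This is why high-temperature semiconductors are the enabling technology: the radiator, not the converter, dominates the mass budget at low rejection temperatures.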
Status report on renewable energy in the States
NASA Astrophysics Data System (ADS)
Swezey, B.; Sinclair, K.
1992-12-01
As the concept of integrated resource planning has spread among states and utilities, a reexamination of the role of renewable energy sources in the utility resource mix is taking place. This report documents the findings of a study of state regulatory commissions undertaken to: (1) help assess the state of knowledge and awareness about renewable energy resources and technologies; (2) assess the impacts of state policies on renewable energy development; and (3) identify important information needs. The key findings from this effort are: Renewable energy development has occurred only slowly over the last decade, and a small number of states account for the bulk of development. The development that has occurred has been limited to non-utility entities. Directed state policies have been a key driver in renewable energy development. Those states not currently addressing renewables may need more data and information before they proceed with directed policies. Other important observations are: The cost of renewables is an overriding concern. Regulators distinguish between 'emerging' and 'established' renewable energy technologies. Specific data are lacking on state-level renewable energy development. Detailed renewable resource assessments have yet to be performed in many states. This report identifies renewable energy information needs of state regulators. However, a number of concerns are also identified that must be addressed before renewables will receive serious attention in many of those states with limited renewables experience. Finally, the report catalogs a wide variety of policies that have been utilized in the states to promote greater development of renewable energy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue, Peng; Gong, Jianya; Di, Liping
A geospatial catalogue service provides a network-based meta-information repository and interface for advertising and discovering shared geospatial data and services. Descriptive information (i.e., metadata) for geospatial data and services is structured and organized in catalogue services. The approaches currently available for searching and using that information are often inadequate. Semantic Web technologies show promise for better discovery methods by exploiting the underlying semantics. Such development needs special attention from the Cyberinfrastructure perspective, so that the traditional focus on discovery of and access to geospatial data can be expanded to support the increased demand for processing of geospatial information and discovery of knowledge. Semantic descriptions for geospatial data, services, and geoprocessing service chains are structured, organized, and registered by extending elements in the ebXML Registry Information Model (ebRIM) of a geospatial catalogue service, which follows the interface specifications of the Open Geospatial Consortium (OGC) Catalogue Services for the Web (CSW). The process models for geoprocessing service chains, as a type of geospatial knowledge, are captured, registered, and made discoverable. Semantics-enhanced discovery for geospatial data, services/service chains, and process models is described. Semantic search middleware that can support virtual data product materialization is developed for the geospatial catalogue service. The creation of such a semantics-enhanced geospatial catalogue service is important in meeting the demands for geospatial information discovery and analysis in Cyberinfrastructure.
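The core idea of semantics-enhanced discovery is that a query concept is expanded through an ontology's subclass hierarchy before matching catalogue records. A toy sketch under stated assumptions: the concept names, hierarchy, and registry entries below are invented for illustration, not the paper's ebRIM extension.

```python
# Hypothetical subclass axioms: child concept -> parent concept.
subclass_of = {
    "LandCoverMap": "GeospatialData",
    "ElevationModel": "GeospatialData",
    "SlopeService": "GeoprocessingService",
}

# Hypothetical catalogue records, each tagged with one ontology concept.
registry = [
    {"id": "rec1", "concept": "LandCoverMap"},
    {"id": "rec2", "concept": "SlopeService"},
    {"id": "rec3", "concept": "ElevationModel"},
]

def concept_closure(concept):
    """`concept` plus every concept transitively declared a subclass of it."""
    closure = {concept}
    changed = True
    while changed:
        changed = False
        for sub, sup in subclass_of.items():
            if sup in closure and sub not in closure:
                closure.add(sub)
                changed = True
    return closure

def discover(concept):
    """Return ids of records whose concept falls under the query concept."""
    wanted = concept_closure(concept)
    return [r["id"] for r in registry if r["concept"] in wanted]
```

A purely keyword-based search for "GeospatialData" would return nothing here; the semantic expansion finds both the land-cover and elevation records.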
Semantic Service Design for Collaborative Business Processes in Internetworked Enterprises
NASA Astrophysics Data System (ADS)
Bianchini, Devis; Cappiello, Cinzia; de Antonellis, Valeria; Pernici, Barbara
Modern collaborating enterprises can be seen as borderless organizations whose processes are dynamically transformed and integrated with those of their partners (Internetworked Enterprises, IE), thus enabling the design of collaborative business processes. The adoption of Semantic Web and service-oriented technologies for implementing collaboration in such distributed and heterogeneous environments promises significant benefits. IE can model their own processes independently by using the Software as a Service (SaaS) paradigm. Each enterprise maintains a catalog of available services, and these can be shared across IE and reused to build up complex collaborative processes. Moreover, each enterprise can adopt its own terminology and concepts to describe business processes and component services. This introduces the requirement to manage semantic heterogeneity in process descriptions that are distributed across different enterprise systems. To enable effective service-based collaboration, IE have to standardize their process descriptions and model them through component services using the same approach and principles. For enabling collaborative business processes across IE, services should be designed following a homogeneous approach, possibly maintaining a uniform level of granularity. In this paper we propose an ontology-based semantic modeling approach to enrich and reconcile the semantics of process descriptions, in order to facilitate process knowledge management and enable semantic service design (through discovery, reuse, and integration of process elements/constructs). The approach brings together Semantic Web technologies, process modeling techniques, ontology building, and semantic matching to provide a comprehensive semantic modeling framework.
NASA Technical Reports Server (NTRS)
1978-01-01
The building in the top photo is the new home of the National Permanent Savings Bank in Washington, D.C., designed by Hartman-Cox Architects. Its construction was based on a money-saving method of preparing building specifications which derived from NASA technology developed to obtain quality construction while holding down the cost of launch facilities, test centers and other structures. Written technical specifications spell out materials and components to be used on construction projects and identify the quality tests each item must pass. Specifications can have a major impact on construction costs. Poorly formulated specifications can lead to unacceptable construction which must be replaced, unnecessarily high materials costs, safety hazards, disputes and often additional costs due to delays and litigation. NASA's Langley Research Center developed a novel approach to providing accurate, uniform, cost-effective specifications which can be readily updated to incorporate new building technologies. Called SPECSINTACT, it is a computerized system accessible to all NASA centers involved in construction programs. The system contains a comprehensive catalog of master specifications applicable to many types of construction. It enables designers of any structure to call out relevant sections from computer storage and modify them to fit the needs of the project at hand. Architects and engineers can save time by concentrating their efforts on needed modifications rather than developing all specifications from scratch. Successful use of SPECSINTACT has led to a number of spinoff systems. One of the first was MASTERSPEC, developed from NASA's experience by Production Systems for Architects and Engineers, Inc., an organization established by the American Institute of Architects. MASTERSPEC, used in construction of the bank building pictured, follows the same basic format as SPECSINTACT and can be used in either automated or manual modes.
The striking appearance of the bank building shows that, while MASTERSPEC saves time and money, its use involves no sacrifice in architectural design freedom. The Naval Facilities Engineering Command employs an automated specifications system based on SPECSINTACT. The Public Buildings Service of the General Services Administration used SPECSINTACT as a starting point in a plan to make its guideline specifications available to architects and engineers on a nationwide computer network. Public Technology, Inc., a NASA Technology Application Team, is working with Production Systems for Architects and Engineers, Inc., to promote widespread use of the system by state and local governments for cost benefits to taxpayers.
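The SPECSINTACT workflow described above, calling out master sections from storage and tailoring them per project, can be sketched in a few lines. The section numbers and specification text below are invented placeholders, not actual master-spec content:

```python
# Hypothetical master specification catalog: section id -> template text.
master_specs = {
    "03300": "Cast-in-place concrete: strength {strength} psi at 28 days.",
    "07920": "Joint sealants: type {sealant_type}, color to match adjacent finish.",
}

def call_out(section_id, **edits):
    """Retrieve a master section and fill in project-specific requirements,
    mirroring the 'call out and modify' workflow rather than writing each
    specification from scratch."""
    return master_specs[section_id].format(**edits)

spec = call_out("03300", strength=4000)
```

The design point is the same one the article makes: designers spend effort only on the project-specific deltas, while the vetted boilerplate stays uniform across projects.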
2012-01-01
Background Semantic Web technology can considerably catalyze translational genetics and genomics research in medicine, where the interchange of information between basic research and clinical levels becomes crucial. This exchange involves mapping abstract phenotype descriptions from research resources, such as knowledge databases and catalogs, to unstructured datasets produced through experimental methods and clinical practice. This is especially true for the construction of mutation databases. This paper presents a way of harmonizing abstract phenotype descriptions with patient data from clinical practice, and of querying this dataset about relationships between phenotypes and genetic variants at different levels of abstraction. Methods Given the current availability of ontological and terminological resources that have already reached some consensus in biomedicine, a reuse-based ontology engineering approach was followed. The proposed approach uses the Web Ontology Language (OWL) to represent the phenotype ontology and the patient model, the Semantic Web Rule Language (SWRL) to bridge the gap between phenotype descriptions and clinical data, and the Semantic Query-Enhanced Web Rule Language (SQWRL) to query relevant phenotype-genotype bidirectional relationships. The work tests the use of Semantic Web technology in the biomedical research domain of cerebrotendinous xanthomatosis (CTX), using a real dataset and ontologies. Results A framework to query relevant phenotype-genotype bidirectional relationships is provided. Phenotype descriptions and patient data were harmonized by defining 28 Horn-like rules in terms of the OWL concepts. In total, 24 patterns of SQWRL queries were designed following the initial list of competency questions. As the approach is based on OWL, the semantics of the framework adopt the standard logical model of the open-world assumption.
Conclusions This work demonstrates how Semantic Web technologies can be used to support the flexible representation and computational inference mechanisms required to query patient datasets at different levels of abstraction. The open-world assumption is especially well suited to describing only partially known phenotype-genotype relationships, in a way that is easily extensible. In the future, this type of approach could offer researchers a valuable resource for inferring new data from patient data for statistical analysis in translational research. In conclusion, phenotype description formalization and mapping to clinical data are two key elements for interchanging knowledge between basic and clinical research. PMID:22849591
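The Horn-like rules that bridge phenotype descriptions and clinical data have the shape "conditions on patient data imply a phenotype assertion". A toy stand-in for one such rule, applied outside any OWL reasoner; the property name, threshold, and phenotype label are illustrative assumptions, not rules from the CTX ontology used in the study:

```python
# Hypothetical patient records from clinical practice.
patients = [
    {"id": "p1", "ldl_mg_dl": 210},
    {"id": "p2", "ldl_mg_dl": 120},
]

def apply_rule(patient):
    """Toy Horn-like rule:
    hasLDL(?p, ?x) AND greaterThan(?x, 190) -> hasPhenotype(?p, Hypercholesterolemia)
    Returns the triples this rule materializes for one patient record."""
    inferred = []
    if patient.get("ldl_mg_dl", 0) > 190:
        inferred.append((patient["id"], "hasPhenotype", "Hypercholesterolemia"))
    return inferred

# Materialize inferred phenotype assertions across the cohort; a SQWRL query
# would then select over these triples at the desired level of abstraction.
facts = [triple for p in patients for triple in apply_rule(p)]
```

Note the open-world flavor the paper relies on: the absence of a triple for p2 means "unknown", not "does not have the phenotype".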
Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.
2004-12-01
From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services, as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end-to-end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization, coupled with technologies from others in the community, are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, an NSF-funded, large multi-institutional Information Technology Research effort.
The goal of LEAD is to create an integrated and scalable framework for identifying, accessing, preparing, assimilating, predicting, managing, analyzing, mining, and visualizing a broad array of meteorological data and model output, independent of format and physical location. To that end, LEAD will create a series of interconnected, heterogeneous Grid environments to provide a complete framework for mesoscale research, including a set of integrated Grid and Web Services. This talk will focus on the transition from today's end-to-end systems into the types of systems that the LEAD project envisions and the multidisciplinary research problems they will enable.
NASA Astrophysics Data System (ADS)
Schweitzer, R. H.
2001-05-01
The Climate Diagnostics Center maintains a collection of gridded climate data primarily for use by local researchers. Because this data is available on fast digital storage and because it has been converted to netCDF using a standard metadata convention (called COARDS), we recognize that this data collection is also useful to the community at large. At CDC we try to use technology and metadata standards to reduce our costs associated with making these data available to the public. The World Wide Web has been an excellent technology platform for meeting that goal. Specifically, we have developed Web-based user interfaces that allow users to search, plot and download subsets from the data collection. We have also been exploring use of the Pacific Marine Environmental Laboratory's Live Access Server (LAS) as an engine for this task. This would result in further savings by allowing us to concentrate on customizing the LAS where needed, rather than developing and maintaining our own system. One such customization currently under development is the use of Java Servlets and JavaServer Pages in conjunction with a metadata database to produce a hierarchical user interface to LAS. In addition to these Web-based user interfaces, all of our data are available via the Distributed Oceanographic Data System (DODS). This allows other sites using LAS and individuals using DODS-enabled clients to use our data as if it were a local file. All of these technology systems are driven by metadata. When we began to create netCDF files, we collaborated with several other agencies to develop a netCDF convention (COARDS) for metadata. At CDC we have extended that convention to incorporate additional metadata elements to make the netCDF files as self-describing as possible. Part of the local metadata is a set of controlled names for the variable, level in the atmosphere and ocean, statistic and data set for each netCDF file.
To allow searching and easy reorganization of these metadata, we loaded the metadata from the netCDF files into a MySQL database. The combination of the MySQL database and the controlled names makes it possible to automate the construction of user interfaces and standard-format metadata descriptions, such as the Federal Geographic Data Committee (FGDC) and Directory Interchange Format (DIF) standards. These standard descriptions also include an association between our controlled names and standard keywords such as those developed by the Global Change Master Directory (GCMD). This talk will give an overview of each of these technology and metadata standards as it applies to work at the Climate Diagnostics Center. The talk will also discuss the pros and cons of each approach and areas for future development.
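The pattern described above, harvesting controlled-name metadata into a relational database and generating user interfaces from queries against it, can be sketched briefly. The table layout and records below are assumptions for illustration (and sqlite3 stands in for the MySQL database so the example is self-contained):

```python
import sqlite3

# In-memory stand-in for the metadata database; one row per netCDF file,
# keyed by the controlled names (dataset, variable, level, statistic).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE variables (dataset TEXT, variable TEXT, level TEXT, statistic TEXT)"
)
rows = [
    ("ncep.reanalysis", "air", "850hPa", "monthly mean"),
    ("ncep.reanalysis", "uwnd", "250hPa", "monthly mean"),
    ("noaa.oisst", "sst", "surface", "weekly mean"),
]
conn.executemany("INSERT INTO variables VALUES (?, ?, ?, ?)", rows)

# Automate UI construction: build a dataset menu, then the variable list
# for whichever dataset the user picks.
datasets = [r[0] for r in conn.execute(
    "SELECT DISTINCT dataset FROM variables ORDER BY dataset")]
variables = [r[0] for r in conn.execute(
    "SELECT variable FROM variables WHERE dataset = ? ORDER BY variable",
    ("ncep.reanalysis",))]
```

Because the names are controlled rather than free-form, the same queries can also drive generation of FGDC- or DIF-style metadata records without per-file hand editing.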
Automated Classification of ROSAT Sources Using Heterogeneous Multiwavelength Source Catalogs
NASA Technical Reports Server (NTRS)
McGlynn, Thomas; Suchkov, A. A.; Winter, E. L.; Hanisch, R. J.; White, R. L.; Ochsenbein, F.; Derriere, S.; Voges, W.; Corcoran, M. F.
2004-01-01
We describe an on-line system for automated classification of X-ray sources, ClassX, and present preliminary results of classification of the three major catalogs of ROSAT sources, RASS BSC, RASS FSC, and WGACAT, into six class categories: stars, white dwarfs, X-ray binaries, galaxies, AGNs, and clusters of galaxies. ClassX is based on machine learning. It represents a system of classifiers, each classifier consisting of a considerable number of oblique decision trees. These trees are built as the classifier is 'trained' to recognize various classes of objects using a training sample of sources of known object types. Each source is characterized by a preselected set of parameters, or attributes; the same set is then used as the classifier conducts classification of sources of unknown identity. The ClassX pipeline features an automatic search for X-ray source counterparts among heterogeneous data sets in on-line data archives using Virtual Observatory protocols; it retrieves from those archives all the attributes required by the selected classifier and inputs them to the classifier. The user input to ClassX is typically a file with target coordinates, optionally complemented with target IDs. The output contains the class name, attributes, and class probabilities for all classified targets. We discuss ways to characterize and assess the classifier quality and performance and present the respective validation procedures. Based on both internal and external validation, we conclude that the ClassX classifiers yield reasonable and reliable classifications for ROSAT sources and have the potential to broaden class representation significantly for rare object types.
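An ensemble of decision trees voting on a source's class, with the vote fraction serving as a class probability, can be illustrated in miniature. This is a much-simplified stand-in, not the trained ClassX trees: the hand-written "trees" below are single cuts on two invented attributes (an X-ray hardness ratio and an X-ray-to-optical flux ratio), and the thresholds are arbitrary assumptions.

```python
from collections import Counter

def tree_hardness(src):
    """Toy tree 1: axis-parallel cut on spectral hardness."""
    return "AGN" if src["hardness"] > 0.5 else "star"

def tree_flux_ratio(src):
    """Toy tree 2: axis-parallel cut on X-ray-to-optical flux ratio."""
    return "AGN" if src["fx_fopt"] > 1.0 else "star"

def tree_oblique(src):
    """Toy tree 3: an 'oblique' cut, i.e. a linear combination of attributes
    rather than a single-attribute threshold."""
    return "AGN" if 0.7 * src["hardness"] + 0.3 * src["fx_fopt"] > 0.6 else "star"

def classify(src, trees=(tree_hardness, tree_flux_ratio, tree_oblique)):
    """Majority vote over the ensemble; the vote fraction plays the role of
    the class probability reported per target."""
    votes = Counter(t(src) for t in trees)
    label, count = votes.most_common(1)[0]
    return label, count / len(trees)

label, prob = classify({"hardness": 0.8, "fx_fopt": 2.0})
```

The real pipeline differs mainly in scale: many trained trees per classifier, attributes fetched automatically from archives via Virtual Observatory protocols, and six classes rather than two.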
VizieR Online Data Catalog: NuSTAR serendipitous survey: the 40-month catalog (Lansbury+, 2017)
NASA Astrophysics Data System (ADS)
Lansbury, G. B.; Stern, D.; Aird, J.; Alexander, D. M.; Fuentes, C.; Harrison, F. A.; Treister, E.; Bauer, F. E.; Tomsick, J. A.; Balokovic, M.; Del Moro, A.; Gandhi, P.; Ajello, M.; Annuar, A.; Ballantyne, D. R.; Boggs, S. E.; Brandt, W. N.; Brightman, M.; Chen, C.-T. J.; Christensen, F. E.; Civano, F.; Comastri, A.; Craig, W. W.; Forster, K.; Grefenstette, B. W.; Hailey, C. J.; Hickox, R. C.; Jiang, B.; Jun, H. D.; Koss, M.; Marchesi, S.; Melo, A. D.; Mullaney, J. R.; Noirot, G.; Schulze, S.; Walton, D. J.; Zappacosta, L.; Zhang, W. W.
2017-09-01
Over the period from 2012 July to 2015 November, which is the focus of the current study, there are 510 individual NuSTAR exposures that have been incorporated into the serendipitous survey. These exposures were performed over 331 unique fields (i.e., 331 individual sky regions, each with contiguous coverage composed of one or more NuSTAR exposures), yielding a total sky area coverage of 13deg2. Table 1 lists the fields chronologically. The fields have a cumulative exposure time of 20.4Ms. We have undertaken a campaign of dedicated spectroscopic follow-up in the optical-IR bands, obtaining spectroscopic identifications for a large fraction (56%) of the total sample. Since NuSTAR performs science pointings across the whole sky, a successful ground-based follow-up campaign requires the use of observatories at a range of geographic latitudes, and preferably across a range of dates throughout the sidereal year. This has been achieved through observing programs with, primarily, the following telescopes over a multiyear period (2012 Oct 10 to 2016 Jul 10): the Hale Telescope at Palomar Observatory (5.1m; PIs F. A. Harrison and D. Stern); Keck I and II at the W. M. Keck Observatory (10m; PIs F. A. Harrison and D. Stern); the New Technology Telescope (NTT) at La Silla Observatory (3.6m; PI G. B. Lansbury); the Magellan I (Baade) and Magellan II (Clay) Telescopes at Las Campanas Observatory (6.5m; PIs E. Treister and F. E. Bauer); and the Gemini-South observatory (8.1m; PI E. Treister). (5 data files).
Accounting for GC-content bias reduces systematic errors and batch effects in ChIP-seq data.
Teng, Mingxiang; Irizarry, Rafael A
2017-11-01
The main application of ChIP-seq technology is the detection of genomic regions that bind to a protein of interest. A large part of functional genomics' public catalogs is based on ChIP-seq data. These catalogs rely on peak calling algorithms that infer protein-binding sites by detecting genomic regions associated with more mapped reads (coverage) than expected by chance, as a result of the experimental protocol's lack of perfect specificity. We find that GC-content bias accounts for substantial variability in the observed coverage for ChIP-seq experiments and that this variability leads to false-positive peak calls. More concerning is that the GC effect varies across experiments, with the effect strong enough to result in a substantial number of peaks called differently when different laboratories perform experiments on the same cell line. However, accounting for GC-content bias in ChIP-seq is challenging because the binding sites of interest tend to be more common in high GC-content regions, which confounds real biological signals with unwanted variability. To address this challenge, we introduce a statistical approach that accounts for GC effects on both nonspecific noise and signal induced by the binding site. The method can be used to account for this bias in binding quantification as well as to improve existing peak calling algorithms. We use this approach to show a reduction in false-positive peaks as well as improved consistency across laboratories. © 2017 Teng and Irizarry; Published by Cold Spring Harbor Laboratory Press.
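The general idea of GC-aware normalization, scoring each bin's coverage against the expected background for its GC stratum rather than a genome-wide expectation, can be sketched simply. This is a toy illustration in the spirit of the approach, not the authors' model (which fits GC effects on signal and noise jointly); the bin data are fabricated.

```python
from statistics import median

# Fabricated (gc_fraction, read_count) pairs, one per genomic bin.
# High-GC bins have systematically higher background coverage; the last
# bin is the only genuinely enriched one.
bins = [(0.3, 10), (0.3, 12), (0.3, 11),
        (0.6, 40), (0.6, 42), (0.6, 200)]

def stratum(gc):
    """Coarse GC-content stratum for a bin."""
    return round(gc, 1)

def gc_expected(bins):
    """Median background coverage per GC stratum (median is robust to the
    minority of truly enriched bins)."""
    by_gc = {}
    for gc, count in bins:
        by_gc.setdefault(stratum(gc), []).append(count)
    return {gc: median(counts) for gc, counts in by_gc.items()}

expected = gc_expected(bins)
# Enrichment relative to the bin's own GC stratum, not the genome-wide mean:
# high-GC background bins no longer masquerade as peaks.
enrichment = [count / expected[stratum(gc)] for gc, count in bins]
```

Against a genome-wide expectation, every high-GC bin here would look enriched; stratifying by GC leaves only the last bin standing out.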
DOE Office of Scientific and Technical Information (OSTI.GOV)
Titov, O.; Stanford, Laura M.; Johnston, Helen M.
2013-07-01
Continuing our program of spectroscopic observations of International Celestial Reference Frame (ICRF) sources, we present redshifts for 120 quasars and radio galaxies. Data were obtained with five telescopes: the 3.58 m European Southern Observatory New Technology Telescope, the two 8.2 m Gemini telescopes, the 2.5 m Nordic Optical Telescope (NOT), and the 6.0 m Big Azimuthal Telescope of the Special Astrophysical Observatory in Russia. The targets were selected from the International VLBI Service for Geodesy and Astrometry candidate International Celestial Reference Catalog, which forms part of an observational very long baseline interferometry (VLBI) program to strengthen the celestial reference frame. We obtained spectra of the potential optical counterparts of more than 150 compact flat-spectrum radio sources, and measured redshifts of 120 emission-line objects, together with 19 BL Lac objects. These identifications add significantly to the precise radio-optical frame tie to be undertaken by Gaia, due to be launched in 2013, and to the existing data available for analyzing source proper motions over the celestial sphere. We show that the distribution of redshifts for ICRF sources is consistent with the much larger sample drawn from Faint Images of the Radio Sky at Twenty cm (FIRST) and the Sloan Digital Sky Survey, implying that the ultra-compact VLBI sources are not distinguished from the overall radio-loud quasar population. In addition, we obtained NOT spectra for five radio sources from the FIRST and NRAO VLA Sky Survey catalogs, selected on the basis of their red colors, which yielded three quasars with z > 4.
La mujer en la astronomía: pasado y presente
NASA Astrophysics Data System (ADS)
Dubner, G.
There exists a long and honorable tradition of participation of women in astronomy, affording many significant contributions to the field. Historically, however, many of these contributions have remained ignored, or recorded under the names of husbands, brothers or bosses. The present report includes a historical perspective, summarizing some of the most significant contributions made over the last three centuries by female astronomers. Briefly: Catherina Hevelius (1646-1693), author of the largest and last star catalog made without the aid of a telescope; Nicole-Reine Lepaute (1723-1788), extraordinary mathematician who predicted the path of Halley's Comet in 1757; Caroline Herschel (1750-1848), assistant of her brother William, discovered 8 comets, reduced the positions to a common epoch and published the catalog of 2500 nebulae observed by her brother, and was elected honorary member of the Royal Astronomical Society (RAS); Maria Mitchell (1818-1889), professor of astronomy and director of the Vassar College Observatory, dedicated her life to women's education; Williamina Fleming (1857-1911) discovered 94 of the 107 Wolf-Rayet stars known at her time, and the bulk of the first HD catalog was based on her spectral-type classifications; Annie Cannon (1863-1941) examined and classified nearly 500,000 stars and rearranged Fleming's spectral system, defining the OBAFGKM series; Henrietta Swan Leavitt (1868-1921) worked cataloging variable stars and discovered the period-luminosity relation for Cepheids; Cecilia Payne-Gaposchkin combined observations with theory to obtain a temperature scale for Cannon's spectral types; Ruby Payne-Scott (1912-1981), the first female radio astronomer in the world, developed the theory of aperture synthesis, on which most of the larger radio interferometers are based.
Present trends are analyzed based on statistics of the International Astronomical Union (IAU): women represent 11.8% of the total IAU membership; in Argentina the percentage is 33.3%, the highest among countries with more than one member. Based on studies carried out by the American Astronomical Society (AAS) (Boyce 1993), it is concluded that there is virtually no difference in the productivity of men and women. Papers, however, are rated differently depending on the gender of the author (Billard 1993). In CONICET (Argentina), based on studies carried out by the Network of Gender, Science and Technology (for all sciences), it is concluded that even though women are the majority in the lower categories, the female percentage decays rapidly in the higher categories, suggesting that gender factors may be biasing promotions. Most of these data are taken from Mercury XXI, No. 1, 1992.
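Leavitt's period-luminosity relation, mentioned above, is what turned Cepheids into distance indicators: the pulsation period gives the absolute magnitude, which combined with the apparent magnitude yields the distance. A sketch using one published V-band calibration, M_V = -2.43 (log10 P - 1) - 4.05 with P in days; the calibration constants vary between studies and are an assumption here.

```python
from math import log10

def absolute_magnitude(period_days):
    """Leavitt-law absolute magnitude for a classical Cepheid (assumed
    calibration; coefficients differ between published fits)."""
    return -2.43 * (log10(period_days) - 1.0) - 4.05

def distance_parsecs(apparent_mag, period_days):
    """Distance from the distance modulus mu = m - M, d = 10^(mu/5 + 1) pc."""
    mu = apparent_mag - absolute_magnitude(period_days)
    return 10 ** (mu / 5.0 + 1.0)

# Illustrative case: a 10-day Cepheid (M_V = -4.05 under this calibration)
# observed at m = 11.0 has distance modulus 15.05.
d = distance_parsecs(11.0, 10.0)
```

Neglected here, as in most textbook statements of the law, are interstellar extinction and metallicity corrections, which matter for real distance work.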
Use of real-time tools to support field operations of NSF's Lower Atmosphere Observing Facilities
NASA Astrophysics Data System (ADS)
Daniels, M.; Stossmeister, G.; Johnson, E.; Martin, C.; Webster, C.; Dixon, M.; Maclean, G.
2012-12-01
NCAR's Earth Observing Laboratory (EOL) operates Lower Atmosphere Observing Facilities (LAOF) for the scientific community, under sponsorship of the National Science Foundation. In order to obtain the highest quality dataset during field campaigns, real-time decision-making critically depends on the availability of timely data and reliable communications between field operations staff and instrument operators. EOL incorporates the latest technologies to monitor the health of instrumentation, facilitate remote operation of instrumentation, and keep project participants abreast of changing conditions in the field. As the availability of bandwidth on mobile communication networks and the capabilities of their associated devices (smart phones, tablets, etc.) have improved, so has the ability of researchers to respond to rapidly changing conditions and coordinate ever more detailed measurements from multiple remote fixed, portable and airborne platforms. This presentation will describe several new tools that EOL is making available to project investigators and how these tools are being used in a mobile computing environment to support enhanced data collection during field campaigns. LAOF platforms such as radars, aircraft, sondes, balloons and surface stations all rely on displays of real-time data for their operations. Data from sondes are ingested into the Global Telecommunications System (GTS) for assimilation into regional forecasting models that help guide project operations. Since many of EOL's projects occur around the globe, and instrument complexity has increased at the same time, automated monitoring of instrumentation platforms and systems has become essential. Tools are being developed to allow remote instrument control of our suite of observing systems where feasible. The Computing, Data and Software (CDS) Facility of EOL has developed and supported a Field Catalog used in field campaigns for nearly two decades.
Today, the Field Catalog serves as a hub for the collection and browsing of field research products, related operational and forecast imagery, project documentation as well as tools for real-time decision-making, communication, mission planning and post analysis. Incorporation of new capabilities into the Field Catalog to support the mobile computing environment and devices has led to the development of new tools which will be described. EOL/CDS has also developed a customized Internet Relay Chat (IRC) chat system to enable communication between all project participants distributed across various land-based, shipboard and airborne remote sites. The CDS chat system has incorporated aspects of fault tolerance in order to handle intermittent communications links. NOAA and NASA have used this chat system for their field missions as well. These new tools were recently deployed in support of the Deep Convective Clouds and Chemistry (DC3) field campaign that took place May - June 2012 in the Central United States. This presentation will show examples of these real-time tools from recent projects. We will also describe some of the challenges, problems and surprises, as well as improvements that have been made to the tools. The capabilities of this system continue to advance, taking advantage of new technology and guided by our experience and feedback from users participating in field campaigns.
The Digital Library for Earth System Education: A Progress Report from the DLESE Program Center
NASA Astrophysics Data System (ADS)
Marlino, M. R.; Sumner, T. R.; Kelly, K. K.; Wright, M.
2002-12-01
DLESE is a community-owned and governed digital library offering easy access to high-quality electronic resources about the Earth system at all educational levels. Currently in its third year of development and operation, DLESE resources are designed to support systemic educational reform, and include web-based teaching resources, tools, and services for the inclusion of data in classroom activities, as well as a "virtual community center" that supports community goals and growth. "Community-owned" and "community-governed" capture what makes DLESE unique: its participatory approach to both library building and governance. DLESE is guided by policy development vested in the DLESE Steering Committee, and informed by Standing Committees centered on Collections, Services, Technology, and Users, and community working groups covering a wide variety of interest areas. This presentation highlights both the current and projected status of the library and opportunities for community engagement. It is specifically structured to engage community members in the design of the next version of the library release. The current Version 1.0 of the library consists of a web-accessible graphical user interface connected to a database of cataloged educational resources (approximately 3000); a metadata framework enabling resource characterization; a cataloging tool allowing community cataloging and indexing of materials; a search and discovery system allowing browsing based on topic, grade level, and resource type, and permitting keyword and controlled-vocabulary searches; and a portal website supporting library use, community action, and DLESE partnerships. 
Future stages of library development will focus on enhanced community collaborative support; development of controlled vocabularies; collections building and community review systems; resource discovery integrating the National Science Education Standards and geography standards; Earth system science vocabulary; georeferenced discovery; and ultimately, AAAS Benchmarks. DLESE is being designed from the outset to support resource discovery across a diverse, federated network of holdings and collections, including the Alexandria Digital Library Earth Prototype (ADL/ADEPT), NASA education collections, the DLESE reviewed collection, and other community-held resources that have been cataloged and indexed as part of the overall DLESE collections.
Uncertainty in georeferencing current and historic plant locations
McEachern, K.; Niessen, K.
2009-01-01
With shrinking habitats, weed invasions, and climate change, repeated surveys are becoming increasingly important for rare plant conservation and ecological restoration. We often need to relocate historical sites or provide locations for newly restored sites. Georeferencing is the technique of giving geographic coordinates to the location of a site. Georeferencing has been done historically using verbal descriptions or field maps that accompany voucher collections. New digital technology gives us more exact techniques for mapping and storing location information. Error still exists, however, and even georeferenced locations can be uncertain, especially if error information is not included with the observation. We review the concept of uncertainty in georeferencing and compare several institutional database systems for cataloging error and uncertainty with georeferenced locations. These concepts are widely discussed among geographers, but ecologists and restorationists need to become more aware of issues related to uncertainty to improve our use of spatial information in field studies. © 2009 by the Board of Regents of the University of Wisconsin System.
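The review above does not prescribe one way to combine error sources, but a convention ecologists often encounter is the point-radius method, which folds each maximum error distance into a single uncertainty radius around the coordinate. A minimal sketch with hypothetical error magnitudes (summing the bounds is the conservative choice: if each bound is honest, the true point is guaranteed to lie within the circle):

```python
def uncertainty_radius(error_sources_m):
    """Combine per-source maximum errors (meters) into one radius,
    in the spirit of the point-radius georeferencing convention."""
    return sum(error_sources_m)

# Hypothetical record: GPS precision, map-datum shift, and the extent
# of the named locality each contribute to the total uncertainty.
radius = uncertainty_radius([10.0, 79.0, 300.0])  # 389.0 m
```

Storing the resulting radius alongside the coordinates is what lets a later survey judge whether a relocated site plausibly matches the historical one.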
Origins of the Human Genome Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook-Deegan, Robert
1993-07-01
The human genome project was borne of technology, grew into a science bureaucracy in the US and throughout the world, and is now being transformed into a hybrid academic and commercial enterprise. The next phase of the project promises to veer more sharply toward commercial application, harnessing both the technical prowess of molecular biology and the rapidly growing body of knowledge about DNA structure to the pursuit of practical benefits. Faith that the systematic analysis of DNA structure will prove to be a powerful research tool underlies the rationale behind the genome project. The notion that most genetic information is embedded in the sequence of DNA base pairs comprising chromosomes is a central tenet. A rough analogy is to liken an organism's genetic code to computer code. The goal of the genome project, in this parlance, is to identify and catalog 75,000 or more files (genes) in the software that directs construction of a self-modifying and self-replicating system -- a living organism.
2015-10-08
A view from the Kimberly formation on Mars taken by NASA Curiosity rover. The strata in the foreground dip towards the base of Mount Sharp, indicating the ancient depression that existed before the larger bulk of the mountain formed. The colors are adjusted so that rocks look approximately as they would if they were on Earth, to help geologists interpret the rocks. This "white balancing" to adjust for the lighting on Mars overly compensates for the absence of blue on Mars, making the sky appear light blue and sometimes giving dark, black rocks a blue cast. This image was taken by the Mast Camera (Mastcam) on Curiosity on the 580th Martian day, or sol, of the mission. Malin Space Science Systems, San Diego, built and operates Curiosity's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, built the rover and manages the project for NASA's Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA19839
BCube: Building a Geoscience Brokering Framework
NASA Astrophysics Data System (ADS)
Jodha Khalsa, Siri; Nativi, Stefano; Duerr, Ruth; Pearlman, Jay
2014-05-01
BCube is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. As a prototype "building block" for NSF's EarthCube cyberinfrastructure initiative, BCube is demonstrating how a broker can serve as an intermediary between information systems that implement well-defined interfaces, thereby providing a bridge between communities that employ different specifications. Building on the GEOSS Discovery and Access Broker (DAB), BCube will develop new modules and services including:
• Expanded semantic brokering capabilities
• Business model support for workflows
• Automated metadata generation
• Automated linking to services discovered via web crawling
• Credential passing for seamless access to data
• Ranking of search results from brokered catalogs
Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. We are working, initially, with four geoscience disciplines: hydrology, oceans, polar and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.
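As a rough illustration of the brokering idea (not BCube's actual architecture or API), a broker can fan one query out to adapters that wrap heterogeneous catalog interfaces and merge the hits under one schema. The adapter and catalog names below are invented:

```python
def broker_search(term, catalogs):
    """Fan one free-text query out to heterogeneous catalogs, each
    wrapped by an adapter exposing a common search(term) -> [dict]
    interface, and merge the hits under one schema."""
    hits = []
    for name, search in catalogs.items():
        for record in search(term):
            record["source"] = name  # remember which catalog answered
            hits.append(record)
    return hits

# Two mock catalogs with different back ends, hidden behind adapters.
def csw_adapter(term):
    return [{"title": term + " grid", "id": "csw:1"}]

def opensearch_adapter(term):
    return [{"title": term + " series", "id": "os:9"}]

hits = broker_search("sea ice", {"csw": csw_adapter, "opensearch": opensearch_adapter})
```

The point of the pattern is that neither community has to adopt the other's specification; only the broker's adapters know both sides.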
Shea, Katheryn E; Wagner, Elizabeth L; Marchesani, Leah; Meagher, Kevin; Giffen, Carol
2017-02-01
Reducing costs by improving storage efficiency has been a focus of the National Heart, Lung, and Blood Institute (NHLBI) Biologic Specimen Repository (Biorepository) and Biologic Specimen and Data Repositories Information Coordinating Center (BioLINCC) programs for several years. Study specimen profiles were compiled using the BioLINCC collection catalog. Cost assessments and return-on-investment calculations for consolidating or reducing a collection were developed and implemented. Over the course of 8 months, the NHLBI Biorepository evaluated 35 collections that consisted of 1.8 million biospecimens. A total of 23 collections were selected for consolidation, with a total of 1.2 million specimens located in 21,355 storage boxes. The consolidation resulted in a savings of 4055 boxes of various sizes and 10.2 mechanical freezers (∼275 cubic feet) worth of space. As storage costs in a biorepository increase over time, the development and use of information technology tools to assess the potential advantage and feasibility of vial consolidation can reduce maintenance expenses.
Lighter-Than-Air (LTA) "AirStation": Unmanned Aircraft System (UAS) Carrier Concept
NASA Technical Reports Server (NTRS)
Hochstetler, Ronald D.; Bosma, John; Chachad, Girish H.; Blanken, Matthew L.
2016-01-01
The advantages of utilizing an airship as an airborne carrier for support and deployment of Unmanned Aircraft Systems (UAS) are examined. Whether as a stand-alone platform or in concert with conventional aircraft, the airship UAS carrier provides a number of compelling benefits for both military and civilian missions. As a mobile base it can remain operational despite political fallout that may render ground- or ocean-based UAS sites unavailable. It offers the psychological impact of a power projection tool that has few geographical limits, and holds promise as a new method for cost-saving intelligence gathering. It is also adaptable to civilian variants supporting emergency response, security and surveillance, delivery of medical and food supplies, and commercial package delivery to metropolitan and remote communities. This paper presents the background on airship-aircraft operations and explores the general airship carrier concept. Additionally, a catalog of contemporary technologies available to support the airship carrier concept is presented, and essential elements of an AirStation development program are proposed.
Open Source Clinical NLP – More than Any Single System
Masanz, James; Pakhomov, Serguei V.; Xu, Hua; Wu, Stephen T.; Chute, Christopher G.; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately, any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary: OHNLP promotes open source clinical NLP activities in the research community, and Apache cTAKES bridges research to health information technology (HIT) practice. PMID:25954581
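To make the annotator idea concrete: the dictionary-lookup annotators that UIMA pipelines chain together can be sketched in a few lines. This is a toy stand-in, not cTAKES code, and the lexicon entries below are hypothetical:

```python
import re

def annotate(text, lexicon):
    """Return (start, end, concept_id) spans for dictionary terms found
    in `text` -- the simplest form of a lookup annotator."""
    spans = []
    for term, concept_id in lexicon.items():
        for m in re.finditer(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            spans.append((m.start(), m.end(), concept_id))
    return sorted(spans)

# Hypothetical mini-lexicon mapping surface forms to concept IDs.
lexicon = {"hypertension": "C0020538", "aspirin": "C0004057"}
spans = annotate("History of hypertension; started aspirin.", lexicon)
```

Real clinical pipelines add sentence detection, negation, and context handling on top of this lookup step, which is exactly why a common interface between components matters.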
2016-01-20
Engineers for NASA's MarCO (Mars Cube One) technology demonstration inspect one of the two MarCO CubeSats. Cody Colley, MarCO integration and test deputy, left, and Andy Klesh, MarCO chief engineer, are on the team at NASA's Jet Propulsion Laboratory, Pasadena, California, preparing twin MarCO CubeSats. The briefcase-size MarCO twins were designed to ride along with NASA's next Mars lander, InSight. Its planned March 2016 launch was suspended. InSight -- an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport -- will study the interior of Mars to improve understanding of the processes that formed and shaped rocky planets, including Earth. Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission. The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. http://photojournal.jpl.nasa.gov/catalog/PIA20342
2018-05-15
The first image captured by one of NASA's Mars Cube One (MarCO) CubeSats. The image, which shows both the CubeSat's unfolded high-gain antenna at right and the Earth and its moon in the center, was acquired by MarCO-B on May 9. MarCO is a pair of small spacecraft accompanying NASA's InSight (Interior Exploration using Seismic Investigations, Geodesy and Heat Transport) lander. Together, MarCO-A and MarCO-B are the first CubeSats ever sent to deep space. InSight is the first mission ever to explore Mars' deep interior. If the MarCO CubeSats make the entire journey to Mars, they will attempt to relay data about InSight back to Earth as the lander enters the Martian atmosphere and lands. The MarCO CubeSats will not collect any science data; they are intended purely as a technology demonstration. They could serve as a pathfinder for future CubeSat missions. An annotated version is available at https://photojournal.jpl.nasa.gov/catalog/PIA22323
Survey of Technologies Relevant to Defense From Near-Earth Objects
NASA Technical Reports Server (NTRS)
Adams, R. B.; Alexander, R.; Bonometti, J.; Chapman, J.; Fincher, S.; Hopkins, R.; Kalkstein, M.; Polsgrove, T.; Statham, G.; White, S.
2004-01-01
Several recent near-miss encounters with asteroids and comets have focused attention on the threat of a catastrophic impact with the Earth. This Technical Publication reviews the historical impact record and current understanding of the number and location of near-Earth objects (NEOs) to address their impact probability. Various ongoing projects intended to survey and catalog the NEO population are also reviewed. Details are given of a Marshall Space Flight Center-led study intended to develop and assess various candidate systems for protection of the Earth against NEOs. Details are also given of analytical tools, trajectory tools, and a tool created to model both the undeflected inbound path of an NEO and its modified, post-deflection path. A representative selection of these possible options was modeled and evaluated. It is hoped that this study will raise the level of attention given to this very real threat and also demonstrate that successful defense is both possible and practicable, provided appropriate steps are taken.
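As a back-of-the-envelope illustration of why early detection matters (this is not a result from the MSFC study's tools), a commonly quoted rule of thumb is that an along-track velocity change dv applied many orbits before the encounter shifts the arrival position by roughly 3*dv*t:

```python
def deflection_m(delta_v_mps, lead_time_years):
    """Order-of-magnitude along-track deflection: an impulse delta_v
    applied t seconds before encounter drifts the NEO ~3*dv*t downrange
    (rule of thumb only; real tools integrate the full trajectory)."""
    seconds = lead_time_years * 365.25 * 86400
    return 3.0 * delta_v_mps * seconds

# A 1 mm/s nudge applied 10 years before the predicted impact:
miss = deflection_m(1e-3, 10)  # roughly 9.5e5 m, i.e. ~950 km
```

Even a millimeter-per-second impulse, given a decade of lead time, moves the encounter point by hundreds of kilometers, which is why survey and cataloging programs are the first line of defense.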
2016-01-20
Engineers for NASA's MarCO (Mars Cube One) technology demonstration inspect one of the two MarCO CubeSats. Joel Steinkraus, MarCO lead mechanical engineer, left, and Andy Klesh, MarCO chief engineer, are on the team at NASA's Jet Propulsion Laboratory, Pasadena, California, preparing twin MarCO CubeSats. The briefcase-size MarCO twins were designed to ride along with NASA's next Mars lander, InSight. Its planned March 2016 launch was suspended. InSight -- an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport -- will study the interior of Mars to improve understanding of the processes that formed and shaped rocky planets, including Earth. Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission. The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. http://photojournal.jpl.nasa.gov/catalog/PIA20343
NASA Technical Reports Server (NTRS)
1976-01-01
Developments required to support the space power, SETI, solar system exploration and global services programs are identified. The space power system needs instrumentation and calibration sensors (rather than scientific ones). SETI needs highly sophisticated receivers for narrowband detection of microwave signals, and sensors for automated stellar cataloging to provide a mapping database. Various phases of solar system exploration require large-area solid-state imaging arrays from UV to IR; a long-focal-plane telescope; high-energy particle detectors; advanced spectrometers; a gravitometer; an atmospheric dust analyzer; sensors for penetrometers; and in-situ sensors for surface chemical analysis, life detection, spectroscopic and microscopic analyses of surface soils, and meteorological measurements. Active and passive multiapplication sensors, advanced multispectral scanners with improved resolution in the UV and IR ranges, and laser techniques for advanced probing and oceanographic characterization will enhance global services.
StarTrax --- The Next Generation User Interface
NASA Astrophysics Data System (ADS)
Richmond, Alan; White, Nick
StarTrax is a software package to be distributed to end users for installation on their local computing infrastructure. It will provide access to many services of the HEASARC, i.e., bulletins, catalogs, and proposal and analysis tools, initially for the ROSAT MIPS (Mission Information and Planning System) and later for the Next Generation Browse. A user activating the GUI will reach all HEASARC capabilities through a uniform view of the system, independent of the local computing environment and of the networking method used to access StarTrax. Use it if you prefer the point-and-click metaphor of modern GUI technology to a classical command-line interface (CLI). Notable strengths include: ease of use; excellent portability; very robust server support; a feedback button on every dialog; and a painstakingly crafted User Guide. It is designed to support a large number of input devices, including terminals, workstations and personal computers. XVT's Portability Toolkit is used to build the GUI in C/C++ to run on OSF/Motif (UNIX or VMS), OPEN LOOK (UNIX), Macintosh, MS-Windows (DOS), or character systems.
Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center
NASA Astrophysics Data System (ADS)
Mullinix, R.; Maddox, M. M.; Berrios, D.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Zheng, Y.
2012-12-01
Space weather affects virtually all of NASA's endeavors, from robotic missions to human exploration. Knowledge and prediction of space weather conditions are therefore essential to NASA operations. The diverse nature of currently available space environment measurements and modeling products compels the need for a single access point to such information. The Integrated Space Weather Analysis (iSWA) System provides this single point of access along with the capability to collect and catalog a vast range of sources, including both observational and model data. The NASA Goddard Space Weather Research Center heavily utilizes the iSWA System daily for research, space weather model validation, and forecasting for NASA missions. iSWA provides the capability to view and analyze near real-time space weather data from anywhere in the world. This presentation will describe the technology behind the iSWA system and explain how to use the system for space weather research, forecasting, training, education, and sharing.
Vilnius Multicolor CCD Photometry of the Open Cluster NGC 752
NASA Astrophysics Data System (ADS)
Bartašiūtė, S.; Janusz, R.; Boyle, R. P.; Philip, A. G. Davis
We have performed multicolor CCD observations of the central area of NGC 752 to search for faint, low-mass members of this open cluster. Four 12'x12' fields were taken on the 1.8 m Vatican Advanced Technology Telescope (Mt. Graham, Arizona) using a 4K CCD camera and eight intermediate-band filters of the Strömvil system. In this paper we present a catalog of photometry for 405 stars down to the limiting magnitude V=18.5, which contains V magnitudes and color indices of the Vilnius system, together with photometric determinations of spectral types, absolute magnitudes M_V, interstellar reddening values E(Y-V) and metallicity parameters [Fe/H]. The good-quality multicolor data made it possible to identify the locus of the lower main sequence to four magnitudes beyond the previous (photographic) limit. A relatively small number of photometric members identified at faint magnitudes seems to be indicative of actual dissolution of the cluster from the low-mass end.
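The reddening values E(Y-V) in such a catalog follow the standard definition of a color excess: the observed color index minus the intrinsic color expected for the star's spectral type. A minimal sketch with hypothetical numbers:

```python
def color_excess(observed, intrinsic):
    """Interstellar reddening as a color excess:
    E(Y-V) = (Y-V)_observed - (Y-V)_intrinsic."""
    return observed - intrinsic

# Hypothetical star: observed Y-V of 0.62 against an intrinsic 0.55
# implied by its photometrically determined spectral type.
e_yv = color_excess(0.62, 0.55)  # about 0.07 mag of reddening
```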
VizieR Online Data Catalog: IC 361 Vilnius photometry (Zdanavicius+, 2010)
NASA Astrophysics Data System (ADS)
Zdanavicius, J.; Bartasiute, S.; Boyle, R. P.; Vrba, F. J.; Zdanavicius, K.
2015-03-01
CCD observations in seven filters U,P,X,Y,Z,V,S of the Vilnius system plus the filter I of the Cousins system were carried out in December of 1999 with a 2K CCD camera on the 1m telescope of the USNO Flagstaff Station (Arizona), which gives a field 20' in diameter. Repeated observations in the Vilnius filters were done with the same telescope and a new 2Kx2K CCD camera in March of 2009. During the latter run we obtained well-calibrated CCD data only for the filters Y, Z, V, S, since observations through the remaining three filters on the succeeding night were curtailed by cirrus clouds. Additional frames in the Vilnius filters U,Y,V were taken for the central part of the field (12'x12') in December of 2008 with a 4K CCD camera on the 1.8m Vatican Advanced Technology Telescope (VATT) on Mt. Graham (Arizona). (1 data file).
Origins of the Human Genome Project
DOE R&D Accomplishments Database
Cook-Deegan, Robert (Affiliation: Institute of Medicine, National Academy of Sciences)
1993-07-01
The human genome project was borne of technology, grew into a science bureaucracy in the United States and throughout the world, and is now being transformed into a hybrid academic and commercial enterprise. The next phase of the project promises to veer more sharply toward commercial application, harnessing both the technical prowess of molecular biology and the rapidly growing body of knowledge about DNA structure to the pursuit of practical benefits. Faith that the systematic analysis of DNA structure will prove to be a powerful research tool underlies the rationale behind the genome project. The notion that most genetic information is embedded in the sequence of DNA base pairs comprising chromosomes is a central tenet. A rough analogy is to liken an organism's genetic code to computer code. The goal of the genome project, in this parlance, is to identify and catalog 75,000 or more files (genes) in the software that directs construction of a self-modifying and self-replicating system -- a living organism.
Education and Science Connect at Sea
NASA Astrophysics Data System (ADS)
Leckie, R. Mark; St. John, Kristen; Peart, Leslie; Klaus, Ann; Slough, Scott; Niemitz, Matt
2006-06-01
In the past several decades, the scientific community's collective understanding of Earth's history and the processes that shape this dynamic planet has grown exponentially. Yet communicating the current understanding of Earth systems to the community outside of science (educators and students, policy makers, and the general public) has lagged. In 1995, the U.S. National Academy of Sciences (NAS) led the effort to establish National Science Education Standards (http://www.nap.edu/readingroom/books/nses/), with the goal of helping all students achieve scientific literacy. Earth and space sciences are one of the eight categories of content standards. Clearly the establishment of science education standards alone will not foster a scientifically literate society, as indicated in the NAS report ``Rising Above the Gathering Storm'' (http://www.nap.edu/catalog/11463.html). This report, released last fall, warns that without strong steps to improve federal support for science and technology education, the quality of life in the United States is threatened as the country loses its competitive edge.
NASA Astrophysics Data System (ADS)
Weber, J.; Domenico, B.
2004-12-01
This paper is an example of what we call data interactive publications. With a properly configured workstation, readers can click on "hotspots" in the document that launch an interactive analysis tool called the Unidata Integrated Data Viewer (IDV). The IDV will enable readers to access, analyze and display datasets on remote servers, as well as documents describing them. Beyond the parameters and datasets initially configured into the paper, the analysis tool will have access to all the other dataset parameters as well as to a host of other datasets on remote servers. These data interactive publications are built on top of several data delivery, access, discovery, and visualization tools developed by Unidata and its partner organizations. For purposes of illustrating this integrative technology, we will use data from the event of Hurricane Charley over Florida from August 13-15, 2004. This event illustrates how components of this process fit together. The Local Data Manager (LDM), Open-source Project for a Network Data Access Protocol (OPeNDAP) and Abstract Data Distribution Environment (ADDE) services, Thematic Realtime Environmental Distributed Data Service (THREDDS) cataloging services, and the IDV are highlighted in this example of a publication with embedded pointers for accessing and interacting with remote datasets. An important objective of this paper is to illustrate how these integrated technologies foster the creation of documents that allow the reader to learn scientific concepts by direct interaction with illustrative datasets, and help build a framework for integrated Earth System science.
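THREDDS catalogs, one of the services highlighted above, are XML documents that map dataset entries to access services. A minimal sketch of reading one follows; the catalog fragment, dataset names, and host are invented for illustration, and real catalogs served by a THREDDS Data Server are much richer:

```python
import xml.etree.ElementTree as ET

NS = {"t": "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"}

# A tiny, hand-written catalog fragment in the THREDDS InvCatalog schema.
CATALOG = """<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <service name="odap" serviceType="OPeNDAP" base="/thredds/dodsC/"/>
  <dataset name="Charley radar mosaic" urlPath="charley/radar.nc"/>
  <dataset name="Charley GFS run" urlPath="charley/gfs.nc"/>
</catalog>"""

def access_urls(catalog_xml, host="https://example.edu"):
    """Build one access URL per dataset: host + service base + urlPath."""
    root = ET.fromstring(catalog_xml)
    base = root.find("t:service", NS).get("base")
    return [host + base + ds.get("urlPath")
            for ds in root.findall("t:dataset", NS)]

urls = access_urls(CATALOG)
```

A tool like the IDV resolves entries in essentially this way before handing the resulting OPeNDAP URL to its data-access layer.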
Filming seismograms and related materials at the California Institute of Technology
NASA Astrophysics Data System (ADS)
Goodstein, Judith R.; Roberts, Paul
As part of the worldwide effort to create an international earthquake data bank, the seismology archive of the California Institute of Technology (Caltech) has been organized, labeled, described, and microfilmed. It includes a wide variety of original records, documents, and printed materials relating to local and distant earthquakes. The single largest and most complex component of the task has been the preparation and microfilming of Caltech's vast collection of original seismograms. The original proposal envisioned a modest project in which a selected number of seismographic records at Caltech could be made more generally available to the scientific community. These single-copy records are stored at Kresge Laboratory and comprise thousands of individual photographic sheets, each 30×92 cm. In the end, we microfilmed both the Pasadena station records and those written at the six original stations in the Caltech network. This task got underway in June 1981 and was completed in January 1985. In the course of the project, the staff sorted, arranged, inventoried, copied, and refiled more than 276,000 records written between January 10, 1923 and December 31, 1962. The microfilm edition of the earthquake records at the Seismological Laboratory at Pasadena and at auxiliary stations at Mount Wilson, Riverside, Santa Barbara, La Jolla, Tinemaha, and Haiwee (the latter two in the Owens Valley) consists of 461 reels of film. The film archive is cataloged and available to researchers in Caltech's Millikan Library in Pasadena, at the U.S. Geological Survey in Menlo Park, Calif, and at the World Data Center (National Oceanic and Atmospheric Administration) in Boulder, Colo.
VizieR Online Data Catalog: W1J00 and W2J00 Transit Circle Catalogs (Rafferty+, 2016)
NASA Astrophysics Data System (ADS)
Rafferty, T. J.; Holdenried, E. R.; Urban, S. E.
2016-06-01
The W1J00, named because it was the first (of two) Washington transit circle catalogs to be referred to the Equinox of J2000.0, is the result of observations made with the Six-inch Transit Circle in Washington, D.C., between September 1977 and July 1982. The observing program was structured to be absolute, in the sense that the positions did not explicitly rely on any previous observations. The absolute positions were defined with respect to an internally consistent frame that was unique to the particular instrument. Following the reductions, comparisons with stars from the Hipparcos Catalogue (European Space Agency 1997) revealed unaccounted-for systematic differences on the level of 100-200mas. It was decided, therefore, to include data on both the absolute positions, reduced in a way common to many past Washington transit circle catalogs, and the positions differentially adjusted to the system of the Hipparcos Catalogue. The W1J00 contains mean positions of 7267 stars and 4383 observations of solar system objects. The majority of the stars fall into two categories: those from the Fifth Fundamental Catalog (FK5; Fricke et al. 1988), and those from the Catalog of 3539 Zodiacal Stars for the Equinox 1950.0 (Robertson 1940). The solar system objects include the Sun, Mercury, Venus, Mars, Jupiter, Saturn, Uranus, Neptune, eight minor planets (Eunomia, Flora, Hebe, Iris, Juno, Metis, Pallas, and Vesta), and the dwarf planet Ceres.
Characteristics of the W1J00 catalog:
  Category                          Range           Average
  ------------------------------------------------------------
  Magnitudes                        -1.6 to 10.4    7.18
  RA standard errors of the mean    15 to 460 mas   98 mas
  Dec standard errors of the mean   10 to 400 mas   107 mas
  RA observations per star          3 to 187        10
  Dec observations per star         2 to 179        10
  Declination coverage              -39 to +90 deg
  ------------------------------------------------------------
Details of the W1J00 can be found in Rafferty, Holdenried, and Urban (2016, Publ. 
USNO, 2nd series, vol. XXVII (part 1)). The W2J00 is the result of observations made with the Six-inch Transit Circle in Washington, D.C., and the Seven-inch Transit Circle at the Black Birch station near Blenheim, New Zealand. It is named as such because it was the second (of two) transit circle catalogs to be referred to the Equinox of J2000.0 and reduced at the Washington, D.C., headquarters of the U.S. Naval Observatory. It is sometimes referred to as the "Pole-to-Pole" program because the telescopes were situated at latitudes from which a fundamental determination of azimuth could be made using circumpolar stars of both the northern and southern sky. The observations were made between April 1985 and February 1996. The W2J00 project is the latest and largest of a long series of transit circle catalogs produced by the U.S. Naval Observatory. It is also, because of advancing technologies, certainly the last. The observing program was structured to be absolute, in the sense that the reported positions were not to explicitly rely on previous observations. However, with the availability of Hipparcos observational data, it was decided to differentially adjust the observations to the ICRF using the Hipparcos star positions (ESA, 1997, Cat. I/239). A catalog on the ICRF was judged to be more useful than one tied to the dynamical reference frame, as was the tradition. The W2J00 contains mean positions of 44,395 globally distributed stars, 5048 observations of the planets, and 6518 observations of the brighter minor planets. The majority of stars are FK stars (Fricke, et al., 1988, Cat. I/149 and 1991, Cat. I/175) and International Reference Stars (IRS) (Corbin, 1991, Cat. I/172). The solar system objects include Mars, Jupiter, Saturn, Uranus, Neptune, twelve minor planets (Amphitrite, Eunomia, Flora, Hebe, Hygiea, Iris, Juno, Melpomene, Metis, Nemausa, Pallas, and Vesta), and the dwarf planet Ceres. 
Daytime observations of the Sun, Mercury, Venus, Mars, and bright stars were made but not included in the final catalog due to the problems inherent in reducing observations made in daylight. The W2J00 observing program used both the Six-inch Transit Circle and the Seven-inch Transit Circle. Final positions are a combination of observations from both telescopes (for those stars in common). The authors decided to present not only the combined positions but also each telescope's individual positions, should future researchers wish to investigate the data based on which instrument was used.
Characteristics of the W2J00 catalog:
  Category                          Range           Average
  ------------------------------------------------------------
  Magnitudes                        -1.6 to 9.91    6.84
  RA standard errors of the mean    3 to 441 mas    68 mas
  Dec standard errors of the mean   1 to 448 mas    76 mas
  RA observations per star          3 to 411        14
  Dec observations per star         2 to 418        14
  Declination coverage              -90 to +90 deg
  ------------------------------------------------------------
Details of the W2J00 can be found in Holdenried and Rafferty (2016, PUSNO, 2nd series, vol. XXVII (part 2)). (4 data files).
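The "standard errors of the mean" quoted in both tables follow the usual definition: the scatter of the individual observations divided by the square root of their number. A minimal sketch with hypothetical residuals:

```python
import statistics

def sem(values):
    """Standard error of the mean: sample standard deviation of the
    individual observations divided by sqrt(N)."""
    return statistics.stdev(values) / len(values) ** 0.5

# Hypothetical nightly RA residuals (milliarcseconds) for one star
# observed four times; more observations shrink the standard error.
se_mas = sem([120.0, -80.0, 40.0, -40.0])  # ~44 mas
```

This is why the tables report both the number of observations per star and the resulting standard errors: stars observed more often carry proportionally smaller mean errors.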
Code of Federal Regulations, 2010 CFR
2010-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...
Code of Federal Regulations, 2013 CFR
2013-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...
Code of Federal Regulations, 2012 CFR
2012-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...
Code of Federal Regulations, 2011 CFR
2011-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...
Code of Federal Regulations, 2014 CFR
2014-07-01
... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... followed by all cataloging activities participating in the Federal Catalog System. (1) Federal Catalog... of a uniform catalog system. (3) Federal Supply Classification (Cataloging Publication H2 Series...
NASA Astrophysics Data System (ADS)
Moore, R. T.; Hansen, M. C.
2011-12-01
Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and to strengthen the ability of public institutions and civil society to better understand, manage, and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly collected Landsat imagery is downloaded from the USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports on-demand generation of spatial and temporal mosaics, "best-pixel" composites (for example, to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming interface, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011.
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
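The "best-pixel" compositing described above can be sketched outside Earth Engine with a per-pixel median over cloud-masked scenes. The following is a minimal NumPy illustration of the concept only, not the Earth Engine API; the array shapes and values are invented:

```python
import numpy as np

# Sketch of a "best-pixel" composite: given a stack of co-registered scenes
# (time, height, width) and a cloud mask, take the per-pixel median over
# cloud-free observations to suppress clouds and gaps.
def best_pixel_composite(stack, cloud_mask):
    """stack: float array (t, h, w); cloud_mask: bool array, True = cloudy."""
    masked = np.where(cloud_mask, np.nan, stack)  # drop cloudy pixels
    return np.nanmedian(masked, axis=0)           # per-pixel median over time

# Tiny example: three "scenes" of a 1x2 image; one pixel is cloudy in scene 0.
stack = np.array([[[9.0, 0.2]], [[0.4, 0.3]], [[0.5, 0.4]]])
clouds = np.array([[[True, False]], [[False, False]], [[False, False]]])
composite = best_pixel_composite(stack, clouds)
print(composite)  # the spuriously bright cloudy value 9.0 is excluded
```

A real pipeline would derive the cloud mask from a QA band or a cloud-scoring algorithm; here it is simply given.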
Looking Up at Mars Rover Curiosity in Buckskin Selfie
2015-08-19
This low-angle self-portrait of NASA's Curiosity Mars rover shows the vehicle at the site from which it reached down to drill into a rock target called "Buckskin" on lower Mount Sharp. The selfie combines several component images taken by Curiosity's Mars Hand Lens Imager (MAHLI) on Aug. 5, 2015, during the 1,065th Martian day, or sol, of the rover's work on Mars. For scale, the rover's wheels are 20 inches (50 centimeters) in diameter and about 16 inches (40 centimeters) wide. This view is a portion of a larger panorama available at PIA19807. A close look reveals a small rock stuck onto Curiosity's left middle wheel (on the right in this head-on view). The rock had been seen previously during periodic monitoring of wheel condition about three weeks earlier, in the MAHLI raw image at http://mars.nasa.gov/msl/multimedia/raw/?rawid=1046MH0002640000400290E01_DXXX&s=1046. MAHLI is mounted at the end of the rover's robotic arm. For this self-portrait, the rover team positioned the camera lower in relation to the rover body than for any previous full self-portrait of Curiosity. This yielded a view that includes the rover's "belly," as in a partial self-portrait (/catalog/PIA16137) taken about five weeks after Curiosity's August 2012 landing inside Mars' Gale Crater. The selfie at Buckskin does not include the rover's robotic arm beyond a portion of the upper arm held nearly vertical from the shoulder joint. With the wrist motions and turret rotations used in pointing the camera for the component images, the arm was positioned out of the shot in the frames or portions of frames used in this mosaic. This process was used previously in acquiring and assembling Curiosity self-portraits taken at sample-collection sites "Rocknest" (PIA16468), "John Klein" (PIA16937), "Windjana" (PIA18390) and "Mojave" (PIA19142). MAHLI was built by Malin Space Science Systems, San Diego. 
NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19808
Extending the GI Brokering Suite to Support New Interoperability Specifications
NASA Astrophysics Data System (ADS)
Boldrini, E.; Papeschi, F.; Santoro, M.; Nativi, S.
2014-12-01
The GI brokering suite provides the discovery, access, and semantic brokers (i.e., GI-cat, GI-axe, GI-sem) that power a brokering framework for multi-disciplinary and multi-organizational interoperability. The GI suite has been successfully deployed in the framework of several programmes and initiatives, such as European Union funded projects, NSF BCube, and the intergovernmental coordinated effort Global Earth Observation System of Systems (GEOSS). Each GI suite broker facilitates interoperability for a particular functionality (i.e., discovery, access, semantic extension) between a set of brokered resources published by autonomous providers (e.g., data repositories, web services, semantic assets) and a set of heterogeneous consumers (e.g., client applications, portals, apps). A wide set of data models, encoding formats, and service protocols are already supported by the GI suite, such as those defined by international standardizing organizations like OGC and ISO (e.g., WxS, CSW, SWE, GML, netCDF) and by community specifications (e.g., THREDDS, OpenSearch, OPeNDAP, ESRI APIs). Using the GI suite, resources published by a particular community or organization through its specific technology (e.g., OPeNDAP/netCDF) can be transparently discovered, accessed, and used by different communities utilizing their preferred tools (e.g., a GIS visualizing WMS layers). Since information technology is a moving target, new standards and technologies continuously emerge and are adopted in the Earth Science context too. The GI brokering suite was therefore conceived to be flexible and to accommodate new interoperability protocols and data models. For example, the GI suite has recently added support for widely used specifications introduced to implement Linked Data, the Semantic Web, and specific community needs. Among others, these include: DCAT: an RDF vocabulary designed to facilitate interoperability between web data catalogs.
CKAN: a data management system for data distribution, particularly used by public administrations. CERIF: used by CRIS (Current Research Information System) instances. HYRAX Server: a scientific dataset publishing component. This presentation will discuss these and other latest GI suite extensions implemented to support new interoperability protocols in use by the Earth Science Communities.
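As a concrete illustration of the DCAT vocabulary mentioned above, a minimal dataset record can be written as JSON-LD. The titles and URLs below are hypothetical placeholders; only the dcat:/dct: terms come from the DCAT vocabulary itself:

```python
import json

# Minimal sketch of a DCAT dataset record in JSON-LD. Field values are
# illustrative, not from any real catalog.
dataset = {
    "@context": {
        "dcat": "http://www.w3.org/ns/dcat#",
        "dct": "http://purl.org/dc/terms/",
    },
    "@type": "dcat:Dataset",
    "dct:title": "Example sea-surface temperature grids",  # hypothetical
    "dct:description": "Daily SST fields (illustrative entry).",
    "dcat:keyword": ["oceanography", "SST"],
    "dcat:distribution": {
        "@type": "dcat:Distribution",
        "dcat:accessURL": "http://example.org/opendap/sst",  # hypothetical
        "dct:format": "application/x-netcdf",
    },
}
print(json.dumps(dataset, indent=2))
```

A broker can harvest such records and map them onto other discovery protocols (e.g., CSW or OpenSearch responses).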
Rubik, Beverly
2017-01-01
This study investigated whether short-term exposure to a passive online software application of purported subtle energy technology would affect heart rate variability (HRV) and associated autonomic nervous system measures. This was a randomized, double-blinded, sham-controlled clinical trial (RCT). The study took place in a nonprofit laboratory in Emeryville, California. Twenty healthy, nonsmoking subjects (16 females), aged 40-75 years, participated. Quantum Code Technology™ (QCT), a purported subtle energy technology, was delivered through a passive software application (Heart+ App) on a smartphone placed <1 m from subjects who were seated and reading a catalog. HRV was measured for 5 min in triplicate for each condition via finger plethysmography using a Food and Drug Administration medically approved HRV measurement device. Measurements were made at baseline and 35 min following exposure to the software applications. The following parameters were calculated and analyzed: heart rate, total power, standard deviation of normal-to-normal intervals, root mean square of successive differences, low frequency to high frequency ratio (LF/HF), low frequency (LF), and high frequency (HF). Paired samples t-tests showed that for the Heart+ App, mean LF/HF decreased (p = 9.5 × 10^-4), while mean LF decreased in a trend (p = 0.06), indicating reduced sympathetic dominance. Root mean square of successive differences increased for the Heart+ App, showing a possible trend (p = 0.09). Post-pre differences in LF/HF for sham compared with the Heart+ App were also significant (p < 0.008) by independent t-test, indicating clinical relevance. Significant beneficial changes in mean LF/HF, along with possible trends in mean LF and root mean square of successive differences, were observed in subjects following 35 min of exposure to the Heart+ App working in the background on an active smartphone untouched by the subjects.
This may be the first RCT to show that specific frequencies of a purported non-Hertzian type of subtle energy conveyed by software applications broadcast from personal electronic devices can be bioactive and beneficially impact autonomic nervous system balance.
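For readers unfamiliar with the HRV parameters reported above, the LF/HF ratio is conventionally computed from the RR-interval series via a spectral estimate over standard frequency bands (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz). The sketch below illustrates the standard definition with synthetic data; it is not the study's own analysis pipeline nor the measurement device's method:

```python
import numpy as np

def lf_hf_ratio(rr_s, fs=4.0):
    """Estimate LF/HF from successive RR intervals (seconds)."""
    beat_times = np.cumsum(rr_s)
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)  # even resampling
    rr_even = np.interp(grid, beat_times, rr_s)
    rr_even = rr_even - rr_even.mean()
    freqs = np.fft.rfftfreq(len(rr_even), d=1.0 / fs)
    power = np.abs(np.fft.rfft(rr_even)) ** 2      # crude periodogram
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf

# Synthetic RR series with a strong ~0.25 Hz (respiratory, HF-band) modulation:
beats = np.arange(300)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * beats * 0.8)
print(f"LF/HF = {lf_hf_ratio(rr):.3f}")  # HF dominates, so the ratio is small
```

Production HRV software typically uses Welch's method and artifact rejection; the bare periodogram here is only meant to show where the LF and HF band powers come from.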
NASA Space Biology Plant Research for 2010-2020
NASA Technical Reports Server (NTRS)
Levine, H. G.; Tomko, D. L.; Porterfield, D. M.
2012-01-01
The U.S. National Research Council (NRC) recently published "Recapturing a Future for Space Exploration: Life and Physical Sciences Research for a New Era" (http://www.nap.edu/catalog.php?record_id=13048), and NASA completed a Space Biology Science Plan to develop a strategy for implementing its recommendations (http://www.nasa.gov/exploration/library/esmd_documents.html). The most important recommendations of the NRC report on plant biology in space were that NASA should: (1) investigate the roles of microbial-plant systems in long-term bioregenerative life support systems, and (2) establish a robust spaceflight program of research analyzing plant growth and physiological responses to the multiple stimuli encountered in spaceflight environments. These efforts should take advantage of recently emerged analytical technologies (genomics, transcriptomics, proteomics, metabolomics) and apply modern cellular and molecular approaches in the development of a vigorous flight-based and ground-based research program. This talk will describe NASA's strategy and plans for implementing these NRC Plant Space Biology recommendations. New research capabilities for plant biology, optimized by providing state-of-the-art automated technology and analytical techniques to maximize scientific return, will be described. Flight experiments will use the most appropriate platform to achieve science results (e.g., ISS, free flyers, sub-orbital flights), and NASA will work closely with its international partners and other U.S. agencies to achieve its objectives. One of NASA's highest priorities in Space Biology is the development of research capabilities for use on the International Space Station and other flight platforms for studying multiple generations of large plants. NASA will issue recurring NASA Research Announcements (NRAs) that include a rapid turn-around model to more fully engage the biology community in designing experiments to respond to the NRC recommendations.
In doing so, NASA's Space Biology research will optimize ISS research utilization, develop and demonstrate technology and hardware that will enable new science, and contribute to the base of fundamental knowledge that will facilitate development of new tools for human space exploration and Earth applications. By taking these steps, NASA will energize the Space Biology user community and advance our knowledge of the effect of the space flight environment on living systems.
Model of Mars-Bound MarCO CubeSat
2015-06-12
Engineers for NASA's MarCO technology demonstration display a full-scale mechanical mock-up of the small craft in development as part of NASA's next mission to Mars. Mechanical engineer Joel Steinkraus and systems engineer Farah Alibay are on the team at NASA's Jet Propulsion Laboratory, Pasadena, California, preparing twin MarCO (Mars Cube One) CubeSats for a March 2016 launch. MarCO is the first interplanetary mission using CubeSat technologies for small spacecraft. The briefcase-size MarCO twins will ride along on an Atlas V launch vehicle lifting off from Vandenberg Air Force Base, California, with NASA's next Mars lander, InSight. The mock-up in the photo is in a configuration to show the deployed position of components that correspond to MarCO's two solar panels and two antennas. During launch, those components will be stowed for a total vehicle size of about 14.4 inches (36.6 centimeters) by 9.5 inches (24.3 centimeters) by 4.6 inches (11.8 centimeters). After launch, the two MarCO CubeSats and InSight will be navigated separately to Mars. The MarCO twins will fly past the planet in September 2016 just as InSight is descending through the atmosphere and landing on the surface. MarCO is a technology demonstration mission to relay communications from InSight to Earth during InSight's descent and landing. InSight communications during that critical period will also be recorded by NASA's Mars Reconnaissance Orbiter for delayed transmission to Earth. InSight -- an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport -- will study the interior of Mars to improve understanding of the processes that formed and shaped rocky planets, including Earth. Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission.
The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. http://photojournal.jpl.nasa.gov/catalog/PIA19389
Size Contrast for Mars CubeSat
2015-06-12
The full-scale mock-up of NASA's MarCO CubeSat held by Farah Alibay, a systems engineer at NASA's Jet Propulsion Laboratory, is dwarfed by the one-half-scale model of NASA's Mars Reconnaissance Orbiter behind her. MarCO, short for Mars Cube One, is the first interplanetary use of CubeSat technologies for small spacecraft. JPL is preparing two MarCO twins for launch in March 2016. They will ride along on an Atlas V launch vehicle lifting off from Vandenberg Air Force Base, California, with NASA's next Mars lander, InSight. MarCO is a technology demonstration aspect of the InSight mission. The mock-up in the photo is in a configuration to show the deployed position of components that correspond to MarCO's two solar panels and two antennas. During launch, those components will be stowed for a total vehicle size of about 14.4 inches (36.6 centimeters) by 9.5 inches (24.3 centimeters) by 4.6 inches (11.8 centimeters). After launch, the two MarCO CubeSats and InSight will be navigated separately to Mars. The MarCO twins will fly past the planet in September 2016 just as InSight is descending through the atmosphere and landing on the surface. MarCO is a technology demonstration to relay communications from InSight to Earth during InSight's descent and landing. InSight communications during that critical period will also be recorded by NASA's Mars Reconnaissance Orbiter for delayed transmission to Earth. InSight -- an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport -- will study the interior of Mars to improve understanding of the processes that formed and shaped rocky planets, including Earth. Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission. The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. 
http://photojournal.jpl.nasa.gov/catalog/PIA19671
The BCube Crawler: Web Scale Data and Service Discovery for EarthCube.
NASA Astrophysics Data System (ADS)
Lopez, L. A.; Khalsa, S. J. S.; Duerr, R.; Tayachow, A.; Mingo, E.
2014-12-01
Web crawling, a core component of the NSF-funded BCube project, is researching and applying big data technologies to find and characterize different types of web services, catalog interfaces, and data feeds, such as ESIP OpenSearch, OGC W*S, THREDDS, and OAI-PMH, that describe or provide access to scientific datasets. Given the scale of the Internet, which challenges even large search providers such as Google, the BCube plan for discovering these web-accessible services is to subdivide the problem into three smaller, more tractable issues: first, to discover likely sites where relevant data and data services might be found; second, to deeply crawl the sites discovered to find any data and services present; and last, to leverage semantic technologies to characterize the services and data found and to filter out everything but those relevant to the geosciences. To address the first two challenges, BCube uses an adapted version of Apache Nutch (which originated Hadoop), a web-scale crawler, and Amazon's ElasticMapReduce service for flexibility and cost effectiveness. For characterization of the services found, BCube is examining existing web service ontologies for their applicability to our needs and will re-use and/or extend them in order to query for services with specific, well-defined characteristics of scientific datasets, such as the use of geospatial namespaces. The original proposal for the crawler won a grant from Amazon's academic program, which allowed us to become operational; we successfully tested the BCube crawler at web scale, obtaining a corpus sizeable enough to enable work on characterization of the services and data found. There is still plenty of work to be done: performing "smart crawls" by managing the frontier, developing and enhancing our scoring algorithms, and fully implementing the semantic characterization technologies.
We describe the current status of the project, our successes and issues encountered. The final goal of the BCube crawler project is to provide relevant data services to other projects on the EarthCube stack and third party partners so they can be brokered and used by a wider scientific community.
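A first-pass characterization of discovered endpoints can be illustrated with simple URL heuristics of the kind a crawler might apply before deeper semantic classification. The patterns below reflect common service-endpoint conventions and are not the BCube project's actual rules:

```python
import re

# Illustrative rule-based classification of crawled URLs by service type.
SERVICE_PATTERNS = [
    ("OGC-WMS",    re.compile(r"service=wms", re.I)),
    ("OGC-WFS",    re.compile(r"service=wfs", re.I)),
    ("THREDDS",    re.compile(r"/thredds/.*catalog\.xml", re.I)),
    ("OpenSearch", re.compile(r"opensearch", re.I)),
    ("OAI-PMH",    re.compile(r"verb=identify|/oai", re.I)),
]

def classify_url(url):
    """Return the first matching service label, or 'unknown'."""
    for label, pattern in SERVICE_PATTERNS:
        if pattern.search(url):
            return label
    return "unknown"

print(classify_url("http://example.org/thredds/dodsC/catalog.xml"))        # THREDDS
print(classify_url("http://example.org/wms?SERVICE=WMS&REQUEST=GetCapabilities"))  # OGC-WMS
```

In practice URL heuristics only narrow the candidates; confirming a service type requires fetching and parsing its capabilities or catalog response.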
New Instruments for Survey: On-Line Software for 3D Reconstruction from Images
NASA Astrophysics Data System (ADS)
Fratus de Balestrini, E.; Guerra, F.
2011-09-01
3D scanning technologies have developed significantly and have been widely used in the documentation of cultural, architectural, and archaeological heritage. Modern methods of three-dimensional acquisition and modeling make it possible to represent an object through a digital model that combines the visual potential of images (normally used for documentation) with the accuracy of the survey, serving both as a support for visualization and for metric evaluation of any artifact of historical or artistic interest, and opening up new possibilities for the fruition, cataloging, and study of cultural heritage. Despite this development, because of the small market and the sophisticated technology of 3D laser scanners, these instruments are very expensive and beyond the reach of most operators in the cultural heritage field. For this reason, low-cost or even free technologies have appeared, allowing anyone to approach the issues of acquisition and 3D modeling and providing tools to create three-dimensional models in a simple and economical way. The research conducted by the Laboratory of Photogrammetry of the University IUAV of Venice, of which we present here some results, is intended to determine whether Arc3D can produce results comparable, in terms of overall quality, to those of a laser scanner, and/or whether the two can be integrated. A series of tests was carried out on certain types of objects: models made with Arc3D from raster images were compared with those obtained from laser-scanner point clouds. We have also analyzed the conditions for optimal use of Arc3D: environmental conditions (lighting), acquisition tools (digital cameras), and the type and size of objects. After performing the tests described above, we analyzed the models generated by Arc3D to check what other graphic representations can be obtained from them: orthophotos and drawings.
The result of the research is a critical analysis of the software's potential, indicating the areas in which it can be used effectively as an alternative to other survey methods.
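A common way to make the comparison described above quantitative is a cloud-to-cloud distance: for each point of the image-based model, the distance to its nearest neighbor in the laser-scanner reference cloud. The sketch below uses synthetic point sets and brute-force nearest neighbors (a real tool would use a k-d tree); it illustrates the metric, not the Arc3D workflow:

```python
import numpy as np

rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 1.0, size=(200, 3))            # "laser scanner" cloud
model = reference[:100] + rng.normal(0.0, 0.002, (100, 3))  # noisy image-based cloud

# Nearest-neighbor (cloud-to-cloud) distance from each model point to the
# reference cloud; brute force is fine at this size.
diff = model[:, None, :] - reference[None, :, :]
dists = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)
print(f"mean deviation: {dists.mean():.4f}")
print(f"95th percentile: {np.percentile(dists, 95):.4f}")
```

Summary statistics of these distances (mean, percentiles, or a color-mapped deviation plot) are what comparison reports typically present.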
Astronomical Data Center Bulletin, volume 1, number 2
NASA Technical Reports Server (NTRS)
Nagy, T. A.; Warren, W. H., Jr.; Mead, J. M.
1981-01-01
Work in progress on astronomical catalogs is presented in 16 papers. Topics cover astronomical data center operations; automatic astronomical data retrieval at GSFC; interactive computer reference search of astronomical literature 1950-1976; formatting, checking, and documenting machine-readable catalogs; interactive catalog of UV, optical, and HI data for 201 Virgo cluster galaxies; machine-readable version of the general catalog of variable stars, third edition; galactic latitude and magnitude distribution of two astronomical catalogs; the catalog of open star clusters; the infrared astronomical database and catalog of infrared observations; the Air Force Geophysics Laboratory; revised magnetic tape of the N30 catalog of 5,268 standard stars; positional correlation of the two-micron sky survey and Smithsonian Astrophysical Observatory catalog sources; search capabilities for the catalog of stellar identifications (CSI) 1979 version; CSI statistics: blue magnitude versus spectral type; catalogs available from the Astronomical Data Center; and a status report on machine-readable astronomical catalogs.
Cataloging Practices in India: Efforts for Standardization.
ERIC Educational Resources Information Center
Tikku, Upinder Kumar
1984-01-01
Surveys current cataloging practices in Indian libraries and discusses standardization in cataloging, types of catalogs, cataloging codes (Anglo-American and Ranganathan), subject headings, descriptive cataloging, and standardization efforts (international, United States, USSR, Great Britain, India). Footnotes are included. (EJS)
Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging
NASA Astrophysics Data System (ADS)
Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.
2017-10-01
Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (~0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.
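The condensation step can be illustrated with a toy one-dimensional example: each posterior sample is a list of source positions, and condensation groups associated detections across samples, keeping those present in enough samples along with a marginalized mean position. This is a sketch of the idea only, not the authors' algorithm; the matching radius and threshold are invented:

```python
import numpy as np

# Toy catalog ensemble: four posterior samples of 1-D source positions.
samples = [
    np.array([10.02, 25.1]),        # sample 0: two sources
    np.array([9.97, 25.0, 40.3]),   # sample 1: an extra marginal source
    np.array([10.05, 24.9]),
    np.array([10.00, 25.2]),
]

def condense(samples, match_radius=0.5, min_fraction=0.5):
    """Group detections across samples; keep sources present often enough."""
    detections = np.concatenate(samples)
    detections.sort()
    groups, current = [], [detections[0]]
    for x in detections[1:]:
        if x - current[-1] <= match_radius:   # associate nearby detections
            current.append(x)
        else:
            groups.append(current)
            current = [x]
    groups.append(current)
    out = []
    for g in groups:
        frac = len(g) / len(samples)          # fraction of samples with this source
        if frac >= min_fraction:
            out.append((float(np.mean(g)), min(frac, 1.0)))
    return out

print(condense(samples))  # two confident sources near 10 and 25; 40.3 is dropped
```

The reported mean position marginalizes over the per-sample scatter, and the sample fraction plays the role of a detection confidence.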
U.S. Spacesuit Knowledge Capture Series Catalog
NASA Technical Reports Server (NTRS)
Bitterly, Rose; Oliva, Vladenka
2012-01-01
The National Aeronautics and Space Administration (NASA) and other organizations have been performing U.S. Spacesuit Knowledge Capture (USSKC) since the beginning of space exploration through published reports, conference presentations, specialized seminars, and classes instructed by veterans in the field. The close physical interaction between spacesuit systems and human beings makes them among the most personally evocative pieces of space hardware. Consequently, spacesuit systems have required nearly constant engineering refinements to do their jobs without impinging on human activity. Since 2008, spacesuit knowledge capture has occurred through video recordings in which both current and former specialists present technical material, specifically to educate individuals and preserve knowledge. These archives reflect the rich history of spacesuit development and will provide knowledge that enhances the chances of success for future, more ambitious spacesuit system programs. The scope and topics of USSKC have included lessons learned in spacesuit technology; experience from the Gemini, Apollo, Skylab, and Shuttle Programs; the process of hardware certification, design, development, and other program components; spacesuit evolution and experience; failure analysis and resolution; and aspects of program management. USSKC activities have progressed to a level where NASA, the National Air and Space Museum (NASM), Hamilton Sundstrand (HS), and the spacesuit community are now working together to provide a comprehensive way to organize and archive intra-agency information related to the development of spacesuit systems. These video recordings are currently being reviewed for public release using NASA export control processes.
After a decision is made for either public or non-public release (internal NASA only), the videos and presentations will be available through the NASA Johnson Space Center Engineering Directorate (EA) Engineering Academy, the NASA Technical Reports Server (NTRS), the NASA Aeronautics & Space Database (NA&SD), or NASA YouTube. Event availability is duly noted in this catalog.
Collaborative workbench for cyberinfrastructure to accelerate science algorithm development
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Maskey, M.; Kuo, K.; Lynnes, C.
2013-12-01
There are significant untapped resources for information and knowledge creation within the Earth Science community in the form of data, algorithms, services, analysis workflows or scripts, and the related knowledge about these resources. Despite the huge growth in social networking and collaboration platforms, these resources often reside on an investigator's workstation or laboratory and are rarely shared. A major reason for this is that there are very few scientific collaboration platforms, and those that exist typically require the use of a new set of analysis tools and paradigms to leverage the shared infrastructure. As a result, adoption of these collaborative platforms for science research is inhibited by the high cost to an individual scientist of switching from his or her own familiar environment and set of tools to a new environment and tool set. This presentation will describe an ongoing project developing an Earth Science Collaborative Workbench (CWB). The CWB approach will eliminate this barrier by augmenting a scientist's current research environment and tool set to allow him or her to easily share diverse data and algorithms. The CWB will leverage evolving technologies such as commodity computing and social networking to design an architecture for scalable collaboration that will support the emerging vision of an Earth Science Collaboratory. The CWB is being implemented on the robust and open source Eclipse framework and will be compatible with widely used scientific analysis tools such as IDL. The myScience Catalog built into CWB will capture and track metadata and provenance about data and algorithms for the researchers in a non-intrusive manner with minimal overhead. Seamless interfaces to multiple Cloud services will support sharing algorithms, data, and analysis results, as well as access to storage and computer resources. A Community Catalog will track the use of shared science artifacts and manage collaborations among researchers.
NASA Astrophysics Data System (ADS)
Samodra, G.; Chen, G.; Sartohadi, J.; Kasama, K.
2018-04-01
This paper proposes an approach for landslide inventory mapping that considers actual conditions in Indonesia. No satisfactory landslide database exists; what exists is inadequate, focusing on response rather than on pre-disaster preparedness and planning. The humid tropical climate also leads to rapid vegetation growth, so past landslide signatures are covered by vegetation or dismantled by erosion. Generating a landslide inventory using standard techniques therefore remains difficult. A catalog of disasters from local government (village level) was used as the basis of participatory landslide inventory mapping. Eyewitnesses or landslide disaster victims were asked to participate in the reconstruction of past landslides. Field investigation, focusing on active participation from communities with the use of an innovative technology, was used to verify the landslide events recorded in the disaster catalog. Statistical analysis was also used to obtain the necessary relationships between geometric measurements, including the height of the slope and length of runout, area and volume of displaced materials, the probability distributions of landslide area and volume, and mobilization rate. The results show that runout distance is proportional to the height of the slope. The frequency distribution, calculated using a non-cumulative distribution, empirically exhibits a power law (fractal statistics), even though rollover can also be found in the dataset. This cannot be the result of a censoring effect or incompleteness of the data, because the landslide inventory dataset can be classified as having complete or nearly complete data. The participatory landslide inventory mapping method is expected to overcome the difficulties of landslide inventory mapping and can be applied to support pre-disaster planning and preparedness to reduce landslide disaster risk in Indonesia. It may also supplement the usually incomplete data in a typical landslide inventory.
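The power-law (fractal) frequency statistics mentioned above can be illustrated by fitting the non-cumulative frequency density of landslide areas in log-log space. The data below are synthetic, drawn from a power law with a known exponent so the fit can be checked; this is a generic sketch, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(42)
beta = 2.4               # assumed "true" exponent: p(A) ~ A^(-beta)
a_min = 100.0            # lower cutoff in m^2
u = rng.uniform(size=5000)
areas = a_min * (1.0 - u) ** (-1.0 / (beta - 1.0))  # inverse-CDF sampling

# Log-binned, non-cumulative frequency density, fitted in log-log space.
bins = np.logspace(2, 6, 25)
counts, edges = np.histogram(areas, bins=bins)
centers = np.sqrt(edges[:-1] * edges[1:])           # geometric bin centers
density = counts / np.diff(edges)                   # counts per unit area
keep = density > 0
slope, _ = np.polyfit(np.log10(centers[keep]), np.log10(density[keep]), 1)
print(f"fitted exponent: {-slope:.2f}")             # roughly recovers beta
```

Real inventories show the rollover noted in the abstract at small areas, so such fits are usually restricted to areas above the rollover.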
Tsunami Data and Scientific Data Diplomacy
NASA Astrophysics Data System (ADS)
Arcos, N. P.; Dunbar, P. K.; Gusiakov, V. K.; Kong, L. S. L.; Aliaga, B.; Yamamoto, M.; Stroker, K. J.
2016-12-01
Free and open access to data and information fosters scientific progress and can build bridges between nations even when political relationships are strained. Data and information held by one stakeholder may be vital for promoting the research of another. As an emerging field of inquiry, data diplomacy explores how data-sharing helps create and support positive relationships between countries to enable the use of data for societal and humanitarian benefit. Tsunami is arguably the only natural hazard that has been addressed so effectively at an international scale, and it illustrates the success of scientific data diplomacy. Tsunami mitigation requires international scientific cooperation in both tsunami science and technology development. This requires not only international agreements but also working-level relationships between scientists from countries that may have different political and economic policies. For example, following the Pacific-wide tsunami of 1960 that killed two thousand people in Chile and then, up to a day later, hundreds in Hawaii, Japan, and the Philippines, delegates from twelve countries met to discuss and draft the requirements for an international tsunami warning system. The Pacific Tsunami Warning System led to the development of local, regional, and global tsunami databases and catalogs. For example, scientists at NOAA/NCEI and the Tsunami Laboratory/Russian Academy of Sciences have collaborated on their tsunami catalogs, which are now routinely accessed by scientists and the public around the world. These data support decision-making during tsunami events and are used in developing inundation and evacuation maps and hazard assessments. This presentation will include additional examples of agreements for data-sharing between countries, as well as challenges in standardization and consistency among the tsunami research community. Tsunami data and scientific data diplomacy have ultimately improved understanding of tsunamis and their associated impacts.
NASA Astrophysics Data System (ADS)
Lawton, Brandon L.; Smith, D. A.; SMD Astrophysics E/PO Community, NASA
2013-01-01
The NASA Science Education and Public Outreach Forums support the NASA Science Mission Directorate (SMD) and its education and public outreach (E/PO) community in enhancing the coherence, efficiency, and effectiveness of SMD-funded E/PO programs. As a part of this effort, the Astrophysics Forum is coordinating a collaborative project among the NASA SMD astrophysics missions and E/PO programs to create a broader impact for the use of real NASA data in classrooms. Among NASA's major education goals is the training of students in the Science, Technology, Engineering, and Math (STEM) disciplines. The use of real data, from some of the most sophisticated observatories in the world, provides educators with an authentic opportunity to teach students basic science process skills, inquiry, and real-world applications of the STEM subjects. The goal of this NASA SMD astrophysics community collaboration is to find a way to maximize the reach of existing real-data products produced by E/PO professionals working with NASA E/PO grants and missions in ways that enhance the teaching of the STEM subjects. We present an initial result of our collaboration: defining levels of basic science process skills that lie at the heart of authentic scientific research and national education standards (AAAS Benchmarks), with examples of NASA data products that align with those levels. Our results are the beginning of a larger goal of utilizing the new NASA education resource catalog, NASA Wavelength, for the creation of progressions that tie NASA education resources together. We aim to create an informational sampler that illustrates how an educator can use the NASA Wavelength resource catalog to connect NASA real-data resources that meet the educational goals of their class.
Astronomical catalog desk reference, 1994 edition
NASA Technical Reports Server (NTRS)
1994-01-01
The Astronomical Catalog Desk Reference is designed to aid astronomers in locating machine readable catalogs in the Astronomical Data Center (ADC) archives. The key reference components of this document are as follows: A listing of shortened titles for all catalogs available from the ADC (includes the name of the lead author and year of publication), brief descriptions of over 300 astronomical catalogs, an index of ADC catalog numbers by subject keyword, and an index of ADC catalog numbers by author. The heart of this document is the set of brief descriptions generated by the ADC staff. The 1994 edition of the Astronomical Catalog Desk Reference contains descriptions for over one third of the catalogs in the ADC archives. Readers are encouraged to refer to this section for concise summaries of those catalogs and their contents.
Design of Formats and Packs of Catalog Cards.
ERIC Educational Resources Information Center
OCLC Online Computer Library Center, Inc., Dublin, OH.
The three major functions of the Ohio College Library Center's Shared Cataloging System are: 1) provision of union catalog location listing; 2) making available cataloging done by one library to all other users of the system; and 3) production of catalog cards. The system, based on a central machine readable data base, speeds cataloging and…
The Catalog as Portal to the Internet.
ERIC Educational Resources Information Center
Thomas, Sarah E.
This paper examines the potential of the library catalog to serve as a portal to the Internet. The first section provides an overview of the development of the catalog, including the emergence of the union catalog, standardization of cataloging practice, MARC format, and the insufficiency of resources to catalog all the titles acquired by…
41 CFR 101-30.401 - Data available from the Federal Catalog System.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Federal Catalog System. 101-30.401 Section 101-30.401 Public Contracts and Property Management Federal...-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401 Data available from the Federal Catalog System. Federal Catalog System data are available in publications of general interest to...
41 CFR 101-30.401 - Data available from the Federal Catalog System.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Federal Catalog System. 101-30.401 Section 101-30.401 Public Contracts and Property Management Federal...-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401 Data available from the Federal Catalog System. Federal Catalog System data are available in publications of general interest to...
41 CFR 101-30.401 - Data available from the Federal Catalog System.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Federal Catalog System. 101-30.401 Section 101-30.401 Public Contracts and Property Management Federal...-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401 Data available from the Federal Catalog System. Federal Catalog System data are available in publications of general interest to...
41 CFR 101-30.401 - Data available from the Federal Catalog System.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Federal Catalog System. 101-30.401 Section 101-30.401 Public Contracts and Property Management Federal...-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401 Data available from the Federal Catalog System. Federal Catalog System data are available in publications of general interest to...
41 CFR 101-30.401 - Data available from the Federal Catalog System.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Federal Catalog System. 101-30.401 Section 101-30.401 Public Contracts and Property Management Federal...-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401 Data available from the Federal Catalog System. Federal Catalog System data are available in publications of general interest to...
Weaves as an Interconnection Fabric for ASIM's and Nanosatellites
NASA Technical Reports Server (NTRS)
Gorlick, Michael M.
1995-01-01
Many of the micromachines under consideration require computer support; indeed, one of the appeals of this technology is the ability to intermix mechanical, optical, analog, and digital devices on the same substrate. The amount of computer power is rarely an issue; the sticking point is the complexity of the software required to make effective use of these devices. Micromachines are the nano-technologist's equivalent of 'golden screws'. In other words, they will be piece parts in larger assemblages. For example, a nano-satellite may be composed of stacked silicon wafers where each wafer contains hundreds to thousands of micromachines, digital controllers, general purpose computers, memories, and high-speed bus interconnects. Comparatively few of these devices will be custom designed; most will be stock parts selected from libraries and catalogs. The novelty will lie in the interconnections. For example, a digital accelerometer may be a component part in an adaptive suspension, a monitoring element embedded in the wrapper of a package, or a portion of the smart skin of a launch vehicle. In each case, this device must inter-operate with other devices and probes for the purposes of command, control, and communication. We propose a software technology called 'weaves' that will permit large collections of micromachines and their attendant computers to freely intercommunicate while preserving modularity, transparency, and flexibility. Weaves are composed of networks of communicating software components. The network, and the components comprising it, may be changed even while the software, and the devices it controls, are executing. This unusual degree of software plasticity permits micromachines to dynamically adapt the software to changing conditions and allows system engineers to rapidly and inexpensively develop special purpose software by assembling stock software components in custom configurations.
Space Images for NASA JPL Android Version
NASA Technical Reports Server (NTRS)
Nelson, Jon D.; Gutheinz, Sandy C.; Strom, Joshua R.; Arca, Jeremy M.; Perez, Martin; Boggs, Karen; Stanboli, Alice
2013-01-01
This software addresses the demand for easily accessible NASA JPL images and videos by providing a user-friendly, simple graphical user interface that can be run via the Android platform from any location where an Internet connection is available. This app is complementary to the iPhone version of the application. A backend infrastructure stores, tracks, and retrieves space images from the JPL Photojournal and Institutional Communications Web server, and catalogs the information into a streamlined rating infrastructure. This system consists of four distinguishing components: image repository, database, server-side logic, and Android mobile application. The image repository contains images from various JPL flight projects. The database stores the image information as well as the user rating. The server-side logic retrieves the image information from the database and categorizes each image for display. The Android mobile application is an interfacing delivery system that retrieves the image information from the server for each Android mobile device user. Also created is a reporting and tracking system for charting and monitoring usage. Unlike other Android mobile image applications, this system uses the latest emerging technologies to produce image listings based directly on user input. This allows for countless combinations of images returned. The backend infrastructure uses industry-standard coding and database methods, enabling future software improvement and technology updates. The flexibility of the system design framework permits multiple levels of display possibilities and provides integration capabilities. Unique features of the software include image/video retrieval from a selected set of categories, image Web links that can be shared among e-mail users, sharing to Facebook/Twitter, marking as user's favorites, and image metadata searchable for instant results.
A New Satellite System for Measuring BRDF from Space
NASA Technical Reports Server (NTRS)
Wiscombe, W.; Kaufman, Y.; Herman, J.
1999-01-01
Formation flying of satellites is at the beginning of an explosive growth curve. Spacecraft buses are shrinking to the point where we will soon be able to launch 10 micro-satellites or 100 nano-satellites on a single launch vehicle. Simultaneously, spectrometers are just beginning to be flown in space by both the U.S. and Europe. On-board programmable band aggregation will soon allow exactly the spectral bands desired to be returned to Earth. Further efforts are being devoted to radically shrink spectrometers both in size and weight. And GPS positioning and attitude determination, plus new technologies for attitude control, will allow fleets of satellites to all point at the same Earth target. All these advances, in combination, make possible for the first time the proper measurement of the Bidirectional Reflectance Distribution Function (BRDF) from space. Previously, space BRDFs were mere composites, built up over time by viewing different types of scenes at different times, then creating catalogs of BRDF functions whose use relied upon correct "scene identification," which was the weak link. Formation-flying micro-satellites, carrying programmable spectrometers and precision-pointing at the same Earth target, can measure the full BRDF simultaneously, in real time. This talk will review these technological advances and discuss an actual proposed concept, based on these advances, to measure Earth-target BRDFs (clouds as well as surface) across the full solar spectrum in the 2010 timeframe. This concept is part of a larger concept called Leonardo for properly measuring the radiative forcing of Earth for climate purposes; lack of knowledge of the BRDF and of the diurnal cycle are at present the two limiting factors preventing improved estimates of this forcing.
Thompson, John W; Sorum, Alexander W; Hsieh-Wilson, Linda C
2018-06-23
The dynamic posttranslational modification O-linked β-N-acetylglucosamine glycosylation (O-GlcNAcylation) is present on thousands of intracellular proteins in the brain. Like phosphorylation, O-GlcNAcylation is inducible and plays important functional roles in both physiology and disease. Recent advances in mass spectrometry (MS) and bioconjugation methods are now enabling the mapping of O-GlcNAcylation events to individual sites in proteins. However, our understanding of which glycosylation events are necessary for regulating protein function and controlling specific processes, phenotypes, or diseases remains in its infancy. Given the sheer number of O-GlcNAc sites, methods are greatly needed to identify promising sites and prioritize them for time- and resource-intensive functional studies. Revealing sites that are dynamically altered by different stimuli or disease states will likely go a long way in this regard. Here, we describe advanced methods for identifying O-GlcNAc sites on individual proteins and across the proteome, and for determining their stoichiometry in vivo. We also highlight emerging technologies for quantitative, site-specific MS-based O-GlcNAc proteomics (O-GlcNAcomics), which allow proteome-wide tracking of O-GlcNAcylation dynamics at individual sites. These cutting-edge technologies are beginning to bridge the gap between the high-throughput cataloging of O-GlcNAcylated proteins and the relatively low-throughput study of individual proteins. By uncovering the O-GlcNAcylation events that change in specific physiological and disease contexts, these new approaches are providing key insights into the regulatory functions of O-GlcNAc in the brain, including their roles in neuroprotection, neuronal signaling, learning and memory, and neurodegenerative diseases.
Boo! Outsourcing from the Cataloging Perspective.
ERIC Educational Resources Information Center
Hill, Janet Swan
1998-01-01
Examines long-accepted ways library cataloging departments have used outsourcing (cataloging records, card production, authority control, card filing, and retrospective conversion) and potential outsourcing activities (original cataloging, and copy cataloging). Discusses reasons why outsourcing is controversial. (PEN)
A Use Study of the Card Catalogs in the University of Illinois Music Library.
ERIC Educational Resources Information Center
Drone, Jeanette M.
1984-01-01
A multifaceted card catalog use study was conducted at University of Illinois Music Library to determine hourly rate of use at sound recording and book/music catalogs; time spent at catalogs; who uses catalogs and why; difficulties users encounter; success rate of users' searches; recommendations for designing online catalog. (16 references)…
Korea Institute for Advanced Study Value-Added Galaxy Catalog
NASA Astrophysics Data System (ADS)
Choi, Yun-Young; Han, Du-Hwan; Kim, Sungsoo S.
2010-12-01
We present the Korea Institute for Advanced Study Value-Added Galaxy Catalog (KIAS VAGC), a catalog of galaxies based on the Large Scale Structure (LSS) sample of the New York University Value-Added Galaxy Catalog (NYU VAGC) Data Release 7. Our catalog supplements redshifts of 10,497 galaxies with 10 < r_{P} ≤ 17.6 (1455 with 10 < r_{P} ≤ 14.5) to the NYU VAGC LSS sample. Redshifts from various existing catalogs, such as the Updated Zwicky Catalog, the IRAS Point Source Catalog Redshift Survey, the Third Reference Catalogue of Bright Galaxies, and the Two Degree Field Galaxy Redshift Survey, have been incorporated into the NYU VAGC photometric catalog. Our supplementation significantly improves spectroscopic completeness: the area covered by the spectroscopic sample with completeness higher than 95% increases from 1.737 to 2.119 sr. Our catalog also provides morphological types of all galaxies, determined by the automated morphology classification scheme of Park & Choi (2005), and related parameters, together with fundamental photometry parameters supplied by the NYU VAGC. Our catalog contains matches to objects in the Max Planck Institute for Astrophysics (MPA) & Johns Hopkins University (JHU) spectrum measurements (Data Release 7). This new catalog, the KIAS VAGC, is complementary to the NYU VAGC and the MPA-JHU catalog.
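Supplementing missing redshifts from external catalogs is, at bottom, a positional cross-match. The sketch below is a generic illustration under assumed record layouts and a hypothetical 3-arcsecond tolerance, not the KIAS VAGC pipeline; real pipelines would also use a spatial index rather than a brute-force scan.

```python
import math

def angular_sep_arcsec(ra1, dec1, ra2, dec2):
    """Great-circle separation (arcsec) between two sky positions
    given in decimal degrees, via the haversine formula."""
    p1, p2 = math.radians(dec1), math.radians(dec2)
    dra = math.radians(ra2 - ra1)
    a = (math.sin((p2 - p1) / 2) ** 2 +
         math.cos(p1) * math.cos(p2) * math.sin(dra / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a))) * 3600.0

def supplement_redshifts(photometric, external, tol_arcsec=3.0):
    """Fill missing 'z' entries in a photometric catalog from the
    nearest external-catalog source within tol_arcsec.
    Returns the number of redshifts filled. O(n*m) sketch."""
    filled = 0
    for gal in photometric:
        if gal['z'] is not None:
            continue
        best = min(external, key=lambda s: angular_sep_arcsec(
            gal['ra'], gal['dec'], s['ra'], s['dec']))
        if angular_sep_arcsec(gal['ra'], gal['dec'],
                              best['ra'], best['dec']) <= tol_arcsec:
            gal['z'] = best['z']
            filled += 1
    return filled

# Toy catalogs: one photometric galaxy lacks a redshift
phot = [{'ra': 150.0000, 'dec': 2.0000, 'z': None},
        {'ra': 151.0000, 'dec': 2.5000, 'z': 0.08}]
ext = [{'ra': 150.0002, 'dec': 2.0001, 'z': 0.0213}]
print(supplement_redshifts(phot, ext))
```

Spectroscopic completeness per sky region can then be recomputed from the fraction of entries with non-null `z`.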
The Future of Catalogers and Cataloging.
ERIC Educational Resources Information Center
Holley, Robert P.
1981-01-01
Future emphasis in cataloging will be on the sharing of high quality bibliographic records through a national network. As original cataloging decreases, catalogers, rather than disappearing, will more likely be managers of the library's bibliographic control system. (Author/RAA)
Integrating historical clinical and financial data for pharmacological research.
Deshmukh, Vikrant G; Sower, N Brett; Hunter, Cheri Y; Mitchell, Joyce A
2011-11-18
Retrospective research requires longitudinal data, and repositories derived from electronic health records (EHR) can be sources of such data. With Health Information Technology for Economic and Clinical Health (HITECH) Act meaningful use provisions, many institutions are expected to adopt EHRs, but may be left with large amounts of financial and historical clinical data, which can differ significantly from data obtained from newer systems, due to lack or inconsistent use of controlled medical terminologies (CMT) in older systems. We examined different approaches for semantic enrichment of financial data with CMT, and integration of clinical data from disparate historical and current sources for research. Snapshots of financial data from 1999, 2004 and 2009 were mapped automatically to the current inpatient pharmacy catalog, and enriched with RxNorm. Administrative metadata from financial and dispensing systems, RxNorm and two commercial pharmacy vocabularies were used to integrate data from current and historical inpatient pharmacy modules, and the outpatient EHR. Data integration approaches were compared using percentages of automated matches, and effects on cohort size of a retrospective study. During 1999-2009, 71.52%-90.08% of items in use from the financial catalog were enriched using RxNorm; 64.95%-70.37% of items in use from the historical inpatient system were integrated using RxNorm, 85.96%-91.67% using a commercial vocabulary, 87.19%-94.23% using financial metadata, and 77.20%-94.68% using dispensing metadata. During 1999-2009, 48.01%-30.72% of items in use from the outpatient catalog were integrated using RxNorm, and 79.27%-48.60% using a commercial vocabulary. In a cohort of 16304 inpatients obtained from clinical systems, 4172 (25.58%) were found exclusively through integration of historical clinical data, while 15978 (98%) could be identified using semantically enriched financial data. 
Data integration using metadata from financial/dispensing systems and from pharmacy vocabularies was comparable. Given the current state of EHR adoption, semantic enrichment of financial data and integration of historical clinical data would allow the repurposing of these data for research. With the push for HITECH meaningful use, institutions that are transitioning to newer EHRs will be able to use their older financial and clinical data for research using these methods.
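The automated match percentages compared in this study reduce to a simple computation over a lookup table. The sketch below uses hypothetical item names and placeholder codes, not actual RxNorm identifiers or the study's catalogs.

```python
def match_rate(items_in_use, mapping):
    """Percent of in-use catalog items that resolve to a code
    (e.g. an RxNorm concept identifier) through the given lookup."""
    matched = sum(1 for item in items_in_use if item in mapping)
    return 100.0 * matched / len(items_in_use)

# Hypothetical inpatient pharmacy items and two candidate mappings
items = ['ASPIRIN 81MG TAB', 'WARFARIN 5MG TAB', 'CUSTOM COMPOUND #12']
rxnorm_map = {'ASPIRIN 81MG TAB': 'RX0001', 'WARFARIN 5MG TAB': 'RX0002'}
vendor_map = {'ASPIRIN 81MG TAB': 'V001', 'WARFARIN 5MG TAB': 'V002',
              'CUSTOM COMPOUND #12': 'V003'}

print(round(match_rate(items, rxnorm_map), 2))  # 66.67
print(round(match_rate(items, vendor_map), 2))  # 100.0
```

Comparing such rates across mappings (standard vocabulary vs. commercial vocabulary vs. administrative metadata) is the form of the percentages reported in the abstract.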
2016-08-09
This image shows a deployed half-scale starshade with four petals at NASA's Jet Propulsion Laboratory, Pasadena, California, in 2014. The full-scale starshade (not shown) will measure 111 feet (34 meters). The flower-like petals of the starshade are designed to diffract bright starlight away from telescopes seeking the dim light of exoplanets. The starshade was re-designed from earlier models to allow these petals to furl, or wrap around the spacecraft, for launch into space. Each petal is covered in a high-performance plastic film that resembles gold foil. On a starshade ready for launch, the thermal gold foil will only cover the side of the petals facing away from the telescope, with black on the other, so as not to reflect other light sources such as the Earth into its lens. The starshade is light enough for space but cannot support its own weight on Earth; it is shown here offloaded with counterweights, much like an elevator. Starlight-blocking technologies such as the starshade are being developed to help image exoplanets, with a focus on Earth-sized, habitable worlds. http://photojournal.jpl.nasa.gov/catalog/PIA20909
2016-08-09
This image shows the deployment of a half-scale starshade with four petals at NASA's Jet Propulsion Laboratory in Pasadena, California, in 2014. The full scale of this starshade (not shown) will measure 34 meters, or approximately 111 feet. The flower-like petals of the starshade are designed to diffract bright starlight away from telescopes seeking the dim light of exoplanets. The starshade was re-designed from earlier models to allow these petals to furl, or wrap around the spacecraft, for launch into space. Once in space, the starshade will need to expand from its tightly-packed launch shape to become large and umbrella-like, ideal for blocking starlight. Each petal is covered in a high-performance plastic film that resembles gold foil. On a starshade ready for launch, the thermal gold foil will only cover the side of the petals facing away from the telescope, with black on the other, so as not to reflect other light sources such as the Earth into its lens. Starlight-blocking technologies such as the starshade are being developed to help image exoplanets, with a focus on Earth-sized, habitable worlds. http://photojournal.jpl.nasa.gov/catalog/PIA20907
Garden City Vein Complex on Lower Mount Sharp, Mars
2015-11-11
Prominent mineral veins at the "Garden City" site examined by NASA's Curiosity Mars rover vary in thickness and brightness, as seen in this image from Curiosity's Mast Camera (Mastcam). The image covers an area roughly 2 feet (60 centimeters) across. Types of vein material evident in the area include: 1) thin, dark-toned fracture-filling material; 2) thick, dark-toned vein material in large fractures; 3) light-toned vein material, which was deposited last. Figure 1 includes annotations identifying each of those three major kinds and a scale bar indicating 10 centimeters (3.9 inches). Researchers used the Mastcam and other instruments on Curiosity in March and April 2015 to study the structure and composition of mineral veins at Garden City, for information about fluids that deposited minerals in fractured rock there. Malin Space Science Systems, San Diego, built and operates Curiosity's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, built the rover and manages the project for NASA's Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA19922
VizieR Online Data Catalog: SOFI and ISOCAM observations of Cha II (Persi+, 2003)
NASA Astrophysics Data System (ADS)
Persi, P.; Marenzi, A. R.; Gomez, M.; Olofsson, G.
2003-01-01
A region of approximately 28'x26' of Cha II, centered at RA = 13h 00min 47s, DE = -77° 06' 09" (2000), was surveyed with ISOCAM in raster mode at LW2 (5-8.5μm) (TDT N.11500619) and LW3 (12-18μm) (TDT N.11500620). All the frames were observed with a pixel field of view (PFOV) of 6", intrinsic integration time Tint=2.1s and ~15s integration time per sky position. The total integration time was 4472 s and 4474 s for LW2 and LW3, respectively. We obtained J, H, and Ks images of the central part of Cha II covering an area of 4.9'x4.9' with the SOFI near-IR camera at the ESO 3.58m New Technology Telescope (NTT) on the night of April 28, 2000 under very good seeing conditions (~0.3"). SOFI uses a 1024x1024 pixel HgCdTe array and provides a field of view of 299"x299" with a scale of 0.292"/pix. (2 data files).
National Science Foundation-sponsored workshop report. Draft plan for soybean genomics.
Stacey, Gary; Vodkin, Lila; Parrott, Wayne A; Shoemaker, Randy C
2004-05-01
Recent efforts to coordinate and define a research strategy for soybean (Glycine max) genomics began with the establishment of a Soybean Genetics Executive Committee, which will serve as a communication focal point between the soybean research community and granting agencies. Secondly, a workshop was held to define a strategy to incorporate existing tools into a framework for advancing soybean genomics research. This workshop identified and ranked research priorities essential to making more informed decisions as to how to proceed with large scale sequencing and other genomics efforts. Most critical among these was the need to finalize a physical map and to obtain a better understanding of genome microstructure. Addressing these research needs will require pilot work on new technologies to demonstrate an ability to discriminate between recently duplicated regions in the soybean genome and pilot projects to analyze an adequate amount of random genome sequence to identify and catalog common repeats. The development of additional markers, reverse genetics tools, and bioinformatics is also necessary. Successful implementation of these goals will require close coordination among various working groups.
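Identifying and cataloging common repeats from random survey sequence, as the workshop recommends, is at its simplest a k-mer counting exercise. The toy sketch below uses illustrative parameters and made-up reads, not an actual soybean pipeline.

```python
from collections import Counter

def common_kmers(reads, k=4, min_count=3):
    """Count all k-mers across sequence reads and keep those observed
    at least min_count times -- a stand-in for flagging candidate
    repeat elements in random genome survey sequence."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return {kmer: c for kmer, c in counts.items() if c >= min_count}

# Toy reads: the ACGT motif recurs across reads, everything else is rare
reads = ['ACGTACGTACGT', 'TTGCAACGTGGA']
print(common_kmers(reads))  # {'ACGT': 4}
```

Real repeat cataloging would use much larger k and abundance thresholds calibrated to sequencing depth, but the principle of distinguishing recently duplicated or repetitive regions by their excess multiplicity is the same.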
2016-01-20
Engineers for NASA's MarCO (Mars Cube One) technology demonstration inspect the MarCO test bed, which contains components that are identical to those built for a flight to Mars. Cody Colley, left, MarCO integration and test deputy, and Shannon Statham, MarCO integration and test lead, are on the team at NASA's Jet Propulsion Laboratory, Pasadena, California, preparing twin MarCO CubeSats. The briefcase-size MarCO twins were designed to ride along with NASA's next Mars lander, InSight. Its planned March 2016 launch was suspended. InSight -- an acronym for Interior Exploration using Seismic Investigations, Geodesy and Heat Transport -- will study the interior of Mars to improve understanding of the processes that formed and shaped rocky planets, including Earth. Note: After thorough examination, NASA managers have decided to suspend the planned March 2016 launch of the Interior Exploration using Seismic Investigations Geodesy and Heat Transport (InSight) mission. The decision follows unsuccessful attempts to repair a leak in a section of the prime instrument in the science payload. http://photojournal.jpl.nasa.gov/catalog/PIA20341
Rover Track in Sand Sheet Near Martian Sand Dune
2015-12-10
The rippled surface of the first Martian sand dune ever studied up close fills this view of "High Dune" from the Mast Camera (Mastcam) on NASA's Curiosity rover. This site is part of the "Bagnold Dunes" field along the northwestern flank of Mount Sharp. The dunes are active, migrating up to about one yard or meter per year. The component images of this mosaic view were taken on Nov. 27, 2015, during the 1,176th Martian day, or sol, of Curiosity's work on Mars. The scene is presented with a color adjustment that approximates white balancing, to resemble how the sand would appear under daytime lighting conditions on Earth. The annotated version includes superimposed scale bars of 30 centimeters (1 foot) in the foreground and 100 centimeters (3.3 feet) in the middle distance. Malin Space Science Systems, San Diego, built and operates Curiosity's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, built the rover and manages the project for NASA's Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA20169
High Dune is First Martian Dune Studied up Close
2015-12-10
The rippled surface of the first Martian sand dune ever studied up close fills this view of "High Dune" from the Mast Camera (Mastcam) on NASA's Curiosity rover. This site is part of the "Bagnold Dunes" field along the northwestern flank of Mount Sharp. The dunes are active, migrating up to about one meter (one yard) per year. The component images of this mosaic view were taken on Nov. 27, 2015, during the 1,176th Martian day, or sol, of Curiosity's work on Mars. The scene is presented with a color adjustment that approximates white balancing, to resemble how the sand would appear under daytime lighting conditions on Earth. The annotated version includes superimposed scale bars of 30 centimeters (1 foot) in the foreground and 100 centimeters (3.3 feet) in the middle distance. Malin Space Science Systems, San Diego, built and operates Curiosity's Mastcam. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, built the rover and manages the project for NASA's Science Mission Directorate, Washington. http://photojournal.jpl.nasa.gov/catalog/PIA20168
NASA Astrophysics Data System (ADS)
Ramírez-Rojas, Alejandro; Telesca, Luciano; Lovallo, Michele; Flores, Leticia
2015-04-01
By using the method of the visibility graph (VG), five magnitude time series extracted from the seismic catalog of the Mexican subduction zone were investigated. The five seismic sequences represent the seismicity which occurred between 2005 and 2012 in five seismic areas: Guerrero, Chiapas, Oaxaca, Jalisco and Michoacan. Among the five seismic sequences, the Jalisco sequence shows VG properties significantly different from those shown by the other four. Such a difference could be inherent in the different tectonic settings of Jalisco with respect to those characterizing the other four areas. The VG properties of the seismic sequences have been related to more typical seismological characteristics (the b-value and a-value of the Gutenberg-Richter law). The present study was supported by the Bilateral Project Italy-Mexico "Experimental Stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
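The natural visibility graph construction used above maps each time-series sample to a node and links two samples when no intermediate sample blocks the straight line of sight between them. A minimal sketch of that rule (illustrative only, not the authors' code; function names are mine):

```python
def visibility_graph(series):
    """Build the natural visibility graph of a time series.

    Nodes are sample indices; nodes a < b are linked when every
    intermediate sample c satisfies
        y_c < y_b + (y_a - y_b) * (b - c) / (b - a),
    i.e. it lies strictly below the line of sight between a and b.
    """
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            visible = all(
                series[c] < yb + (ya - yb) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges


def degrees(series):
    """Degree of each node, the basic VG property compared across sequences."""
    edges = visibility_graph(series)
    deg = [0] * len(series)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg
```

In VG studies of seismicity, the degree sequence (and quantities derived from it) is what gets compared with the Gutenberg-Richter b- and a-values.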
Transformation of OODT CAS to Perform Larger Tasks
NASA Technical Reports Server (NTRS)
Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean
2008-01-01
A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
Optimization the composition of sand-lime products modified of diabase aggregate
NASA Astrophysics Data System (ADS)
Komisarczyk, K.; Stępień, A.
2017-10-01
Optimizing the composition of building materials is currently of great importance, given increasing competitiveness and technological development in the construction industry. This also applies to sand-lime products. The appropriate arrangement of the individual components, or their equivalents, and their relation to the main parameters of the mixture composition (i.e., the lime/sand/water ratio) should lead to the intended result. Introducing diabase aggregate into sand-lime products has a positive effect on the final products. The paper presents the results of optimization with the addition of diabase aggregate. The amount of water was held constant, while the mass of the dry ingredients was varied. The experimental program comprised six series of silicates made under industrial conditions. The final samples were tested for mechanical and physico-chemical properties, and the analysis was extended with mercury intrusion porosimetry, SEM, and XRD. The results show differences depending on the aggregate content: the sample with 10% diabase aggregate had higher compressive strength than the reference sample, while the modified samples absorbed less water.
2002-12-21
Kennedy Space Center, Florida. - Deep Space 1 is lifted from its work platform, giving a closeup view of the experimental solar-powered ion propulsion engine. The ion propulsion engine is the first non-chemical propulsion system used as the primary means of propelling a spacecraft. The first flight in NASA's New Millennium Program, Deep Space 1 is designed to validate 12 new technologies for scientific space missions of the next century. Another onboard experiment includes software that tracks celestial bodies so the spacecraft can make its own navigation decisions without the intervention of ground controllers. Deep Space 1 will complete most of its mission objectives within the first two months, but may also do a flyby of a near-Earth asteroid, 1992 KD, in July 1999. Deep Space 1 will be launched aboard a Boeing Delta 7326 rocket from Launch Pad 17A, Cape Canaveral Air Station, in October. Delta II rockets are medium capacity expendable launch vehicles derived from the Delta family of rockets built and launched since 1960. Since then there have been more than 245 Delta launches. http://photojournal.jpl.nasa.gov/catalog/PIA04232
Test Rover at JPL During Preparation for Mars Rover Low-Angle Selfie
2015-08-19
This view of a test rover at NASA's Jet Propulsion Laboratory, Pasadena, California, results from advance testing of arm positions and camera pointings for taking a low-angle self-portrait of NASA's Curiosity Mars rover. This rehearsal in California led to a dramatic Aug. 5, 2015, selfie of Curiosity, online at PIA19807. Curiosity's arm-mounted Mars Hand Lens Imager (MAHLI) camera took 92 component images that were assembled into that mosaic. The rover team positioned the camera lower in relation to the rover body than for any previous full self-portrait of Curiosity. This practice version was taken at JPL's Mars Yard in July 2013, using the Vehicle System Test Bed (VSTB) rover, which has a test copy of MAHLI on its robotic arm. MAHLI was built by Malin Space Science Systems, San Diego. JPL, a division of the California Institute of Technology in Pasadena, manages the Mars Science Laboratory Project for the NASA Science Mission Directorate, Washington. JPL designed and built the project's Curiosity rover. http://photojournal.jpl.nasa.gov/catalog/PIA19810
2MASS Photometry of the Hot DA White Dwarf Stars in the Palomar Green Survey
NASA Astrophysics Data System (ADS)
Holberg, J. B.; Magargal, K.
2003-12-01
The Palomar Green (PG) Survey is a complete, magnitude limited survey of UV excess objects that continues to provide well-defined sample populations for many types of objects, in particular hot white dwarf stars. The 2MASS All-Sky Survey limiting JHK magnitudes are reasonably well matched to the B magnitude limits of the PG survey. The 2MASS survey, therefore, constitutes an excellent source of uniform, high-quality photometry that can be used in conjunction with the PG Survey. The 2MASS Point Source Catalog in the All-Sky Data Release was searched for over 340 hot DA white dwarfs in the PG sample. The resulting JHK colors and apparent magnitudes are used to determine photometric distances for these stars and to place limits on the existence of possible cool binary companions. This publication makes use of data products from the Two Micron All Sky Survey, which is a joint project of the University of Massachusetts and the Infrared Processing and Analysis Center/California Institute of Technology, funded by the National Aeronautics and Space Administration and the National Science Foundation.
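Photometric distances of the kind derived here follow from the standard distance modulus relating apparent and absolute magnitude. A minimal sketch (the function name is mine, and the magnitudes in the test are illustrative, not PG survey values):

```python
def photometric_distance_pc(apparent_mag, absolute_mag):
    """Distance in parsecs from the distance modulus:
        m - M = 5 * log10(d / 10 pc)  =>  d = 10 ** ((m - M + 5) / 5)
    Assumes the absolute magnitude M is known, e.g. from an
    atmospheric model fit for a white dwarf.
    """
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)
```

For example, a star with m - M = 5 lies at 100 pc; comparing distances derived independently from J, H, and K is one way an unresolved cool companion (which adds infrared flux) can reveal itself.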
Evaluating non-relational storage technology for HEP metadata and meta-data catalog
NASA Astrophysics Data System (ADS)
Grigorieva, M. A.; Golosova, M. V.; Gubin, M. Y.; Klimentov, A. A.; Osipova, V. V.; Ryabinkin, E. A.
2016-10-01
Large-scale scientific experiments produce vast volumes of data. These data are stored, processed and analyzed in a distributed computing environment. The life cycle of an experiment is managed by specialized software such as Distributed Data Management and Workload Management Systems. In order to be interpreted and mined, experimental data must be accompanied by auxiliary metadata, which are recorded at each data processing step. Metadata describe scientific data and represent scientific objects or results of scientific experiments, allowing them to be shared by various applications, recorded in databases or published on the Web. Processing and analysis of the constantly growing volume of auxiliary metadata is a challenging task, no simpler than the management and processing of the experimental data itself. Furthermore, metadata sources are often loosely coupled and may expose end users to inconsistencies in combined information queries. To aggregate and synthesize a range of primary metadata sources, and to enhance them with flexible, schema-less addition of aggregated data, we are developing the Data Knowledge Base architecture serving as the intelligence behind GUIs and APIs.
Rick, Cathy; Kearns, Martha A; Thompson, Nancy A
2003-01-01
The health care network and hospital system within the Department of Veterans Affairs (VA), the Veterans Health Administration (VHA), provides employment to more than 56,000 nursing personnel and serves as clinical education site to countless other nursing and health professional students. Nurse administrators and educators are posed with the challenge of providing an environment in which each nurse is able to gain needed knowledge, learn new skills, and share and communicate this knowledge with other colleagues. The education of nurses improves the health status of veterans while also realizing individual professional enhancement. Regional and cultural diversity of the system present challenges to education, in both delivery and content. VHA's learning organizations, the Employee Education System and the Office of Special Projects, have maximized new technologies and information systems to provide innovative, virtual education opportunities, capitalizing on the benefits of informal and formal learning, thus moving VHA to the forefront in knowledge sharing and dissemination. The Virtual Learning Center, VA Knowledge Network, Learning Catalog, and VA Learning Online provide VHA's nurses with interactive, desktop virtual learning opportunities.
Omics Approaches To Probe Microbiota and Drug Metabolism Interactions.
Nichols, Robert G; Hume, Nicole E; Smith, Philip B; Peters, Jeffrey M; Patterson, Andrew D
2016-12-19
The drug metabolism field has long recognized the beneficial and sometimes deleterious influence of microbiota in the absorption, distribution, metabolism, and excretion of drugs. Early pioneering work with the sulfanilamide precursor prontosil pointed toward the necessity not only to better understand the metabolic capabilities of the microbiota but also, importantly, to identify the specific microbiota involved in the generation and metabolism of drugs. However, technological limitations important for cataloging the microbiota community as well as for understanding and/or predicting their metabolic capabilities hindered progress. Current advances including mass spectrometry-based metabolite profiling as well as culture-independent sequence-based identification and functional analysis of microbiota have begun to shed light on microbial metabolism. In this review, case studies will be presented to highlight key aspects (e.g., microbiota identification, metabolic function and prediction, metabolite identification, and profiling) that have helped to clarify how the microbiota might impact or be impacted by drug metabolism. Lastly, a perspective of the future of this field is presented that takes into account what important knowledge is lacking and how to tackle these problems.
Novel hermetic packaging methods for MOEMS
NASA Astrophysics Data System (ADS)
Stark, David
2003-01-01
Hermetic packaging of micro-optoelectromechanical systems (MOEMS) is an immature technology, lacking industry-consensus methods and standards. Off-the-shelf, catalog window assemblies are not yet available. Window assemblies are in general custom designed and manufactured for each new product, resulting in longer than acceptable cycle times, high procurement costs and questionable reliability. There are currently two dominant window-manufacturing methods wherein a metal frame is attached to glass, as well as a third, less-used method. The first method creates a glass-to-metal seal by heating the glass above its Tg to fuse it to the frame. The second method involves first metallizing the glass where it is to be attached to the frame, and then soldering the glass to the frame. The third method employs solder-glass to bond the glass to the frame. A novel alternative with superior features compared to the three previously described window-manufacturing methods is proposed. The new approach lends itself to a plurality of glass-to-metal attachment techniques. Benefits include lower temperature processing than two of the current methods and potentially more cost-effective manufacturing than all three of today's attachment methods.
Rapid Exploitation and Analysis of Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buttler, D J; Andrzejewski, D; Stevens, K D
Analysts are overwhelmed with information. They have large archives of historical data, both structured and unstructured, and continuous streams of relevant messages and documents that they need to match to current tasks, digest, and incorporate into their analysis. The purpose of the READ project is to develop technologies to make it easier to catalog, classify, and locate relevant information. We approached this task from multiple angles. First, we tackle the issue of processing large quantities of information in reasonable time. Second, we provide mechanisms that allow users to customize their queries based on latent topics exposed from corpus statistics. Third, we assist users in organizing query results, adding localized expert structure over results. Fourth, we use word sense disambiguation techniques to increase the precision of matching user-generated keyword lists with terms and concepts in the corpus. Fifth, we enhance co-occurrence statistics with latent topic attribution, to aid entity relationship discovery. Finally, we quantitatively analyze the quality of three popular latent modeling techniques to examine under which circumstances each is useful.
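The co-occurrence statistics the abstract mentions as the raw material for entity relationship discovery can be illustrated with a minimal sliding-window pair counter (an illustrative sketch, not the READ project's code; the function name and window convention are my own):

```python
from collections import Counter


def cooccurrence_counts(tokens, window=5):
    """Count unordered term pairs occurring within a sliding window.

    For each position i, every later token inside the window
    tokens[i+1 : i+window] is paired with tokens[i]. Pairs are stored
    sorted so (a, b) and (b, a) accumulate in one counter cell.
    """
    counts = Counter()
    for i, left in enumerate(tokens):
        for right in tokens[i + 1 : i + window]:
            if left != right:
                counts[tuple(sorted((left, right)))] += 1
    return counts
```

Raw counts like these are what latent topic attribution then refines, by weighting a pair differently depending on which topics the surrounding context belongs to.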
Fiber-Optic Network Observations of Earthquake Wavefields
NASA Astrophysics Data System (ADS)
Lindsey, Nathaniel J.; Martin, Eileen R.; Dreger, Douglas S.; Freifeld, Barry; Cole, Stephen; James, Stephanie R.; Biondi, Biondo L.; Ajo-Franklin, Jonathan B.
2017-12-01
Our understanding of subsurface processes suffers from a profound observation bias: seismometers are sparse and clustered on continents. A new seismic recording approach, distributed acoustic sensing (DAS), transforms telecommunication fiber-optic cables into sensor arrays enabling meter-scale recording over tens of kilometers of linear fiber length. We analyze cataloged earthquake observations from three DAS arrays with different horizontal geometries to demonstrate some possibilities using this technology. In Fairbanks, Alaska, we find that stacking ground motion records along 20 m of fiber yields a waveform that shows a high degree of correlation in amplitude and phase with a colocated inertial seismometer record at 0.8-1.6 Hz. Using an L-shaped DAS array in Northern California, we record the nearly vertically incident arrival of an earthquake from The Geysers Geothermal Field and estimate its backazimuth and slowness via beamforming for different phases of the seismic wavefield. Lastly, we install a fiber in existing telecommunications conduits below Stanford University and show that little cable-to-soil coupling is required for teleseismic P and S phase arrival detection.
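The channel-stacking step described for the Fairbanks array amounts to averaging the records of colocated fiber channels sample by sample; incoherent noise tends to average down while the common earthquake signal survives. A minimal sketch (the function name is mine, not the authors'):

```python
def stack_traces(traces):
    """Average several DAS channel recordings sample by sample.

    `traces` is a list of equal-length sample sequences, one per
    channel along the stacked fiber segment. Stacking N channels
    suppresses incoherent noise by roughly sqrt(N), which is what
    makes the stacked waveform comparable to a colocated inertial
    seismometer record.
    """
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]
```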
The Visibility of Earth Transits
NASA Technical Reports Server (NTRS)
Castellano, Tim; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
The recent detection of planetary transits of the solar-like star HD 209458 at a distance of 47 parsecs suggests that transits can reveal the presence of Jupiter-size planetary companions in the solar neighborhood. Recent space-based transit searches have achieved photometric precision within an order of magnitude of that required to detect the much smaller transit signal of an earth-size planet around a solar-size star. Laboratory experiments in the presence of realistic noise sources have shown that CCDs can achieve photometric precision adequate to detect the 9.6E-5 dimming of the Sun due to a transit of the Earth. Space-based solar irradiance monitoring has shown that the intrinsic variability of the Sun would not preclude such a detection. Transits of the Sun by the Earth would be detectable by observers that reside within a narrow band of sky positions near the ecliptic plane, if the observers possess current Earth epoch levels of technology and astronomical expertise. A catalog of candidate target stars, their properties, and simulations of the photometric Earth transit signal detectability at each target is presented.
The Visibility of Earth Transits
NASA Technical Reports Server (NTRS)
Castellano, Timothy P.; Doyle, Laurance; McIntosh, Dawn; DeVincenzi, Donald (Technical Monitor)
2000-01-01
The recent photometric detection of planetary transits of the solar-like star HD 209458 at a distance of 47 parsecs suggests that transits can reveal the presence of Jupiter-size planetary companions in the solar neighborhood. Recent space-based transit searches have achieved photometric precision within an order of magnitude of that required to detect the much smaller transit signal of an earth-size planet across a solar-size star. Laboratory experiments in the presence of realistic noise sources have shown that CCDs can achieve photometric precision adequate to detect the 9.6E-5 dimming of the Sun due to a transit of the Earth. Space-based solar irradiance monitoring has shown that the intrinsic variability of the Sun would not preclude such a detection. Transits of the Sun by the Earth would be detectable by observers that reside within a narrow band of sky positions near the ecliptic plane, if the observers possess current Earth epoch levels of technology and astronomical expertise. A catalog of solar-like stars that satisfy the geometric condition for Earth transit visibility is presented.
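The order of magnitude of the quoted dimming can be checked from the ratio of disk areas. A quick sketch (constants approximate; note the bare geometric ratio comes out near 8.4E-5, so the abstracts' 9.6E-5 figure presumably folds in effects beyond this first-order estimate):

```python
R_EARTH_KM = 6371.0    # mean Earth radius, km (approximate)
R_SUN_KM = 695700.0    # nominal solar radius, km (approximate)


def transit_depth(r_planet_km, r_star_km):
    """Fractional dimming when a planet crosses its star's disk:
        depth = (R_planet / R_star) ** 2   (ratio of disk areas)
    Ignores limb darkening and grazing geometry.
    """
    return (r_planet_km / r_star_km) ** 2
```

Either way, the signal is roughly one part in ten thousand, which sets the photometric precision requirement discussed above.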
NASA Astrophysics Data System (ADS)
Bruns, Donald
2016-05-01
In 1919, astronomers performed an experiment during a solar eclipse, attempting to measure the deflection of stars near the sun, in order to verify Einstein's theory of general relativity. The experiment was very difficult and the results were marginal, but the success made Albert Einstein famous around the world. Astronomers last repeated the experiment in 1973, achieving an error of 11%. In 2017, using amateur equipment and modern technology, I plan to repeat the experiment and achieve a 1% error. The best available star catalog will be used for star positions. Corrections for optical distortion and atmospheric refraction are better than 0.01 arcsec. During totality, I expect 7 or 8 measurable stars down to magnitude 9.5, based on analysis of previous eclipse measurements taken by amateurs. Reference images, taken near the sun during totality, will be used for precise calibration. Preliminary test runs performed during twilight in April 2016 and April 2017 can accurately simulate the sky conditions during totality, providing an accurate estimate of the final uncertainty.
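The quantity being remeasured is the general-relativistic deflection of starlight, whose first-order value for a ray passing the Sun at impact parameter b is alpha = 4GM/(c^2 b), about 1.75 arcsec at the solar limb. A small sketch with approximate constants (names and values are mine):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (approximate)
M_SUN = 1.989e30     # solar mass, kg (approximate)
C = 2.998e8          # speed of light, m/s (approximate)
R_SUN = 6.957e8      # solar radius, m (approximate)
ARCSEC_PER_RAD = 180.0 * 3600.0 / math.pi


def deflection_arcsec(impact_radius_m):
    """GR deflection of starlight, alpha = 4GM / (c^2 b), in arcseconds,
    for a ray passing the Sun at impact parameter b. Stars farther from
    the limb show proportionally smaller deflections, which is why the
    measurement needs sub-0.1-arcsec astrometry."""
    return 4 * G * M_SUN / (C ** 2 * impact_radius_m) * ARCSEC_PER_RAD
```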
12th National Cataloguing Conference.
ERIC Educational Resources Information Center
Tan, Janine; Olston, Julie; Dearman, Rosemary; Hay, Ros; Butler, Gabrielle; Giopoulos, Jenny; Moloney, Julie; Pearce, Fran
1997-01-01
Summarizes issues raised at the 1997 national cataloging conference of the Australian Library and Information Association. Includes a draft procedural document for cataloging Internet sites and provides reports from five workshops on human resource management in cataloging, career strategies for catalogers, cataloging standards, the Anglo-American…
Code of Federal Regulations, 2010 CFR
2010-07-01
... within the civil agencies. (2) Each item included in the Federal Catalog System shall be classified under... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... and maintained in the Federal Catalog System as prescribed in the Federal Catalog System Policy Manual...
Code of Federal Regulations, 2014 CFR
2014-07-01
... within the civil agencies. (2) Each item included in the Federal Catalog System shall be classified under... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... and maintained in the Federal Catalog System as prescribed in the Federal Catalog System Policy Manual...
Code of Federal Regulations, 2011 CFR
2011-07-01
... within the civil agencies. (2) Each item included in the Federal Catalog System shall be classified under... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... and maintained in the Federal Catalog System as prescribed in the Federal Catalog System Policy Manual...
Code of Federal Regulations, 2012 CFR
2012-07-01
... within the civil agencies. (2) Each item included in the Federal Catalog System shall be classified under... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... and maintained in the Federal Catalog System as prescribed in the Federal Catalog System Policy Manual...
Code of Federal Regulations, 2013 CFR
2013-07-01
... within the civil agencies. (2) Each item included in the Federal Catalog System shall be classified under... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.2-Cataloging... and maintained in the Federal Catalog System as prescribed in the Federal Catalog System Policy Manual...
OAST system technology planning
NASA Technical Reports Server (NTRS)
Sadin, S. R.
1978-01-01
The NASA Office of Aeronautics and Space Technology developed a planning model for space technology consisting of a space systems technology model, technology forecasts and technology surveys. The technology model describes candidate space missions through the year 2000 and identifies their technology requirements. The technology surveys and technology forecasts provide, respectively, data on the current status and estimates of the projected status of relevant technologies. These tools are used to further the understanding of the activities and resources required to ensure the timely development of technological capabilities. Technology forecasting in the areas of information systems, spacecraft systems, transportation systems, and power systems is discussed.
ERIC Educational Resources Information Center
Pratt, Virginia; And Others
Faced with an old and increasingly complex card catalog and with proposed changes in the Library of Congress (LC) cataloging system, the Subcommittee on the Future of the Catalogs at the University of California, Berkeley (UCB) Library considered methods for dealing with these joint problems of catalog renovation and…
What is technology? A study of fifth and eighth grade student ideas about the Nature of Technology
NASA Astrophysics Data System (ADS)
Digironimo, Nicole
Most, if not all, standards for science and technology education and curriculum indicate that knowledge of the Nature of Technology is an educational goal, yet the literature lacks an established definition for the Nature of Technology. Additionally, the research on student ideas about the Nature of Technology is insufficient. After reviewing the literature on science and technology education, the philosophy of technology, and the history of technology, this study presents an internally consistent definition for the Nature of Technology. This definition illustrates how the Nature of Technology includes five dimensions: Technology as Artifacts; Technology as a Creation Process; Technology as a Human Practice; The History of Technology; and The Current Role of Technology. Using an interview protocol developed for this study, a small group of 5th and 8th grade students were interviewed to ascertain their ideas about the Nature of Technology. The results indicate that there are a variety of ideas present in the thinking of young people. All of the participants expressed one of two ideas about technological artifacts: technological artifacts are electronic or technological artifacts are invented. All of the participants identified particular skills needed to invent technological artifacts; some of the participants included mobility and concluded that disabled people cannot be inventors. Despite their experiences with technological artifacts (including educational technology tools), a few of the participants were uncertain whether they would identify themselves as technological. More than half of the participants did not believe older artifacts can still be considered technology. Most of the participants were apprehensive about our technological future; the main issue expressed by the participants was the environment. Other than environmental concerns, most of the participants were unable to identify global issues regarding technological use and development. 
Overall, these findings increase our knowledge of the ideas young people have about the Nature of Technology, which can inform future research on teaching and learning about science and technology.
41 CFR 101-30.603 - GSA Supply Catalog.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.6-GSA Section of the Federal Supply Catalog § 101-30.603 GSA Supply Catalog. (a) The GSA Supply... GSA Supply Catalog contains all necessary information for ordering from the GSA Federal Supply Service...
41 CFR 101-30.603 - GSA Supply Catalog.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.6-GSA Section of the Federal Supply Catalog § 101-30.603 GSA Supply Catalog. (a) The GSA Supply... GSA Supply Catalog contains all necessary information for ordering from the GSA Federal Supply Service...
41 CFR 101-30.603 - GSA Supply Catalog.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.6-GSA Section of the Federal Supply Catalog § 101-30.603 GSA Supply Catalog. (a) The GSA Supply... GSA Supply Catalog contains all necessary information for ordering from the GSA Federal Supply Service...
41 CFR 101-30.603 - GSA Supply Catalog.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.6-GSA Section of the Federal Supply Catalog § 101-30.603 GSA Supply Catalog. (a) The GSA Supply... GSA Supply Catalog contains all necessary information for ordering from the GSA Federal Supply Service...
41 CFR 101-30.603 - GSA Supply Catalog.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.6-GSA Section of the Federal Supply Catalog § 101-30.603 GSA Supply Catalog. (a) The GSA Supply... GSA Supply Catalog contains all necessary information for ordering from the GSA Federal Supply Service...
The Online Catalog: Issues in Planning and Development.
ERIC Educational Resources Information Center
Richards, Timothy F.
1984-01-01
Discusses key issues to be addressed in planning for introduction of online public access catalog in academic research library environment. Purpose of catalog, reasons to adopt catalog, user behavior, use of catalog records, authority control, shared or unique systems, and impact on staff are highlighted. Seventy-three sources are cited. (EJS)
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2012 CFR
2012-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2011 CFR
2011-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2010 CFR
2010-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2013 CFR
2013-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2014 CFR
2014-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
LANDSAT: Non-US standard catalog no. N-33
NASA Technical Reports Server (NTRS)
1975-01-01
A catalog used for dissemination of information regarding the availability of LANDSAT imagery is presented. The Image Processing Facility of the Goddard Space Flight Center publishes a U.S. and a Non-U.S. Standard Catalog on a monthly schedule, and the catalogs identify imagery which has been processed and input to the data files during the referenced month. The U.S. Standard Catalog includes imagery covering the continental United States, Alaska and Hawaii; the Non-U.S. Catalog identifies all the remaining coverage. Imagery adjacent to the continental U.S. and Alaska borders is included in the U.S. Standard Catalog.
Small satellite debris catalog maintenance issues
NASA Technical Reports Server (NTRS)
Jackson, Phoebe A.
1991-01-01
The United States Space Command (USSPACECOM) is a unified command of the Department of Defense, and one of its tasks is to detect, track, identify, and maintain a catalog of all man-made objects in Earth orbit. This task is called space surveillance, and the most important tool for space surveillance is the satellite catalog. The command's reasons for performing satellite catalog maintenance are presented. A satellite catalog is described, and small satellite-debris catalog-maintenance issues are identified. The underlying rationale is to describe the catalog maintenance services so that the members of the community can use them with assurance.
Bolef, D
1975-01-01
After ten years of experimentation in computer-assisted cataloging, the Washington University School of Medicine Library has decided to join the Ohio College Library Center network. The history of the library's work preceding this decision is reviewed. The data processing equipment and computers that have permitted librarians to explore different ways of presenting cataloging information are discussed. Certain cataloging processes are facilitated by computer manipulation and printouts, but the intellectual cataloging processes such as descriptive and subject cataloging are not. Networks and shared bibliographic data bases show promise of eliminating the intellectual cataloging for one book by more than one cataloger. It is in this area that future developments can be expected. PMID:1148442
SKYMAP system description: Star catalog data base generation and utilization
NASA Technical Reports Server (NTRS)
Gottlieb, D. M.
1979-01-01
The specifications, design, software description, and use of the SKYMAP star catalog system are detailed. The SKYMAP system was developed to provide an accurate and complete catalog of all stars with blue or visual magnitudes brighter than 9.0 for use by attitude determination programs. Because of the large number of stars which are brighter than 9.0 magnitude, efficient techniques of manipulating and accessing the data were required. These techniques of staged distillation of data from a Master Catalog to a Core Catalog, and direct access of overlapping zone catalogs, form the basis of the SKYMAP system. The collection and transformation of data required to produce the Master Catalog data base is described. The data flow through the main programs and levels of star catalogs is detailed. The mathematical and logical techniques for each program and the format of all catalogs are documented.
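The zone-access idea described in this abstract — distill the catalog, then read only the declination zones that overlap a query — can be sketched as follows. The zone height, record layout, and box-shaped search cut are illustrative assumptions, not SKYMAP's actual format.

```python
def build_zone_catalog(stars, zone_height_deg=10.0):
    """Group stars into fixed-height declination zones so a position
    query touches only the zones near the target instead of the whole
    catalog. Each star is a (ra_deg, dec_deg, magnitude) tuple."""
    zones = {}
    for ra, dec, mag in stars:
        zone = int((dec + 90.0) // zone_height_deg)
        zones.setdefault(zone, []).append((ra, dec, mag))
    return zones

def stars_near(zones, ra, dec, radius_deg, zone_height_deg=10.0):
    """Direct access: scan only the zones overlapping the search band,
    then apply a simple box cut in RA and declination."""
    lo = int((dec - radius_deg + 90.0) // zone_height_deg)
    hi = int((dec + radius_deg + 90.0) // zone_height_deg)
    hits = []
    for z in range(lo, hi + 1):
        for sra, sdec, mag in zones.get(z, []):
            if abs(sra - ra) <= radius_deg and abs(sdec - dec) <= radius_deg:
                hits.append((sra, sdec, mag))
    return hits

catalog = build_zone_catalog([(10.0, 5.0, 8.1), (11.0, 6.0, 7.5),
                              (200.0, -40.0, 8.9)])
print(stars_near(catalog, 10.5, 5.5, 2.0))
```

A real star catalog would also handle RA wraparound at 0h/24h and use spherical rather than box distances; the sketch only shows why zoned direct access avoids scanning every record.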
A description of the catalog division project at the College of Physicians of Philadelphia Library.
Caspari, S B; Batty, E L
1975-01-01
This paper describes the procedures used at the Library of the College of Physicians of Philadelphia to divide its ninety-year-old dictionary card catalog. The division was necessitated by overcrowding, obsolete subject headings, and the lack of a complete authority list, which resulted in similar materials being scattered throughout the catalog under several headings. Two catalogs were created: the historical-biographical catalog, representing all works published before 1950 and all works of a historical or biographical nature; and the current catalog, containing all works published from 1950 on, excepting historical or biographical materials. The current catalog was further divided into name and subject catalogs, and the subject section was revised according to MeSH. The project was completed in about two years. As a result, searching time has been much reduced, and the library is able to take advantage of the annual revisions of MeSH to update the subject catalog. PMID:1173786
A Catalog of Photometric Redshift and the Distribution of Broad Galaxy Morphologies
NASA Astrophysics Data System (ADS)
Paul, Nicholas; Virag, Nicholas; Shamir, Lior
2018-06-01
We created a catalog of photometric redshift of ~3,000,000 SDSS galaxies annotated by their broad morphology. The photometric redshift was optimized by testing and comparing several pattern recognition algorithms and variable selection strategies, trained and tested on a subset of the galaxies in the catalog that had spectra. The galaxies in the catalog have i magnitude brighter than 18 and Petrosian radius greater than 5.5''. The majority of these objects are not included in previous SDSS photometric redshift catalogs such as the photoz table of SDSS DR12. Analysis of the catalog shows that the number of galaxies in the catalog that are visually spiral increases until redshift of ~0.085, where it peaks and starts to decrease. It also shows that the number of spiral galaxies compared to elliptical galaxies drops as the redshift increases. The catalog is publicly available at https://figshare.com/articles/Morphology_and_photometric_redshift_catalog/4833593
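The workflow this abstract describes — train a pattern-recognition model on the subset of galaxies that have spectra, then predict redshift from photometric features for the rest — can be illustrated with a toy nearest-neighbour regressor. The features, training values, and choice of k here are invented for illustration; the catalog's actual optimized algorithms and variable selection are not reproduced.

```python
import math

def knn_photo_z(train, query, k=3):
    """Toy nearest-neighbour photometric-redshift estimator.
    train: list of (features, spec_z) pairs, where features are e.g.
    magnitudes and colors. Predicts z for `query` as the mean
    spectroscopic redshift of its k nearest neighbours."""
    dists = sorted((math.dist(feat, query), z) for feat, z in train)
    nearest = dists[:k]
    return sum(z for _, z in nearest) / len(nearest)

# Invented training set: (g-r color, r magnitude) -> spectroscopic z
train = [((0.4, 16.0), 0.02), ((0.6, 17.0), 0.05),
         ((0.9, 17.8), 0.09), ((1.1, 18.0), 0.12)]
z_est = knn_photo_z(train, (0.95, 17.9), k=2)
print(round(z_est, 3))
```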
Cho, Yongrae; Kim, Minsung
2014-01-01
The volatility and uncertainty in the process of technological development are growing faster than ever due to rapid technological innovation. Such phenomena result in integration among disparate technology fields. At this point, it is a critical research issue to understand the different roles and the propensity of each element technology for technological convergence. In particular, the network-based approach provides a holistic view of technological linkage structures. Furthermore, the development of new indicators based on network visualization can reveal the dynamic patterns among disparate technologies in the process of technological convergence and provide insights for future technological developments. This research attempts to analyze and discover patterns in the international patent classification codes of the United States Patent and Trademark Office's patent data in printed electronics, a representative technology in the technological convergence process. To this end, we apply ideas from physics as a new methodological approach to interpreting technological convergence. More specifically, the concepts of entropy and gravity are applied to measure the activities among patent citations and the binding forces among heterogeneous technologies during technological convergence. By applying the entropy and gravity indexes, we could distinguish the characteristic role of each technology in printed electronics. At the technological convergence stage, each technology exhibits idiosyncratic dynamics which tend to decrease technological differences and heterogeneity. Furthermore, through nonlinear regression analysis, we have found decreasing patterns of disparity over the total period studied in the evolution of technological convergence. This research has discovered the specific role of each element technology field and has consequently identified the co-evolutionary patterns of technological convergence.
These new findings on the evolutionary patterns of technological convergence provide some implications for engineering and technology foresight research, as well as for corporate strategy and technology policy.
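The abstract above names entropy and gravity as its borrowed physical concepts but does not give the exact index definitions, so the sketch below is generic: a standard Shannon entropy over the technology classes a patent cites, and a gravity-style binding force between two fields. The field codes, sizes, and distance measure are illustrative assumptions.

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Shannon entropy (bits) of a distribution of class labels, e.g. the
    IPC codes cited by one patent's references. Higher entropy means the
    citations spread across more heterogeneous technology fields."""
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def gravity(size_a, size_b, distance):
    """Gravity-style binding force between two technology fields:
    proportional to the product of their sizes (e.g. patent counts) and
    inversely proportional to the squared distance between them (e.g. a
    dissimilarity measure over citation patterns)."""
    return size_a * size_b / distance ** 2

print(shannon_entropy(["H05K", "H05K", "H01L", "G06F"]))  # mixed fields
```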
NASA Astrophysics Data System (ADS)
Yerdelen-Damar, Sevda; Boz, Yezdan; Aydın-Günbatar, Sevgi
2017-08-01
This study examined the relations of preservice science teachers' attitudes towards technology use, technology ownership, technology competencies, and experiences to their self-efficacy beliefs about technological pedagogical content knowledge (TPACK). The present study also investigated interrelations among preservice teachers' attitudes towards technology use, technology ownership, technology competencies, and experiences. The participants of the study were 665 elementary preservice science teachers (467 females, 198 males) from 7 colleges in Turkey. The proposed model, based on the educational technology literature, was tested using structural equation modeling. The model testing results revealed that preservice teachers' technology competencies and experiences mediated the relation of technology ownership to their TPACK self-efficacy beliefs. The direct relation of their possession of technology to their TPACK self-efficacy beliefs was insignificant, while the indirect relation through their technology competencies and experiences was significant. The results also indicated there were significant direct effects of preservice teachers' attitudes towards technology use, technology competencies, and experiences on their TPACK self-efficacy beliefs.
The next generation balloon-borne large aperture submillimeter telescope (BLAST-TNG)
NASA Astrophysics Data System (ADS)
Dober, Bradley Jerald
Large areas of astrophysics, such as precision cosmology, have benefited greatly from large maps and datasets, yielded by telescopes of ever-increasing number and ability. However, due to the unique challenges posed by submillimeter polarimetry, the study of molecular cloud dynamics and star formation remains stunted. Previously, polarimetry data were limited to a few vectors on only the brightest areas of molecular clouds. This made drawing statistically-driven conclusions a daunting task. However, the successful flight of the Balloon-borne Large Aperture Submillimeter Telescope for Polarimetry (BLASTPol) generated maps with thousands of independent polarization measurements of molecular clouds, and ushered in a new era of empirical modeling of molecular cloud dynamics. Once the potential benefits of large-scale maps of magnetic fields in molecular clouds had been identified, a successor that could truly unlock those secrets was needed. The Next Generation Balloon-borne Large Aperture Submillimeter Telescope (BLAST-TNG), the successor to BLASTPol, has the ability to make larger and more detailed maps of magnetic fields in molecular clouds. It will push the field of star formation into a statistics-driven, empirical realm. With these large, detailed datasets, astronomers will be able to find new relationships between the dust dynamics and the magnetic fields. The field will surge to a new level of understanding. One of the key enabling technologies of BLAST-TNG is its three arrays of polarization-sensitive Microwave Kinetic Inductance Detectors (MKIDs). MKIDs are superconducting RLC circuits with a resonant frequency that shifts proportionally to the amount of incident radiation. The key feature of MKIDs is that thousands of detectors, each with its own unique resonant frequency, can be coupled to the same readout line.
This technology will be able to drive the production of large-scale monolithic arrays, containing tens or hundreds of thousands of detectors, resulting in an ever-increasing rate of scientific progress. The current limiting factor on how many MKIDs can be placed on the same readout line is the bandwidth and processing capacity of the readout hardware. BLAST-TNG has pushed this technology forward by implementing the first Reconfigurable Open-Architecture Computing Hardware (ROACH2) based readout system. This has significantly raised the processing abilities of the MKID readout electronics, enabling over 1000 MKIDs to be read out on a single line. It is also the first ROACH-based system (1 or 2) ever flown on a long-duration balloon (LDB) payload. This thesis documents the first-ever deployment of MKIDs on a balloon payload, a significant technological step towards an MKID-based satellite payload. The thesis overviews the balloon payload, details the underlying detector physics, catalogs the detector and full-scale array development, and ends with the room-temperature readout electronics.
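The frequency-multiplexed readout described above — thousands of MKIDs, each with a unique resonant frequency, sharing one readout line — can be sketched as follows. The tone spacing, band placement, and response model are illustrative assumptions, not BLAST-TNG's actual design values.

```python
def assign_tones(n_detectors, f_start_hz=500e6, spacing_hz=2e6):
    """Give each MKID on a shared readout line its own resonant
    frequency, so every detector can be probed through one cable."""
    return [f_start_hz + i * spacing_hz for i in range(n_detectors)]

def identify_detector(tones, measured_f_hz):
    """Incident radiation shifts one resonance; attribute a measured
    resonance frequency to the nearest design tone to recover which
    detector responded."""
    return min(range(len(tones)), key=lambda i: abs(tones[i] - measured_f_hz))

tones = assign_tones(1000)                        # ~1000 MKIDs on one line
hit = identify_detector(tones, tones[42] - 30e3)  # small downward shift
print(hit)
```

As long as the frequency shift from a detection stays well below the tone spacing, nearest-tone matching attributes the event unambiguously; the real readout instead tracks the complex transmission near each tone continuously.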
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze, and annotate the data. The infrastructure and Portal are based on a web service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth, and the planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware, and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services made use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS).
Its data management services have been built on top of a set of open technologies, including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management, and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. Its data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
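A GetMap request against an OGC WMS endpoint of the kind the portal exposes can be assembled from the standard required parameters of WMS 1.3.0. The endpoint URL and layer name below are placeholders for illustration, not actual LMMP service addresses.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL from the standard
    required parameters. `bbox` is (min_x, min_y, max_x, max_y) in the
    units of `crs`."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Placeholder endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.org/wms", ["lunar_basemap"],
                     (-180, -90, 180, 90), 1024, 512)
print(url)
```

Because the request is a plain HTTP GET with well-known parameters, any WMS-aware client (a browser, a GIS tool, or a mobile app) can consume the same service, which is the interoperability point the abstract makes.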
Chandra Source Catalog: User Interface
NASA Astrophysics Data System (ADS)
Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.
2009-09-01
The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one "master source" entry and one or more "source observation" entries, the details of which are documented on the CSC "Catalog Columns" pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials ("threads") available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.
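The one-to-many relationship described above — one "master source" row per X-ray source, with one or more "source observation" rows beneath it — can be sketched as a simple grouping. The column names and values here are invented for illustration, not the CSC's actual schema.

```python
from collections import defaultdict

# Invented rows standing in for the CSC's two table views: each X-ray
# source has one "master source" entry and one or more "source
# observation" entries keyed by the master source's name.
master_sources = [
    {"name": "CXO J0001", "ra": 0.25, "dec": -1.1},
    {"name": "CXO J0002", "ra": 10.50, "dec": 2.3},
]
source_observations = [
    {"master": "CXO J0001", "obsid": 101, "flux": 3.1e-14},
    {"master": "CXO J0001", "obsid": 205, "flux": 2.8e-14},
    {"master": "CXO J0002", "obsid": 101, "flux": 9.0e-15},
]

# Group the per-observation rows under their master source, mirroring
# how master properties summarize the individual observations.
by_master = defaultdict(list)
for row in source_observations:
    by_master[row["master"]].append(row)

for src in master_sources:
    print(src["name"], len(by_master[src["name"]]), "observation(s)")
```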
Technology Transition for Hybrid Warfare
2010-02-16
and Iraq. At the same time, the science and technology base must provide the disruptive technologies to defeat future conventional enemies. This... disruptive technologies will be needed to retain long-term technological superiority in conventional warfare. Incremental improvement is the most...technology to be missed. Disruptive technologies are the second type of technological change and involve revolutionary concepts involving large technological
Explanatory Supplement to the WISE All-Sky Release Products
NASA Technical Reports Server (NTRS)
2012-01-01
The Wide-field Infrared Survey Explorer (WISE; Wright et al. 2010) surveyed the entire sky at 3.4, 4.6, 12 and 22 microns in 2010, achieving 5-sigma point source sensitivities per band better than 0.08, 0.11, 1 and 6 mJy in unconfused regions on the ecliptic. The WISE All-Sky Data Release, conducted on March 14, 2012, incorporates all data taken during the full cryogenic mission phase, 7 January 2010 to 6 August 2010, that were processed with improved calibrations and reduction algorithms. Release data products include: (1) an Atlas of 18,240 match-filtered, calibrated and coadded image sets; (2) a Source Catalog containing positions and four-band photometry for over 563 million objects; and (3) an Explanatory Supplement. Ancillary products include a Reject Table that contains 284 million detections that were not selected for the Source Catalog because they are low signal-to-noise ratio or spurious detections of image artifacts, an archive of over 1.5 million sets of calibrated WISE Single-exposure images, a database of 9.4 billion source extractions from those single images, and moving object tracklets identified by the NEOWISE program (Mainzer et al. 2011). The WISE All-Sky Data Release products supersede those from the WISE Preliminary Data Release (Cutri et al. 2011). The Explanatory Supplement to the WISE All-Sky Data Release Products is a general guide for users of the WISE data. The Supplement contains an overview of the WISE mission, facilities, and operations, a detailed description of WISE data processing algorithms, a guide to the content and formats of the image and tabular data products, and cautionary notes that describe known limitations of the All-Sky Release products. Instructions for accessing the WISE data products via the services of the NASA/IPAC Infrared Science Archive are provided.
The Supplement also provides analyses of the achieved sky coverage, photometric and astrometric characteristics and completeness and reliability of the All-Sky Release data products. The WISE All-Sky Release Explanatory Supplement is an on-line document that is updated frequently to provide the most current information for users of the WISE data products. The Explanatory Supplement is maintained at: http://wise2.ipac.caltech.edu/docs/release/allsky/expsup/index.html WISE is a joint project of the University of California, Los Angeles and the Jet Propulsion Laboratory/California Institute of Technology, funded by the National Aeronautics and Space Administration. NEOWISE is a project of the Jet Propulsion Laboratory/California Institute of Technology, funded by the Planetary Science Division of the National Aeronautics and Space Administration.
41 CFR 101-30.504 - Cataloging data from Defense Logistics Services Center (DLSC).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Cataloging data from... Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.5-Maintenance of the Federal Catalog System § 101-30.504 Cataloging...
ERIC Educational Resources Information Center
Cochrane, Pauline A.; Markey, Karen
1983-01-01
This review of the transition from library card catalogs to online public access catalogs (OPAC) (1981-1982) discusses methods employed by online catalog use studies (self-administered questionnaires, OPAC transaction logs, focused-group interviews, feature analysis, online search and retrieval experiments) and new directions for OPAC research…
Subject cataloging practices in North American medical school libraries.
Fredericksen, R B; Michael, H N
1976-01-01
A survey of North American medical school libraries was made to determine current trends in subject cataloging practices. First, responses from 114 of these libraries are recorded and analyzed in the following areas: subject heading authority lists employed; use of the divided versus the dictionary catalog; and the form in which local subject authority files are kept. Then, focusing on 78 libraries that use MeSH in combination with a divided catalog, a further analysis of responses is made concerning issues relating to subject cataloging practices: updating the subject catalog to conform to annual MeSH changes; use of guide cards in the catalog; use of MeSH subheadings; filing conventions; and related issues. An attempt is made to analyze the extent to which these libraries vary from NLM practices. Suggestions are offered for formulating subject cataloging practices for an individual library. Finally, while it is concluded that MeSH and the Current Catalog are useful tools, a more detailed explication of the use of MeSH and NLM cataloging practices would be beneficial. PMID:989741