OFDM-Based Signal Exploitation Using Quadrature Mirror Filter Bank (QMFB) Processing
2012-03-01
Conserva a Puerto Rico con bosques maderables
Frank H. Wadsworth
2009-01-01
[article in Spanish] Puerto Rico consumes many forest products that are costly to import. It also has extensive forests with exploitable timber, and physical conditions favorable to the production of useful wood. Nevertheless, the timber of today's forests goes unused, while deforestation occurs for any purpose. The productive forests of...
Chediack, Sandra E
2008-06-01
The effect of forest exploitation--timber and palmito (Euterpe edulis, Palmae) extraction--on structure, diversity, and floristic composition of forests known as palmitals of the Atlantic Forest of Argentina was analyzed. These palmitals are located in Misiones (54 degrees 13' W and 25 degrees 41' S). Three 1 ha permanent plots were established: two in the "intangible" zone of the Iguazu National Park (PNI), and another in an exploited forest site bordering the PNI. Three 0.2 ha non-permanent plots were also measured. One was located in the PNI reserve zone where illegal palmito extraction occurs. The other two were in logged forest. All trees and palmitos with DBH >10 cm were identified and DBH and height were measured. For each of the six sites, richness and diversity of tree species, floristic composition, number of endemic species, and density of harvestable tree species were estimated. The harvest of E. edulis increases the density of other tree species while diminishing palmito density. Forest exploitation (logging and palmito harvest) is accompanied by an increase in diversity and density of heliophilic species, which have greater timber value in the region. However, this exploitation also diminishes the density of palmito, of endemic species which normally grow in low densities, and of species found on the IUCN Red List. Results suggest that forest structure may be managed for timber and palmito production. The "intangible" zone of the PNI has the greatest conservation value in the Atlantic Forest, since a greater number of endemisms and endangered species are found here.
Multiple layer optical memory system using second-harmonic-generation readout
Boyd, Gary T.; Shen, Yuen-Ron
1989-01-01
A novel optical read and write information storage system is described which comprises a radiation source such as a laser for writing and illumination, the radiation source being capable of radiating a preselected first frequency; a storage medium including at least one layer of material for receiving radiation from the radiation source and capable of being surface modified in response to said radiation source when operated in a writing mode and capable of generating a pattern of radiation of the second harmonic of the preselected frequency when illuminated by the radiation source at the preselected frequency corresponding to the surface modifications on the storage medium; and a detector to receive the pattern of second harmonic frequency generated.
Matching Alternative Addresses: a Semantic Web Approach
NASA Astrophysics Data System (ADS)
Ariannamazi, S.; Karimipour, F.; Hakimpour, F.
2015-12-01
Rapid development of crowd-sourced or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. These changes to spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that rely on features' literal information, like addressing and geocoding. VGI sources neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data demonstrating the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.
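As a toy illustration of the lexical matching step such address-matching work rests on, the sketch below normalizes two address variants and scores their similarity with Python's standard difflib; the normalization rules, abbreviation table and 0-1 score are illustrative assumptions, not the authors' actual pipeline.

import difflib

def normalize(address: str) -> str:
    # Illustrative normalization: lowercase, strip punctuation, expand common abbreviations.
    addr = address.lower().replace(".", "").replace(",", " ")
    tokens = [{"st": "street", "ave": "avenue"}.get(t, t) for t in addr.split()]
    return " ".join(tokens)

def match_score(a: str, b: str) -> float:
    # Similarity of two normalized address strings, in [0, 1].
    return difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()

print(match_score("12 Main St.", "12 main street"))  # 1.0 -> candidate match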
Mobile Smog Simulator: New Capabilities to Study Urban Mixtures
A smog simulator developed by EPA scientists and engineers has unique capabilities that will provide information for assessing the health impacts of relevant multipollutant atmospheres and identify contributions of specific sources.
Information processing of earth resources data
NASA Technical Reports Server (NTRS)
Zobrist, A. L.; Bryant, N. A.
1982-01-01
Current trends in the use of remotely sensed data include integration of multiple data sources of various formats and use of complex models. These trends have placed a strain on information processing systems because an enormous number of capabilities are needed to perform a single application. A solution to this problem is to create a general set of capabilities which can perform a wide variety of applications. General capabilities for the Image-Based Information System (IBIS) are outlined in this report. They are then cross-referenced for a set of applications performed at JPL.
The Multi-energy High precision Data Processor Based on AD7606
NASA Astrophysics Data System (ADS)
Zhao, Chen; Zhang, Yanchi; Xie, Da
2017-11-01
This paper designs an information collector based on the AD7606 to realize high-precision simultaneous acquisition of multi-source information from multi-energy systems, forming the information platform of the energy Internet at Laogang, with electricity as its major energy source. Combined with information fusion technologies, this paper analyzes the data to improve the overall energy system scheduling capability and reliability.
Code of Federal Regulations, 2011 CFR
2011-10-01
... or cost proposal under other competitive procedures, and personnel evaluating protests. (2) Personnel...) Supervisors, at any level, of the personnel listed in this paragraph (a). (b) The originator of information... capabilities of potential competitive sources (see FAR 7.1 and FAR 10). ...
Connecting the Force from Space: The IRIS Joint Capability Technology Demonstration
2010-01-01
the Joint in Joint Capability Technology Demonstration, we have two sponsors, both U.S. Strategic Command and the Defense Information Systems...Capability Technology Demonstration will provide an excellent source of data on space-based Internet Protocol networking. Operational... Internet Routing in Space Joint Capability Technology Demonstration Operational Manager, Space and Missile Defense Battle Lab, Colorado Springs
NASA Astrophysics Data System (ADS)
Kolodny, Michael A.
2017-05-01
Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries consist of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimally matching available informational sources (means) to required missions/tasks as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described is the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means and the approach for determining the value of the information acquired.
Sensor-based architecture for medical imaging workflow analysis.
Silva, Luís A Bastião; Campos, Samuel; Costa, Carlos; Oliveira, José Luis
2014-08-01
The growing use of computer systems in medical institutions has been generating a tremendous quantity of data. While these data have a critical role in assisting physicians in the clinical practice, the information that can be extracted goes far beyond this utilization. This article proposes a platform capable of assembling multiple data sources within a medical imaging laboratory, through a network of intelligent sensors. The proposed integration framework follows a SOA hybrid architecture based on an information sensor network, capable of collecting information from several sources in medical imaging laboratories. Currently, the system supports three types of sensors: DICOM repository meta-data, network workflows and examination reports. Each sensor is responsible for converting unstructured information from data sources into a common format that will then be semantically indexed in the framework engine. The platform was deployed in the Cardiology department of a central hospital, allowing identification of processes' characteristics and users' behaviours that were unknown before the utilization of this solution.
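A minimal sketch of the normalization step described above, in which each sensor converts source-specific records into a common envelope for semantic indexing; the field names, sensor label and payload are hypothetical, not the platform's actual schema.

from datetime import datetime, timezone

def to_common_format(source: str, event_type: str, payload: dict) -> dict:
    # Common envelope that a framework engine could semantically index.
    return {
        "source": source,          # e.g. "dicom-metadata", "network-workflow", "report"
        "type": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "payload": payload,
    }

# A DICOM meta-data sensor might emit:
event = to_common_format("dicom-metadata", "study-stored",
                         {"modality": "US", "department": "cardiology"})
print(event["source"], event["type"])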
2005-01-01
publications and other mechanisms for increasing information literacy will be enumerated along with the customer segments for which these are targeted. (2...training provided. c. STRATEGIC OBJECTIVES: (1) Increase customer awareness of information sources/capabilities. (2) Increase information literacy; encour...
Adding tools to the open source toolbox: The Internet
NASA Technical Reports Server (NTRS)
Porth, Tricia
1994-01-01
The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.
The Efficient Utilization of Open Source Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baty, Samuel R.
These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.
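A toy example of the overlapping-sources point above: if each open source constrains the same quantity to an interval, intersecting the intervals tightens the bound; the quantity and all numbers are invented for illustration.

def intersect(bounds):
    # Each source reports (low, high) bounds on the same quantity.
    lows, highs = zip(*bounds)
    low, high = max(lows), min(highs)
    if low > high:
        raise ValueError("sources are inconsistent")
    return low, high

# Three sources independently bounding, say, a device yield (kt):
print(intersect([(2.0, 12.0), (4.0, 9.0), (3.5, 10.0)]))  # -> (4.0, 9.0)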
1978-08-01
weeding ... ORGANISATION & MANAGEMENT: aims and objectives, staffing, promotional activities, identifying users ... NETWORKS & EXTERNAL SOURCES OF... Acquisition clerks with typing capability are required for meticulous recordkeeping. Typing capability of 50 words per minute and a working knowledge ... Administration and Management: includes management planning and research. Numerical Analysis: includes iteration, difference equations, and ...
Code of Federal Regulations, 2011 CFR
2011-01-01
... information from several sources including national cooperative soil surveys or other acceptable soil surveys, NRCS field office technical guides, soil potential ratings or soil productivity ratings, land capability classifications, and important farmland determinations. Based on this information, groups of soils...
Code of Federal Regulations, 2010 CFR
2010-01-01
... information from several sources including national cooperative soil surveys or other acceptable soil surveys, NRCS field office technical guides, soil potential ratings or soil productivity ratings, land capability classifications, and important farmland determinations. Based on this information, groups of soils...
Honeybee navigation: following routes using polarized-light cues
Kraft, P.; Evangelista, C.; Dacke, M.; Labhart, T.; Srinivasan, M. V.
2011-01-01
While it is generally accepted that honeybees (Apis mellifera) are capable of using the pattern of polarized light in the sky to navigate to a food source, there is little or no direct behavioural evidence that they actually do so. We have examined whether bees can be trained to find their way through a maze composed of four interconnected tunnels, by using directional information provided by polarized light illumination from the ceilings of the tunnels. The results show that bees can learn this task, thus demonstrating directly, and for the first time, that bees are indeed capable of using the polarized-light information in the sky as a compass to steer their way to a food source. PMID:21282174
Multisource information fusion applied to ship identification for the recognized maritime picture
NASA Astrophysics Data System (ADS)
Simard, Marc-Alain; Lefebvre, Eric; Helleur, Christopher
2000-04-01
The Recognized Maritime Picture (RMP) is defined as a composite picture of activity over a maritime area of interest. In simplistic terms, building an RMP comes down to finding whether an object of interest, a ship in our case, is there or not, determining what it is, determining what it is doing and determining if some type of follow-on action is required. The Canadian Department of National Defence currently has access to, or may in the near future have access to, a number of civilian, military and allied information or sensor systems to accomplish these purposes. These systems include automatic self-reporting positional systems, air patrol surveillance systems, high frequency surface radars, electronic intelligence systems, radar space systems and high frequency direction finding sensors. The ability to make full use of these systems is limited by the existing capability to fuse data from all sources in a timely, accurate and complete manner. This paper presents an information fusion system under development that correlates and fuses these information and sensor data sources. This fusion system, named the Adaptive Fuzzy Logic Correlator, correlates the information in batch but fuses and constructs ship tracks sequentially. It applies standard Kalman filter techniques and fuzzy logic correlation techniques. We propose a set of recommendations that should improve the ship identification process. In particular, it is proposed to utilize as many non-redundant sources of information as possible that address specific vessel attributes. Another important recommendation states that the information fusion and data association techniques should be capable of dealing with incomplete and imprecise information. Some fuzzy logic techniques capable of tolerating imprecise and dissimilar data are proposed.
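A compact sketch of the two ingredients named in the abstract: a scalar Kalman measurement update for the fused track state and a fuzzy membership score for report-to-track correlation. The Gaussian membership shape, the 0.5 gate and all numbers are illustrative assumptions, not the Adaptive Fuzzy Logic Correlator's actual design.

import math

def kalman_update(x, P, z, R):
    # Standard scalar Kalman measurement update.
    K = P / (P + R)                 # Kalman gain
    return x + K * (z - x), (1 - K) * P

def fuzzy_membership(residual, scale):
    # Degree (0..1) to which a report "belongs" to a track.
    return math.exp(-0.5 * (residual / scale) ** 2)

x, P = 100.0, 25.0                  # track position estimate and its variance
z, R = 104.0, 16.0                  # new report and its variance
if fuzzy_membership(z - x, scale=10.0) > 0.5:   # correlate first...
    x, P = kalman_update(x, P, z, R)            # ...then fuse
print(x, P)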
AQUIS: A PC-based source information manager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, A.E.; Huber, C.C.; Tschanz, J.
1993-05-01
The Air Quality Utility Information System (AQUIS) was developed to calculate emissions and track them along with related information about sources, stacks, controls, and permits. The system runs on IBM-compatible personal computers with dBASE IV and tracks more than 1,200 data items distributed among various source categories. AQUIS is currently operating at 11 US Air Force facilities, which have up to 1,000 sources, and two headquarters. The system provides a flexible reporting capability that permits users who are unfamiliar with database structure to design and prepare reports containing user-specified information. In addition to the criteria pollutants, AQUIS calculates compound-specific emissions and allows users to enter their own emission estimates.
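The report does not give AQUIS's internal formulas; as background, inventory systems of this kind typically estimate an emission as an activity rate times an emission factor, reduced by any control efficiency. A minimal sketch under that assumption, with invented figures:

def annual_emissions(activity, emission_factor, control_efficiency=0.0):
    # Emissions (lb/yr) from an activity rate (e.g. MMBtu/yr), an emission
    # factor (lb per activity unit) and the fraction removed by controls.
    return activity * emission_factor * (1.0 - control_efficiency)

# Hypothetical boiler: 50,000 MMBtu/yr, 0.1 lb NOx/MMBtu, 40% effective control.
print(annual_emissions(50_000, 0.1, 0.40))  # -> 3000.0 lb NOx/yr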
Empowering Provenance in Data Integration
NASA Astrophysics Data System (ADS)
Kondylakis, Haridimos; Doerr, Martin; Plexousakis, Dimitris
The provenance of data has recently been recognized as central to the trust one places in data. This paper presents a novel framework to empower provenance in a mediator-based data integration system. We use a simple mapping language for mapping schema constructs between an ontology and relational sources, capable of carrying provenance information. This language extends the traditional data exchange setting by translating our mapping specifications into source-to-target tuple-generating dependencies (s-t tgds). We then formally define the provenance information we want to retrieve, i.e., annotation, source and tuple provenance. We provide three algorithms to retrieve provenance information using information stored in the mappings and the sources. We show the feasibility of our solution and the advantages of our framework.
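For readers unfamiliar with the formalism, a source-to-target tuple-generating dependency has the shape

$\forall \bar{x}\,\big(\phi_S(\bar{x}) \rightarrow \exists \bar{y}\,\psi_T(\bar{x},\bar{y})\big),$

where $\phi_S$ is a query over the relational sources and $\psi_T$ a conjunction of atoms over the target ontology. As a schematic example of how a mapping can carry provenance (not the paper's exact syntax), an annotation term can be threaded through the rule:

$\forall x\,\big(\mathit{Person}_S(x) \rightarrow \exists y\,(\mathit{Agent}_T(x,y) \wedge \mathit{prov}(x,\mathit{srcDB}))\big).$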
News Resources on the World Wide Web.
ERIC Educational Resources Information Center
Notess, Greg R.
1996-01-01
Describes up-to-date news sources that are presently available on the Internet and World Wide Web. Highlights include electronic newspapers; AP (Associated Press) sources and Reuters; sports news; stock market information; New York Times; multimedia capabilities, including CNN Interactive; and local and regional news. (LRW)
An autonomous structural health monitoring solution
NASA Astrophysics Data System (ADS)
Featherston, Carol A.; Holford, Karen M.; Pullin, Rhys; Lees, Jonathan; Eaton, Mark; Pearson, Matthew
2013-05-01
Combining advanced sensor technologies with optimised data acquisition and diagnostic and prognostic capability, structural health monitoring (SHM) systems provide real-time assessment of the integrity of bridges, buildings, aircraft, wind turbines, oil pipelines and ships, leading to improved safety and reliability and reduced inspection and maintenance costs. The implementation of power harvesting, using energy scavenged from ambient sources such as thermal gradients and sources of vibration in conjunction with wireless transmission enables truly autonomous systems, reducing the need for batteries and associated maintenance in often inaccessible locations, alongside bulky and expensive wiring looms. The design and implementation of such a system however presents numerous challenges. A suitable energy source or multiple sources capable of meeting the power requirements of the system, over the entire monitoring period, in a location close to the sensor must be identified. Efficient power management techniques must be used to condition the power and deliver it, as required, to enable appropriate measurements to be taken. Energy storage may be necessary to match a continuously changing supply and demand for a range of different monitoring states including sleep, record and transmit. An appropriate monitoring technique, capable of detecting, locating and characterising damage and delivering reliable information, whilst minimising power consumption, must be selected. Finally, a wireless protocol capable of transmitting the levels of information generated at the rate needed in the required operating environment must be chosen. This paper considers solutions to some of these challenges, and in particular examines SHM in the context of the aircraft environment.
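The feasibility question behind matching a harvester to the sleep, record and transmit states mentioned above reduces to a duty-cycled power budget; a back-of-envelope sketch with invented figures:

def average_load_uW(states):
    # states: list of (power_uW, fraction_of_time); fractions must sum to 1.
    assert abs(sum(f for _, f in states) - 1.0) < 1e-9
    return sum(p * f for p, f in states)

# Hypothetical node: sleep 10 uW (98.9%), record 5 mW (1%), transmit 50 mW (0.1%).
load = average_load_uW([(10, 0.989), (5_000, 0.01), (50_000, 0.001)])
harvested = 120.0  # uW, assumed vibration-harvester output
print(load, "uW average load;",
      "feasible" if harvested > load else "needs storage or a bigger harvester")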
Lazar, Christina M; Black, Anne C; McMahon, Thomas J; O'Shea, Kevin; Rosen, Marc I
2015-03-01
The liberty of individuals who receive Social Security disability payments is constrained if they are judged incapable of managing their payments and are assigned a payee or conservator to manage benefit payments on their behalf. Conversely, beneficiaries' well-being may be compromised if they misspend money that they need to survive. Several studies have shown that determinations of financial capability are made inconsistently and that capability guidelines appear to be applied inconsistently. This article describes ambiguities that remained for individuals even after a comprehensive assessment of financial capability was conducted by independent assessors. Trained, experienced assessors rated the financial capability of 118 individuals in intensive outpatient or inpatient psychiatric facilities who received Social Security Disability Insurance or Supplemental Security Income. Ten individuals' cases were determined to be difficult to judge. Six sources of ambiguity were identified by case review: distinguishing incapability from the challenges of navigating poverty, the amount of nonessential spending that indicates incapability, the amount of spending on harmful things that indicates incapability, how to consider intermittent periods of capability and incapability, the relative weighting of past behavior and future plans to change, and discrepancies between different sources of information. The cases raise fundamental questions about how to define and identify financial incapability, but they also illustrate how detailed consideration of beneficiaries' living situations and decision making can inform the difficult dichotomous decision about capability.
NASA Astrophysics Data System (ADS)
Johnson, J. Bruce; Reeve, S. W.; Burns, W. A.; Allen, Susan D.
2010-04-01
Termed Special Nuclear Material (SNM) by the Atomic Energy Act of 1954, fissile materials, such as 235U and 239Pu, are the primary components used to construct modern nuclear weapons. Detecting the clandestine presence of SNM represents an important capability for Homeland Security. An ideal SNM sensor must be able to detect fissile materials present at ppb levels, be able to distinguish the source of the detected fissile material, i.e., 235U, 239Pu, 233U or another fission source, and be able to perform the discrimination in near real time. A sensor with such capabilities would provide not only rapid identification of a threat but, ultimately, information on the potential source of the threat. For example, current detection schemes for monitoring clandestine nuclear testing and nuclear fuel reprocessing to provide weapons grade fissile material rely largely on passive air sampling combined with a subsequent instrumental analysis or some type of wet chemical analysis of the collected material. It would be highly useful to have a noncontact method of measuring isotopes capable of providing forensic information rapidly at ppb levels of detection. Here we compare the use of Kr, Xe and I as "canary" species for distinguishing between 235U and 239Pu fission sources by spectroscopic methods.
Integrating Thematic Web Portal Capabilities into the NASA Earthdata Web Infrastructure
NASA Technical Reports Server (NTRS)
Wong, Minnie; Baynes, Kathleen E.; Huang, Thomas; McLaughlin, Brett
2015-01-01
This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators.
Developing Students' Critical Reasoning About Online Health Information: A Capabilities Approach
NASA Astrophysics Data System (ADS)
Wiblom, Jonna; Rundgren, Carl-Johan; Andrée, Maria
2017-11-01
The internet has become a main source for health-related information retrieval. In addition to information published by medical experts, individuals share their personal experiences and narratives on blogs and social media platforms. Our increasing need to confront and make meaning of various sources and conflicting health information has challenged the way critical reasoning has become relevant in science education. This study addresses how the opportunities for students to develop and practice their capabilities to critically approach online health information can be created in science education. Together with two upper secondary biology teachers, we carried out a design-based study. The participating students were given an online retrieval task that included a search and evaluation of health-related online sources. After a few lessons, the students were introduced to an evaluation tool designed to support critical evaluation of health information online. Using qualitative content analysis, four themes could be discerned in the audio and video recordings of student interactions when engaging with the task. Each theme illustrates the different ways in which critical reasoning became practiced in the student groups. Without using the evaluation tool, the students struggled to overview the vast amount of information and negotiate trustworthiness. Guided by the evaluation tool, critical reasoning was practiced to handle source subjectivity and to sift out scientific information only. Rather than being a generic skill transferable across contexts, students' critical reasoning became conditioned by the multi-dimensional nature of health issues, the blend of various contexts and the shift of purpose constituted by the students.
An open-source, mobile-friendly search engine for public medical knowledge.
Samwald, Matthias; Hanbury, Allan
2014-01-01
The World Wide Web has become an important source of information for medical practitioners. To complement the capabilities of currently available web search engines we developed FindMeEvidence, an open-source, mobile-friendly medical search engine. In a preliminary evaluation, the quality of results from FindMeEvidence proved to be competitive with those from TRIP Database, an established, closed-source search engine for evidence-based medicine.
Searching Online Chemical Data Repositories via the ChemAgora Portal.
Zanzi, Antonella; Wittwehr, Clemens
2017-12-26
ChemAgora, a web application designed and developed in the context of the "Data Infrastructure for Chemical Safety Assessment" (diXa) project, provides search capabilities over chemical data from resources available online, enabling users to cross-reference their search results with both regulatory chemical information and public chemical databases. ChemAgora, through an on-the-fly search, indicates whether or not a chemical is known in each of the external data sources and provides clickable links leading to the third-party web site pages containing the information. The original purpose of the ChemAgora application was to correlate studies stored in the diXa data warehouse with available chemical data. Since the end of the diXa project, ChemAgora has evolved into an independent portal, currently accessible directly through the ChemAgora home page, with improved search capabilities over online data sources.
Navigating the Net for Grant Money.
ERIC Educational Resources Information Center
Schnitzer, Denise K.
1996-01-01
The Internet offers educators a wealth of grant resources and information on securing funds for projects. The first step is finding a funding source whose goals match those of the desired project. Certain Net search engines have excellent capabilities. Grantsweb has accessible, organized links to federal and nonfederal grants sources. Other…
Analysis in Motion Initiative – Summarization Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin; Pirrung, Meg; Jasper, Rob
2017-06-22
Analysts are tasked with integrating information from multiple data sources for important and timely decision making. What if sense making and overall situation awareness could be improved through visualization techniques? The Analysis in Motion initiative is advancing the ability to summarize and abstract multiple streams and static data sources over time.
Manpower management information system /MIS/
NASA Technical Reports Server (NTRS)
Gravette, M. C.; King, W. L.
1971-01-01
System of programs capable of building and maintaining data bank provides all levels of management with regular manpower evaluation reports and data source for special management exercises on manpower.
Space Shuttle Payload Information Source
NASA Technical Reports Server (NTRS)
Griswold, Tom
2000-01-01
The Space Shuttle Payload Information Source Compact Disk (CD) is a joint NASA and USA project to introduce Space Shuttle capabilities, payload services and accommodations, and the payload integration process. The CD will be given to new payload customers or to organizations outside of NASA considering using the Space Shuttle as a launch vehicle. The information is high-level in a visually attractive format with a voice over. The format is in a presentation style plus 360 degree views, videos, and animation. Hyperlinks are provided to connect to the Internet for updates and more detailed information on how payloads are integrated into the Space Shuttle.
Lazar, Christina M.; Black, Anne C.; McMahon, Thomas J; O’Shea, Kevin; Rosen, Marc I.
2015-01-01
Objective Social Security beneficiaries’ liberty is constrained if they are judged incapable of managing their disability payments and are assigned a fiduciary to manage benefit payments on their behalf. Conversely, beneficiaries’ well-being may be compromised if they misspend money that they need to survive. Several studies have shown that determinations of financial capability are made inconsistently and capability guidelines appear to be applied inconsistently in practice. This case series describes the ambiguities remaining for a small number of individuals even after published criteria for capability (failing to meet basic needs and/or harmful spending on drugs) are applied. Methods Trained, experienced assessors rated the financial capability of 119 individuals in intensive outpatient or inpatient psychiatric facilities who received SSI or SSDI payments. Ten individuals’ cases were determined to be difficult to judge. Results Six sources of ambiguity were identified by case review: distinguishing incapability from the challenges of navigating poverty, the amount of nonessential spending needed to be considered incapable, the amount of spending on harmful things needed to be considered incapable, how intermittent periods of capability and incapability should be considered, the relative weighting of past behavior and future plans to change, and discrepancies between different sources of information. Conclusion The cases raise fundamental questions about what financial incapability is, but also illustrate how detailed consideration of beneficiaries’ living situations and decision making can inform the difficult dichotomous decision about capability. PMID:25727116
2012-03-01
...Targeting Review Board; OPLAN Operations Plan; OPORD Operations Order; OPSIT Operational Situation; OSINT Open Source Intelligence; OV... [remainder is residue of a collection-management diagram relating multi-intelligence collection, political and embassy information, law enforcement, HUMINT and OSINT sources]
Integrating thematic web portal capabilities into the NASA Earthdata Web Infrastructure
NASA Astrophysics Data System (ADS)
Wong, M. M.; McLaughlin, B. D.; Huang, T.; Baynes, K.
2015-12-01
The National Aeronautics and Space Administration (NASA) acquires and distributes an abundance of Earth science data on a daily basis to a diverse user community worldwide. To assist the scientific community and general public in achieving a greater understanding of the interdisciplinary nature of Earth science and of key environmental and climate change topics, the NASA Earthdata web infrastructure is integrating new methods of presenting and providing access to Earth science information, data, research and results. This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators. Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It is comprised of twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access client (Reverb and Earthdata Search), dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative and a host of other discipline specific data discovery, data access, data subsetting and visualization tools.
AQUIS: A PC-based air inventory and permit manager
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, A.E.; Huber, C.C.; Tschanz, J.
1992-01-01
The Air Quality Utility Information System (AQUIS) was developed to calculate and track sources, emissions, stacks, permits, and related information. The system runs on IBM-compatible personal computers with dBASE IV and tracks more than 1,200 data items distributed among various source categories. AQUIS is currently operating at nine US Air Force facilities that have up to 1,000 sources. The system provides a flexible reporting capability that permits users who are unfamiliar with database structure to design and prepare reports containing user-specified information. In addition to six criteria pollutants, AQUIS calculates compound-specific emissions and allows users to enter their own emission estimates.
Speeding response, saving lives : automatic vehicle location capabilities for emergency services.
DOT National Transportation Integrated Search
1999-01-01
Information from automatic vehicle location systems, when combined with computer-aided dispatch software, can provide a rich source of data for analyzing emergency vehicle operations and evaluating agency performance.
A research on the positioning technology of vehicle navigation system from single source to "ASPN"
NASA Astrophysics Data System (ADS)
Zhang, Jing; Li, Haizhou; Chen, Yu; Chen, Hongyue; Sun, Qian
2017-10-01
Due to the suddenness and complexity of modern warfare, land-based weapon systems need to have precision strike capability on roads and railways. The vehicle navigation system is one of the most important pieces of equipment for land-based weapon systems that have precision strike capability. There are inherent shortcomings in the ability of single-source navigation systems to provide continuous and stable navigation information. To overcome these shortcomings, multi-source positioning technology was developed. The All Source Positioning and Navigation (ASPN) program was proposed in 2010; it seeks to enable low cost, robust, and seamless navigation solutions for military use on any operational platform and in any environment, with or without GPS. The development trend of vehicle positioning technology is reviewed in this paper. The trend indicates that positioning technology has developed from single source and multi-source to ASPN. The data fusion techniques based on multi-source and ASPN are analyzed in detail.
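At its simplest, the step from single-source to multi-source positioning is inverse-variance weighting of independent fixes; a one-dimensional sketch (an ASPN-style system fuses full state vectors with far richer models, so this is only illustrative):

def fuse(estimates):
    # estimates: list of (position, variance) from independent sources.
    w = [1.0 / var for _, var in estimates]
    pos = sum(wi * x for wi, (x, _) in zip(w, estimates)) / sum(w)
    return pos, 1.0 / sum(w)

# GPS-like, odometry-like and map-matching fixes (illustrative, metres):
print(fuse([(105.2, 4.0), (103.8, 9.0), (104.9, 2.25)]))  # fused variance is below any single source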
Multidimensional Environmental Data Resource Brokering on Computational Grids and Scientific Clouds
NASA Astrophysics Data System (ADS)
Montella, Raffaele; Giunta, Giulio; Laccetti, Giuliano
Grid computing has evolved widely over the past years, and its capabilities have found their way even into business products, no longer being relegated to scientific applications. Today, grid computing technology is not restricted to a set of specific grid open source or industrial products; rather, it comprises a set of capabilities virtually within any kind of software, used to create shared and highly collaborative production environments. These environments are focused on computational (workload) capabilities and the integration of information (data) into those computational capabilities. An active application field of grid computing is the full virtualization of scientific instruments, which increases their availability and decreases operational and maintenance costs. Computational and information grids allow real-world objects to be managed in a service-oriented way using widespread industrial standards.
Greg C. Liknes; Christopher W. Woodall; Charles H. Perry
2009-01-01
Climate information frequently is included in geospatial modeling efforts to improve the predictive capability of other data sources. The selection of an appropriate climate data source requires consideration given the number of choices available. With regard to climate data, there are a variety of parameters (e.g., temperature, humidity, precipitation), time intervals...
Remote sensing as a source of data for outdoor recreation planning
NASA Technical Reports Server (NTRS)
Reed, W. E.; Goodell, H. G.; Emmitt, G. D.
1972-01-01
Specific data needs for outdoor recreation planning and the ability of tested remote sensors to provide sources for these data are examined. Data needs, remote sensor capabilities, availability of imagery, and advantages and problems of incorporating remote sensing data sources into ongoing planning data collection programs are discussed in detail. Examples of the use of imagery to derive data for a range of common planning analyses are provided. A selected bibliography indicates specific uses of data in planning, basic background materials on remote sensing technology, and sources of information on environmental information systems expected to use remote sensing to provide new environmental data of use in outdoor recreation planning.
Remote sensing as a source of land cover information utilized in the universal soil loss equation
NASA Technical Reports Server (NTRS)
Morris-Jones, D. R.; Morgan, K. M.; Kiefer, R. W.; Scarpace, F. L.
1979-01-01
In this study, methods for gathering the land use/land cover information required by the USLE were investigated with medium altitude, multi-date color and color infrared 70-mm positive transparencies using human and computer-based interpretation techniques. Successful results, which compare favorably with traditional field study methods, were obtained within the test site watershed with airphoto data sources and human airphoto interpretation techniques. Computer-based interpretation techniques were not capable of identifying soil conservation practices but were successful to varying degrees in gathering other types of desired land use/land cover information.
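For reference, the USLE investigated here is the standard product form $A = R \cdot K \cdot L \cdot S \cdot C \cdot P$, where $A$ is the predicted average annual soil loss, $R$ rainfall erosivity, $K$ soil erodibility, $L$ and $S$ the slope-length and slope-steepness factors, $C$ the cover-management factor and $P$ the support-practice factor; the land use/land cover interpretation described in this study supplies $C$, and the soil conservation practices it attempts to identify correspond to $P$.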
Prototyping an institutional IAIMS/UMLS information environment for an academic medical center.
Miller, P L; Paton, J A; Clyman, J I; Powsner, S M
1992-07-01
The paper describes a prototype information environment designed to link network-based information resources in an integrated fashion and thus enhance the information capabilities of an academic medical center. The prototype was implemented on a single Macintosh computer to permit exploration of the overall "information architecture" and to demonstrate the various desired capabilities prior to full-scale network-based implementation. At the heart of the prototype are two components: a diverse set of information resources available over an institutional computer network and an information sources map designed to assist users in finding and accessing information resources relevant to their needs. The paper describes these and other components of the prototype and presents a scenario illustrating its use. The prototype illustrates the link between the goals of two National Library of Medicine initiatives, the Integrated Academic Information Management System (IAIMS) and the Unified Medical Language System (UMLS).
Past and Present Large Solid Rocket Motor Test Capabilities
NASA Technical Reports Server (NTRS)
Kowalski, Robert R.; Owen, David B., II
2011-01-01
A study was performed to identify the current and historical trends in the capability of solid rocket motor testing in the United States. The study focused on test positions capable of testing solid rocket motors of at least 10,000 lbf thrust. Top-level information was collected for two distinct data points plus/minus a few years: 2000 (Y2K) and 2010 (Present). Data was combined from many sources, but primarily focused on data from the Chemical Propulsion Information Analysis Center's Rocket Propulsion Test Facilities Database, and the heritage Chemical Propulsion Information Agency/M8 Solid Rocket Motor Static Test Facilities Manual. Data for the Rocket Propulsion Test Facilities Database and heritage M8 Solid Rocket Motor Static Test Facilities Manual is provided to the Chemical Propulsion Information Analysis Center directly from the test facilities. Information for each test cell for each time period was compiled and plotted to produce a graphical display of the changes for the nation, NASA, Department of Defense, and commercial organizations during the past ten years. Major groups of plots include test facility by geographic location, test cells by status/utilization, and test cells by maximum thrust capability. The results are discussed.
The Gaia Archive at ESAC: a VO-inside archive
NASA Astrophysics Data System (ADS)
Gonzalez-Nunez, J.
2015-12-01
The ESDC (ESAC Science Data Center) is an active member of the IVOA (International Virtual Observatory Alliance), which has defined a set of standards, libraries and concepts that allow flexible, scalable and interoperable architectures to be built for data archives. For astronomy missions that involve large catalogues, as in Gaia or Euclid, the TAP, UWS and VOSpace standards can be used to create an architecture that allows the exploitation of these valuable data by the community. New challenges also arise, such as implementing the new paradigm "move the code close to the data", which can be partially achieved by extending the protocols (TAP+, UWS+, etc.) or the languages (ADQL). We explain how we have used VO standards and libraries for the Gaia Archive, which has not only produced an open and interoperable archive but also minimized development in certain areas. We also explain how we have extended these protocols, and our future plans.
A strategy for rangeland management based on best available knowledge and information
USDA-ARS?s Scientific Manuscript database
Changes to rangeland systems are happening at spatial and temporal scales beyond the capability of our current knowledge and information systems. In this paper we look at how Web 2.0 tools such as wikis and crowd-sourcing and new technologies including mobile devices and massive online databases are...
Investigation of automated feature extraction using multiple data sources
NASA Astrophysics Data System (ADS)
Harvey, Neal R.; Perkins, Simon J.; Pope, Paul A.; Theiler, James P.; David, Nancy A.; Porter, Reid B.
2003-04-01
An increasing number and variety of platforms are now capable of collecting remote sensing data over a particular scene. For many applications, the information available from any individual sensor may be incomplete, inconsistent or imprecise. However, other sources may provide complementary and/or additional data. Thus, for an application such as image feature extraction or classification, it may be that fusing the multiple data sources can lead to more consistent and reliable results. Unfortunately, with the increased complexity of the fused data, the search space of feature-extraction or classification algorithms also greatly increases. With a single data source, the determination of a suitable algorithm may be a significant challenge for an image analyst. With the fused data, the search for suitable algorithms can go far beyond the capabilities of a human in a realistic time frame, and becomes the realm of machine learning, where the computational power of modern computers can be harnessed to the task at hand. We describe experiments in which we investigate the ability of a suite of automated feature extraction tools developed at Los Alamos National Laboratory to make use of multiple data sources for various feature extraction tasks. We compare and contrast this software's capabilities on 1) individual data sets from different data sources, 2) fused data sets from multiple data sources, and 3) fusion of results from multiple individual data sources.
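A schematic of the paper's three-way comparison, with a trivial nearest-centroid classifier standing in for the evolved feature-extraction pipelines; the synthetic data, channel counts and classifier are all placeholders, not the Los Alamos suite itself.

import numpy as np

def nearest_centroid_accuracy(X, y):
    # Toy stand-in for a learned feature extractor / classifier.
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)).astype(int)
    return (pred == y).mean()

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
optical = rng.normal(y[:, None] * 0.8, 1.0, (200, 3))   # sensor 1 channels
radar = rng.normal(y[:, None] * 0.8, 1.0, (200, 2))     # sensor 2 channels
fused = np.hstack([optical, radar])                     # pixel-level fusion
for name, X in [("optical", optical), ("radar", radar), ("fused", fused)]:
    print(name, nearest_centroid_accuracy(X, y))        # fused is typically best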
Code of Federal Regulations, 2011 CFR
2011-10-01
... 310.001 Federal Acquisition Regulations System HEALTH AND HUMAN SERVICES COMPETITION AND ACQUISITION... sources, regardless of organizational type and size classification, and determine their capabilities to... respondents provide information regarding their organizational size classification. For example, the notice...
The utility of satellite observations for constraining fine-scale and transient methane sources
NASA Astrophysics Data System (ADS)
Turner, A. J.; Jacob, D.; Benmergui, J. S.; Brandman, J.; White, L.; Randles, C. A.
2017-12-01
Resolving differences between top-down and bottom-up emissions of methane from the oil and gas industry is difficult due, in part, to their fine-scale and often transient nature. There is considerable interest in using atmospheric observations to detect these sources. Satellite-based instruments are an attractive tool for this purpose and, more generally, for quantifying methane emissions on fine scales. A number of instruments are planned for launch in the coming years into both low Earth and geostationary orbits, but the extent to which they can provide fine-scale information on sources has yet to be explored. Here we present an observing system simulation experiment (OSSE) exploring the tradeoffs between pixel resolution, measurement frequency, and instrument precision on the fine-scale information content of a space-borne instrument measuring methane. We use the WRF-STILT Lagrangian transport model to generate more than 200,000 column footprints at 1.3×1.3 km2 spatial resolution and hourly temporal resolution over the Barnett Shale in Texas. We sub-sample these footprints to match the observing characteristics of the planned TROPOMI and GeoCARB instruments as well as different hypothetical observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its singular values. We draw conclusions on the capabilities of the planned satellite instruments and how these capabilities could be improved for fine-scale source detection.
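The information metric used in such an OSSE can be sketched compactly: for a linear observation model with Gaussian noise, the Fisher information of the flux state is F = H^T Sigma^-1 H, and its singular-value spectrum indicates how many emission modes an observing configuration actually constrains. A minimal sketch with invented dimensions and a uniform precision, the random matrix standing in for the WRF-STILT footprints:

import numpy as np

rng = np.random.default_rng(1)
n_obs, n_state = 500, 50             # column observations vs. surface flux elements
H = rng.random((n_obs, n_state))     # stand-in for the footprint (Jacobian) matrix
sigma = 10.0                         # assumed instrument precision (ppb), Sigma = sigma^2 I

F = H.T @ H / sigma**2               # Fisher information matrix
s = np.linalg.svd(F, compute_uv=False)
print("well-constrained modes:", int((s > 0.01 * s[0]).sum()))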
2003-04-01
generally considered to be passive data. Instead, the genetic material should be capable of being algorithmic information, that is, program code or...
Neuronal integration of dynamic sources: Bayesian learning and Bayesian inference.
Siegelmann, Hava T; Holzman, Lars E
2010-09-01
One of the brain's most basic functions is integrating sensory data from diverse sources. This ability causes us to question whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neural inspired circuit with this property. Our implementation suggests the possibility that evidence learning requires a more complex organization of the network than was previously assumed, where neurons have different specialties, whose emergence brings the desired adaptivity seen in human online inference.
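The known-fixed-dependencies case corresponds to precision-weighted cue combination; a sketch of two independent Gaussian cues fused by Bayes' rule (the paper's point is a neural circuit that also learns these weightings online; the numbers are illustrative):

def fuse_gaussian(mu1, var1, mu2, var2):
    # Posterior over a common cause given two independent Gaussian cues.
    var = 1.0 / (1.0 / var1 + 1.0 / var2)
    mu = var * (mu1 / var1 + mu2 / var2)
    return mu, var

# E.g. visual and auditory estimates of a source's direction (degrees):
print(fuse_gaussian(10.0, 4.0, 14.0, 16.0))  # -> (10.8, 3.2), sharper than either cue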
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrada, J.J.
This report compiles preliminary information that supports the premise that a repository is needed in Latin America and analyzes the nuclear situation (mainly in Argentina and Brazil) in terms of nuclear capabilities, inventories, and regional spent-fuel repositories. The report is based on several sources and summarizes (1) the nuclear capabilities in Latin America and establishes the framework for the need of a permanent repository, (2) the International Atomic Energy Agency (IAEA) approach for a regional spent-fuel repository and describes the support that international institutions are lending to this issue, (3) the current situation in Argentina in order to analyze the Argentinean willingness to find a location for a deep geological repository, and (4) the issues involved in selecting a location for the repository and identifies a potential location. This report then draws conclusions based on an analysis of this information. The focus of this report is mainly on spent fuel and does not elaborate on other radiological waste sources.
Development of the Centralized Storm Information System (CSIS) for use in severe weather prediction
NASA Technical Reports Server (NTRS)
Mosher, F. R.
1984-01-01
The centralized storm information system is now capable of ingesting and remapping radar scope presentations on a satellite projection. This can be color enhanced and superposed on other data types. Presentations from more than one radar can be composited on a single image. As with most other data sources, a simple macro establishes the loops and scheduling of the radar ingestions as well as the autodialing. There are approximately 60 NWS network 10 cm radars that can be interrogated. NSSFC forecasters have found this data source to be extremely helpful in severe weather situations. The capability to access lightning frequency data stored in a National Weather Service computer was added. Plans call for an interface with the National Meteorological Center to receive and display prognostic fields from operational computer forecast models. Programs are to be developed to plot and display locations of reported severe local storm events.
NASA Astrophysics Data System (ADS)
Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael
2015-04-01
We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries, and they must be addressed by integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations, each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamless (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access costs that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems. Geospatial capabilities for mapping, visualization, and spatial analysis will be important components of this new generation of Web-based interoperable information systems in the water domain. The purpose of this presentation is to increase the awareness of scientists, IT personnel and agency managers about the advantages offered by the combined use of Open Data, Open Specifications for geospatial and water-related data collection, storage and sharing, as well as mature FOSS projects for the creation of interoperable Web-based information systems in the water domain. A case study is used to illustrate how these principles and technologies can be integrated to create a system with the previously mentioned characteristics for managing and responding to flood events.
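Compliance with Open Specifications means, concretely, that any client can request, say, a flood-extent map from any server implementing the OGC WMS standard with the same parameters; a sketch of such a request, in which the endpoint and layer name are invented:

from urllib.parse import urlencode

params = {
    "service": "WMS", "version": "1.3.0", "request": "GetMap",
    "layers": "flood_extent", "styles": "",      # hypothetical layer
    "crs": "EPSG:4326",
    "bbox": "45.0,8.9,45.6,9.5",                 # lat/lon axis order per WMS 1.3.0 with EPSG:4326
    "width": "800", "height": "600",
    "format": "image/png",
}
print("https://example.org/geoserver/wms?" + urlencode(params))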
Business intelligence tools for radiology: creating a prototype model using open-source tools.
Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin
2010-04-01
Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the management of the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
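To make the data-warehouse idea concrete, here is a minimal star-schema sketch using Python's built-in sqlite3; the table and column names are illustrative assumptions, not the paper's actual schema:

import sqlite3

# Star schema: one fact table (exams) keyed to a dimension table (modality).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_modality (modality_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_exam (
    exam_id INTEGER PRIMARY KEY,
    modality_id INTEGER REFERENCES dim_modality(modality_id),
    exam_date TEXT,
    report_turnaround_hours REAL
);
""")
con.executemany("INSERT INTO dim_modality VALUES (?, ?)",
                [(1, "CT"), (2, "MR")])
con.executemany("INSERT INTO fact_exam VALUES (?, ?, ?, ?)",
                [(1, 1, "2010-04-01", 5.5),
                 (2, 1, "2010-04-01", 3.0),
                 (3, 2, "2010-04-02", 12.0)])

# A typical BI question: average report turnaround per modality.
for name, avg_h in con.execute("""
    SELECT m.name, AVG(f.report_turnaround_hours)
    FROM fact_exam f JOIN dim_modality m USING (modality_id)
    GROUP BY m.name"""):
    print(name, round(avg_h, 1), "hours")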
NASA Astrophysics Data System (ADS)
Schneider, D. J.
2009-12-01
The successful mitigation of volcanic hazards to aviation requires rapid interpretation and coordination of data from multiple sources, and communication of information products to a variety of end users. This community of information providers and information users includes volcano observatories, volcanic ash advisory centers, meteorological watch offices, air traffic control centers, airline dispatch and military flight operations centers, and pilots. Each of these entities has capabilities and needs that are unique to its situation and that evolve over a range of time spans. Prior to an eruption, information about probable eruption scenarios is needed to allow for contingency planning. Once a hazardous eruption begins, the immediate questions are where, when, how high, and how long will the eruption last? Following the initial detection of an eruption, the need for information shifts to forecasting the movement of the volcanic cloud, determining whether ground operations will be affected by ash fall, and estimating how long the drifting volcanic cloud will remain hazardous. A variety of tools have been developed and/or improved over the past several years that provide additional data sources on volcanic hazards pertinent to the aviation sector. These include seismic and pressure sensors, ground-based radar and lidar, web cameras, ash dispersion models, and more sensitive satellite sensors that are capable of better detecting volcanic ash, gases and aerosols. Along with these improved capabilities come increased challenges in rapidly assimilating the available data sources, which come from a variety of data providers. In this presentation, examples from the recent large eruptions of Okmok, Kasatochi, and Sarychev Peak volcanoes will be used to demonstrate the challenges faced by hazard response agencies. These eruptions produced volcanic clouds that were dispersed over large regions of the Northern Hemisphere and were observed by pilots and detected by various satellite sensors for several weeks. The disruption to aviation caused by these eruptions further emphasizes the need to improve the real-time characterization of volcanic clouds (altitude, composition, particle size, and concentration) and to better understand the impacts of volcanic ash, gases and aerosols on aircraft, flight crews, and passengers.
A Forest Fire Sensor Web Concept with UAVSAR
NASA Astrophysics Data System (ADS)
Lou, Y.; Chien, S.; Clark, D.; Doubleday, J.; Muellerschoen, R.; Zheng, Y.
2008-12-01
We developed a forest fire sensor web concept with a UAVSAR-based smart sensor and onboard automated response capability that allows us to monitor fire progression based on coarse initial information provided by an external source. This autonomous disturbance detection and monitoring system combines the unique capabilities of imaging radar with high-throughput onboard processing technology and onboard automated response capability based on specific science algorithms. In this forest fire sensor web scenario, a fire is initially located by MODIS/RapidFire or a ground-based fire observer. This information is transmitted to the UAVSAR onboard automated response system (CASPER). CASPER generates a flight plan to cover the alerted fire area and executes the flight plan. The onboard processor generates a fuel-load map from raw radar data which, combined with wind and elevation information, predicts the likely fire progression. CASPER then autonomously alters the flight plan to track the fire progression, providing this information to the fire-fighting team on the ground. We can also relay the precise fire location to other remote sensing assets with autonomous response capability, such as the hyperspectral imager on Earth Observing-1 (EO-1), to acquire additional fire data.
Visualizing and Validating Metadata Traceability within the CDISC Standards.
Hume, Sam; Sarnikar, Surendra; Becnel, Lauren; Bennett, Dorine
2017-01-01
The Food & Drug Administration has begun requiring that electronic submissions of regulated clinical studies utilize the Clinical Data Interchange Standards Consortium (CDISC) data standards. Within regulated clinical research, traceability is a requirement and indicates that analysis results can be traced back to the original source data. Current solutions for clinical research data traceability are limited in terms of querying, validation and visualization capabilities. This paper describes (1) the development of metadata models to support computable traceability and traceability visualizations that are compatible with industry data standards for the regulated clinical research domain, (2) the adaptation of graph traversal algorithms to make them capable of identifying traceability gaps and validating traceability across the clinical research data lifecycle, and (3) the development of a traceability query capability for retrieval and visualization of traceability information.
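As an illustration of the graph-traversal idea (not the paper's actual implementation), the sketch below models trace relationships as a directed graph and flags analysis results with no path back to a source dataset; the dataset and table names are hypothetical:

import networkx as nx

# Directed lineage graph: source datasets -> analysis datasets -> results.
G = nx.DiGraph()
G.add_edges_from([
    ("sdtm.LB", "adam.ADLB"),      # source dataset -> analysis dataset
    ("adam.ADLB", "table14.2.1"),  # analysis dataset -> result
])
G.add_node("table14.3.5")          # result with no recorded lineage

sources = {n for n, deg in G.in_degree() if deg == 0 and n.startswith("sdtm")}
results = [n for n in G.nodes if n.startswith("table")]

for r in results:
    # A traceability gap: no source dataset among the result's ancestors.
    if not nx.ancestors(G, r) & sources:
        print(f"traceability gap: {r} cannot be traced to a source dataset")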
Information consumerism on the World Wide Web: implications for dermatologists and patients.
Travers, Robin L
2002-09-01
The World Wide Web (WWW) is continuing to grow exponentially, both in the number of users and the number of web pages. There is a trend toward increasing use of the WWW for medical educational purposes, among physicians and patients alike. The multimedia capabilities of this evolving medium are particularly relevant to visual medical specialties such as dermatology. The origins of information consumerism on the WWW are examined; the public health issues surrounding dermatologic information and misinformation, and how consumers navigate the WWW, are reviewed. The economic realities of medical information as a "capital good," and the impact this has on dermatologic information sources on the WWW, are also discussed. Finally, strategies for guiding consumers and ourselves toward credible medical information sources on the WWW are outlined.
Strategic Stability: Contending Interpretations
2013-02-01
prospect of a simple fait accompli. With secure and reliable communications to command centers capable of obtaining the most up-to-date information...
NASA Technical Reports Server (NTRS)
Rentz, P. E.
1976-01-01
Experimental evaluations of the acoustical characteristics and the source sound power and directionality measurement capabilities of the NASA Lewis 9 x 15 foot low-speed wind tunnel in the untreated (hardwall) configuration were performed. The results indicate that source sound power estimates can be made using only settling chamber sound pressure measurements. The accuracy of these estimates, expressed as one standard deviation, can be improved from ±4 dB to ±1 dB if sound pressure measurements in the preparation room and diffuser are also used and source directivity information is utilized. A simple procedure is presented. Acceptably accurate measurements of source direct-field acoustic radiation were found to be limited by the test section reverberant characteristics to 3.0 feet for omni-directional and highly directional sources. Wind-on noise measurements in the test section, settling chamber and preparation room were found to depend on the sixth power of tunnel velocity. The levels were compared with various analytic models. Results are presented and discussed.
Theory of Maxwell's fish eye with mutually interacting sources and drains
NASA Astrophysics Data System (ADS)
Leonhardt, Ulf; Sahebdivan, Sahar
2015-11-01
Maxwell's fish eye is predicted to image with a resolution not limited by the wavelength of light. However, interactions between sources and drains may ruin the subwavelength imaging capabilities of this and similar absolute optical instruments. Nevertheless, as we show in this paper, at resonance frequencies of the device, an array of drains may resolve a single source, or alternatively, a single drain may scan an array of sources, no matter how narrowly spaced they are. It seems that near-field information can be obtained from far-field distances.
Information Communication using Knowledge Engine on Flood Issues
NASA Astrophysics Data System (ADS)
Demir, I.; Krajewski, W. F.
2012-04-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, short-term and seasonal flood forecasts, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and a limited general science background. To communicate more effectively with such an audience, we have introduced a new way to get information on flood-related issues in IFIS: instead of navigating among hundreds of features and interfaces of the information system and web-based sources, users receive dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges and to in-house data sources, analysis and visualization tools to answer questions grouped into several categories. Users can provide input for queries within the categories of rainfall, flood conditions, forecasts, inundation maps, flood risk and data sensors. Our goal is the systematization of knowledge on flood-related issues and a single source for definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone and to provide an educational geoinformatics tool. A future implementation of the system will accept free-form input and offer voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources.
Analysis of Ten Reverse Engineering Tools
NASA Astrophysics Data System (ADS)
Koskinen, Jussi; Lehmonen, Tero
Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially when maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools provide various kinds of capabilities to supply the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.
Intelligent Information Fusion in the Aviation Domain: A Semantic-Web based Approach
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Goforth, Andre
2005-01-01
Information fusion from multiple sources is a critical requirement for System Wide Information Management in the National Airspace System (NAS). NASA and the FAA envision creating an "integrated pool" of information originating from different sources, which users, intelligent agents and NAS decision support tools can tap into. In this paper we present the results of our initial investigations into the requirements for, and prototype development of, such an integrated information pool for the NAS. We have attempted to ascertain key requirements for such an integrated pool based on a survey of DSS tools that will benefit from it. We then advocate key technologies from computer science research areas such as the semantic web, information integration, and intelligent agents that we believe are well suited to achieving the envisioned system-wide information management capabilities.
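A minimal sketch of such an integrated pool, assuming a hypothetical namespace and properties rather than any actual NAS ontology: facts from two different feeds are loaded into one RDF graph and queried together with SPARQL.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

NAS = Namespace("http://example.org/nas#")  # hypothetical vocabulary
g = Graph()

g.add((NAS.FlightUA123, RDF.type, NAS.Flight))                 # flight-plan feed
g.add((NAS.FlightUA123, NAS.departsFrom, NAS.KSFO))
g.add((NAS.KSFO, NAS.hasWeather, Literal("fog, RVR 600 ft")))  # weather feed

# A decision-support tool can now join information across both feeds.
q = """
SELECT ?flight ?wx WHERE {
  ?flight ns:departsFrom ?airport .
  ?airport ns:hasWeather ?wx .
}"""
for flight, wx in g.query(q, initNs={"ns": NAS}):
    print(flight, "departs into:", wx)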
Evaluated teletherapy source library
Cox, Lawrence J.; Schach Von Wittenau, Alexis E.
2000-01-01
The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. ETSL is also capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.
ERIC Educational Resources Information Center
Thornburg, David D.
1986-01-01
Overview of the artificial intelligence (AI) field provides a definition; discusses past research and areas of future research; describes the design, functions, and capabilities of expert systems and the "Turing Test" for machine intelligence; and lists additional sources for information on artificial intelligence. Languages of AI are…
Information architecture for a planetary 'exploration web'
NASA Technical Reports Server (NTRS)
Lamarra, N.; McVittie, T.
2002-01-01
'Web services' is a common way of deploying distributed applications whose software components and data sources may be in different locations, formats, languages, etc. Although such collaboration is not utilized significantly in planetary exploration, we believe there is significant benefit in developing an architecture in which missions could leverage each other's capabilities. We believe that an incremental deployment of such an architecture could significantly contribute to the evolution of increasingly capable, efficient, and even autonomous remote exploration.
Data Mining and Information Technology: Its Impact on Intelligence Collection and Privacy Rights
2007-11-26
sources include: Cameras - Digital cameras (still and video) have been improving in capability while simultaneously dropping in cost at a rate...citizen is caught on camera 300 times each day. The power of extensive video coverage is magnified greatly by the nascent capability for voice and...software on security videos and tracking cell phone usage in the local area. However, it would only return the names and data of those who
INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorensek, M.; Hamm, L.; Garcia, H.
2011-07-18
Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.
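As a toy illustration of the inverse problem described above (inferring facility conditions from observations), the following sketch applies Bayes' rule to a hypothetical two-state facility; all probabilities are invented for illustration.

import numpy as np

states = ["declared operation", "undeclared operation"]
prior = np.array([0.95, 0.05])      # assumed prior over facility states

# Hypothetical likelihoods of observing an elevated stack-emission
# signature under each state.
likelihood = np.array([0.10, 0.80])

posterior = prior * likelihood      # Bayes' rule, unnormalized
posterior /= posterior.sum()
for s, p in zip(states, posterior):
    print(f"P({s} | observation) = {p:.2f}")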
An Integrated Nursing Management Information System: From Concept to Reality
Pinkley, Connie L.; Sommer, Patricia K.
1988-01-01
This paper addresses the transition from the conceptualization of a Nursing Management Information System (NMIS), integrated and interdependent with the Hospital Information System (HIS), to its realization. Concepts of input, throughput, and output are presented to illustrate developmental strategies used to achieve nursing information products. Essential processing capabilities include: 1) the ability to interact with multiple data sources; 2) database management, statistical, and graphics software packages; 3) online and batch reporting; and 4) interactive data analysis. Challenges encountered in system construction are examined.
Nuclear Forensics and Attribution: A National Laboratory Perspective
NASA Astrophysics Data System (ADS)
Hall, Howard L.
2008-04-01
Current capabilities in technical nuclear forensics - the extraction of information from nuclear and/or radiological materials to support the attribution of a nuclear incident to material sources, transit routes, and ultimately perpetrator identity - derive largely from three sources: nuclear weapons testing and surveillance programs of the Cold War, advances in analytical chemistry and materials characterization techniques, and abilities to perform "conventional" forensics (e.g., fingerprints) on radiologically contaminated items. Leveraging that scientific infrastructure has provided a baseline capability to the nation, but we are only beginning to explore the scientific challenges that stand between today's capabilities and tomorrow's requirements. These scientific challenges include radically rethinking radioanalytical chemistry approaches, developing rapidly deployable sampling and analysis systems for field applications, and improving analytical instrumentation. Coupled with the ability to measure a signature faster or more exquisitely, we must also develop the ability to interpret those signatures for meaning. This requires understanding of the physics and chemistry of nuclear materials processes well beyond our current level - especially since we are unlikely to ever have direct access to all potential sources of nuclear threat materials.
NASA Astrophysics Data System (ADS)
Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.
2009-05-01
Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.
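One common way to combine a physical sensor with a noisier human "soft sensor," in the spirit of the fusion experiments described above, is inverse-variance weighting; the sketch below uses invented numbers and is not the EEL's actual fusion algorithm.

import numpy as np

estimates = np.array([4.2, 5.0])   # sensor reading, human report (e.g., flood depth, ft)
variances = np.array([0.25, 1.0])  # human observers assumed noisier

# Weight each estimate by the inverse of its variance.
weights = (1 / variances) / np.sum(1 / variances)
fused = np.sum(weights * estimates)
fused_var = 1 / np.sum(1 / variances)
print(f"fused estimate = {fused:.2f} ± {np.sqrt(fused_var):.2f}")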
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter begins a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.
The Use of Empirical Data Sources in HRA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruce Hallbert; David Gertman; Julie Marble
This paper presents a review of available information related to human performance to support Human Reliability Analysis (HRA) performed for nuclear power plants (NPPs). A number of data sources are identified as potentially useful. These include NPP licensee event reports (LERs), augmented inspection team (AIT) reports, operator requalification data, results from the literature in experimental psychology, and the Aviation Safety Reporting System (ASRS). The paper discusses how utilizing such information improves our capability to model and quantify human performance. In particular, the paper discusses how information related to performance shaping factors (PSFs) can be extracted from empirical data to determine their size effect, their relative effects, and their interactions. The paper concludes that appropriate use of existing sources can help address some of the important issues we currently face in HRA.
DOD Manufacturing Arsenals: Actions Needed to Identify and Sustain Critical Capabilities
2015-11-01
to each develop their own unique method. A senior OSD official described the resulting process as unsound. Each manufacturing arsenal declared what...
PISCES The Commander’s Tool for an Effective Exit Strategy
2003-05-16
international “police-like” capability, including training in SWAT tactics and non-lethal weapons. This force would specialize in Preventive Intervention
NASA Astrophysics Data System (ADS)
Phipps, Marja; Lewis, Gina
2012-06-01
Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.
NASA Astrophysics Data System (ADS)
Zhang, Yanjun; Jiang, Li; Wang, Chunru
2015-07-01
A porous Sn@C nanocomposite was prepared via a facile hydrothermal method combined with a simple post-calcination process, using stannous octoate as the Sn source and glucose as the C source. The as-prepared Sn@C nanocomposite exhibited excellent electrochemical behavior with a high reversible capacity, long cycle life and good rate capability when used as an anode material for lithium ion batteries. Electronic supplementary information (ESI) available: detailed experimental procedure and additional characterization, including a Raman spectrum, TGA curve, N2 adsorption-desorption isotherm, TEM images and SEM images. See DOI: 10.1039/c5nr03093e
Creating Services for the Digital Library.
ERIC Educational Resources Information Center
Crane, Dennis J.
The terms "virtual library,""digital library," and "electronic library" have received growing attention among professional librarians, researchers, and users of information over the past decade. The confluence of exploding sources of data, expanding technical capability, and constrained time and money will quickly move these concepts from…
NASA Astrophysics Data System (ADS)
Ulla, A.; Manteiga, M.
2008-12-01
The third volume of "Lecture Notes and Essays in Astrophysics" highlights some important contributions of Spanish astrophysicists to planetology, solar and stellar physics, extragalactic astronomy, cosmology and astronomical instrumentation. After decades without a dedicated mission, Venus is again in fashion. Ricardo Hueso and collaborators, on the one hand, and Miguel Angel Lopez-Valverde, on the other, review the contribution of ESA's Venus Express to the understanding of the atmosphere of the neighbouring planet. Carme Jordi describes in a comprehensive essay the main observational calibration techniques and methods for the determination of the mass, radius, temperature, chemical composition and luminosity of a star. Dying stars are fundamental to understanding the nature of dark energy, probably the most fundamental problem in physics today. Type Ia supernovae played a fundamental role in showing the acceleration of the expansion rate of the Universe a decade ago. Inma Dominguez and collaborators go into detail on how knowledge of the fundamental physics of thermonuclear supernova explosions conditions their role as astrophysical candles.
Kudi: A free open-source python library for the analysis of properties along reaction paths.
Vogt-Geisse, Stefan
2016-05-01
With increasing computational capabilities, an ever-growing amount of data is generated in computational chemistry, containing a vast amount of chemically relevant information. It is therefore imperative to create new computational tools to process and extract this data in a sensible way. Kudi is an open-source library that aids in the extraction of chemical properties from reaction paths. The straightforward structure of Kudi makes it easy to use, allows effortless implementation of new capabilities, and permits extension to any quantum chemistry package. A use case for Kudi is shown for the tautomerization reaction of formic acid. Kudi is available free of charge at www.github.com/stvogt/kudi.
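As a generic illustration of analysing a property along a reaction path (this is not Kudi's actual API), the sketch below computes a barrier height from a synthetic energy profile along an intrinsic reaction coordinate:

import numpy as np

irc = np.linspace(-3.0, 3.0, 61)                    # reaction coordinate
energy = 25 * np.exp(-irc**2) - 10 * np.tanh(irc)   # synthetic profile, kcal/mol

energy -= energy[0]                 # energies relative to the reactant
ts_index = int(np.argmax(energy))   # transition-state position along the path
print(f"barrier height ≈ {energy[ts_index]:.1f} kcal/mol "
      f"at IRC = {irc[ts_index]:.2f}")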
Ground Software Maintenance Facility (GSMF) system manual
NASA Technical Reports Server (NTRS)
Derrig, D.; Griffith, G.
1986-01-01
The Ground Software Maintenance Facility (GSMF) is designed to support development and maintenance of Spacelab ground support software. The GSMF consists of a Perkin Elmer 3250 (host computer) and a MITRA 125s (ATE computer), with appropriate interface devices and software to simulate the Electrical Ground Support Equipment (EGSE). This document is presented in three sections: (1) GSMF overview; (2) software structure; and (3) fault isolation capability. The overview contains information on hardware and software organization along with their corresponding block diagrams. The software structure section describes the software's structure, including source files, link information, and database files. The fault isolation section describes the capabilities of the Ground Computer Interface Device, the Perkin Elmer host, and the MITRA ATE.
Cabarcos, Alba; Sanchez, Tamara; Seoane, Jose A; Aguiar-Pulido, Vanessa; Freire, Ana; Dorado, Julian; Pazos, Alejandro
2010-01-01
Medical practice today needs, at the patient point of care (POC), personalised knowledge that can be adjusted at any moment to the clinical needs of each patient, in order to support decision-making processes with personalised information. To achieve this, the hospital information systems must be adapted. Thus, there is a need for computational developments capable of retrieving and integrating the large amount of biomedical information available today, managing the complexity and diversity of these systems. Hence, this paper describes a prototype which retrieves biomedical information from different sources, manages it to improve the results obtained and to reduce response time and, finally, integrates it so that it is useful for the clinician, providing all the information available about the patient at the POC. Moreover, it also uses tools which allow medical staff to communicate and share knowledge.
Arctic Capability Inventory Tool User Guide: Version 2 (International References)
2011-07-01
drawn from the primary source documents. In cases where the analyst included additional information, the text is included in [square brackets]. The...following: FERP is the all-hazards plan for a coordinated federal response to emergencies. In most cases, departments manage emergencies with event...signed—(20) Andorra, Azerbaijan, Ecuador, Eritrea, Israel, Kazakhstan, Kyrgyzstan, Peru, San Marino, Syria, Tajikistan, Timor-Leste, Turkey
2015-04-30
from the MIT Sloan School that provide a relative complexity score for functions (Product and Context Complexity). The PMA assesses the complexity...
Systematic errors in strong lens modeling
NASA Astrophysics Data System (ADS)
Johnson, Traci L.; Sharon, Keren; Bayliss, Matthew B.
We investigate how varying the number of multiple-image constraints and the available redshift information influences the systematic errors of strong lens models, specifically the image predictability, mass distribution, and magnifications of background sources. This work will inform not only Frontier Fields science but also work on the growing collection of strong-lensing galaxy clusters, most of which are less massive and are capable of lensing a handful of galaxies.
Looking at Earth observation impacts with fresh eyes: a Landsat example
NASA Astrophysics Data System (ADS)
Wu, Zhuoting; Snyder, Greg; Quirk, Bruce; Stensaas, Greg; Vadnais, Carolyn; Babcock, Michael; Dale, Erin; Doucette, Peter
2016-05-01
The U.S. Geological Survey (USGS) initiated the Requirements, Capabilities and Analysis for Earth Observations (RCA-EO) activity in the Land Remote Sensing (LRS) program to provide a structured approach to collect, store, maintain, and analyze user requirements and Earth observing system capabilities information. RCA-EO enables the collection of information on current key Earth observation products, services, and projects, and their evaluation at different organizational levels within an agency, in terms of how reliant they are on Earth observation data from all sources, including spaceborne, airborne, and ground-based platforms. Within the USGS, RCA-EO has engaged over 500 subject matter experts in this assessment, and evaluated the impacts of more than 1000 different Earth observing data sources on 345 key USGS products and services. This paper summarizes Landsat impacts at various levels of the organizational structure of the USGS and highlights the feedback of the subject matter experts regarding Landsat data and Landsat-derived products. This feedback is expected to inform future Landsat mission decision making. The RCA-EO approach can be applied in a much broader scope to derive comprehensive knowledge of Earth observing system usage and impacts, to inform product and service development and remote sensing technology innovation beyond the USGS.
NASA Technical Reports Server (NTRS)
Morenoff, J.; Roth, D. L.; Singleton, J. W.
1972-01-01
The study to develop, implement, and maintain a space law library and information system is summarized. The survey plan; major interviews with individuals representative of potential sources, users and producers of information related to aerospace law; and system trade-off analyses are discussed along with the NASA/RECON system capability. The NASA publications of STAR and IAA are described, and the NASA legal micro-thesaurus is included.
Building Knowledge Graphs for NASA's Earth Science Enterprise
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, T. J.; Ramachandran, R.; Shi, R.; Bao, Q.; Gatlin, P. N.; Weigel, A. M.; Maskey, M.; Miller, J. J.
2016-12-01
Inspired by the Google Knowledge Graph, we have been building a prototype knowledge graph for Earth scientists, connecting information and data across NASA's Earth science enterprise. Our primary goal is to advance the state of the art in NASA knowledge extraction capability by going beyond traditional catalog search and linking different kinds of distributed information (such as data, publications, services, tools and people). This will enable a more efficient pathway to knowledge discovery. While the Google Knowledge Graph provides impressive semantic search and aggregation capabilities, it is limited to topics of interest to the general public. We use a similar knowledge-graph approach to semantically link information gathered from a wide variety of sources within the NASA Earth science enterprise. Our prototype serves as a proof of concept of the viability of building an operational "knowledge base" system for NASA Earth science. Information is pulled from structured sources (such as the NASA CMR catalog, GCMD, and Climate and Forecast Conventions) and unstructured sources (such as research papers). Leveraging modern techniques of machine learning, information retrieval, and deep learning, we provide an integrated data mining and information discovery environment to help Earth scientists use the best data, tools, methodologies, and models available to answer a hypothesis. Our knowledge graph would be able to answer questions like: Which articles discuss topics investigating similar hypotheses? How have these methods been tested for accuracy? Which approaches have been highly cited within the scientific community? What variables were used for this method and what datasets were used to represent them? What processing was necessary to use this data? These questions then lead researchers and citizen scientists to the sources where data can be found, available user guides, information on how the data was acquired, and available tools and models to use with this data. As a proof of concept, we focus on a well-defined domain - hurricane science - linking research articles and their findings, data, people and tools/services. Modern information retrieval, natural language processing, machine learning and deep learning techniques are applied to build the knowledge network.
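A minimal sketch of the kind of linking such a prototype performs, using an in-memory property graph; all node names and edge labels are hypothetical, not the actual NASA graph schema:

import networkx as nx

kg = nx.MultiDiGraph()
kg.add_edge("paper:Smith2015", "dataset:TRMM_3B42", key="uses")
kg.add_edge("paper:Smith2015", "topic:hurricane_intensity", key="discusses")
kg.add_edge("dataset:TRMM_3B42", "tool:Giovanni", key="accessible_via")

# "Which datasets support work on hurricane intensity?"
papers = [u for u, v, k in kg.edges(keys=True)
          if v == "topic:hurricane_intensity" and k == "discusses"]
for p in papers:
    for _, target, k in kg.out_edges(p, keys=True):
        if k == "uses":
            print(p, "->", target)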
Automated detection of extended sources in radio maps: progress from the SCORPIO survey
NASA Astrophysics Data System (ADS)
Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.
2016-08-01
Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
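For illustration, the sketch below implements a drastically simplified version of the same pipeline idea (denoise, threshold above the estimated noise, segment connected regions); it is not CAESAR itself, and the image is synthetic:

import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(0)
image = rng.normal(0.0, 1.0, (256, 256))   # background noise
image[100:140, 80:160] += 4.0              # fake extended source

smoothed = filters.gaussian(image, sigma=3)                         # denoise
noise = np.median(np.abs(smoothed - np.median(smoothed))) * 1.4826  # MAD sigma
mask = smoothed > 5 * noise                                         # threshold

labels = measure.label(mask)               # connected-component segmentation
for region in measure.regionprops(labels, intensity_image=image):
    if region.area > 50:                   # keep extended islands only
        print(f"source at {region.centroid}, area={region.area} px")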
NASA Astrophysics Data System (ADS)
Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido
2017-04-01
Source Apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contributions to pollution levels. This task can be accomplished using different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and are therefore considered a reference for the urban-background levels of primary sources. Chemical transport models provide better estimates of secondary (inorganic) pollutants and can deliver gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions to be used in reporting to the Commission and in the development of pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (receptor models) and chemical transport models (source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models perform well when evaluated against their respective references. Both types of models estimate the yearly source contributions quite satisfactorily, while the estimation of source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contributions of some single sources when compared to receptor models. For receptor models the most critical source category is industry, probably due to the variety of single sources with different characteristics that belong to this category. Dust is the most problematic source for chemical transport models, likely due to the poor information about this kind of source in the emission inventories, particularly concerning road-dust re-suspension, and consequently the little detail about the chemical components of this source used in the models. Sensitivity tests show that chemical transport models perform better when resolving a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions, and that insufficient spatial resolution in urban areas can limit the capability of models to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second most contributing categories, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the online DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).
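As a toy illustration of the receptor-model side of the comparison, the sketch below applies non-negative matrix factorization (the mathematical core of PMF-style receptor models) to synthetic species concentrations; it is not the JRC methodology itself, and all data are random placeholders:

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
true_profiles = rng.random((3, 8))    # 3 sources, 8 chemical species
contributions = rng.random((120, 3))  # 120 daily samples
X = contributions @ true_profiles + 0.01 * rng.random((120, 8))

# Factorize samples-by-species matrix into contributions (G) and profiles (F).
model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)            # estimated source contributions
F = model.components_                 # estimated source profiles
print("mean contribution per source:", G.mean(axis=0).round(3))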
Region 3 - EPA is performing market research to determine if industry has the capability and capacity to perform the work, on a national level, as described in the attached draft Statement of Work/Performance Work Statement (SOW/PWS).
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard; Day, John H. (Technical Monitor)
2001-01-01
This report will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing the use of Root-Sum-Square calculations for digital delays.
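Root-Sum-Square (RSS) delay analysis, the topic of this issue's note, combines independent timing variations statistically instead of summing worst-case corners. A minimal sketch with illustrative delay values (not taken from the column):

import math

delays_ns = [1.8, 0.9, 2.4, 1.1]          # nominal path segment delays
tolerances_ns = [0.20, 0.10, 0.30, 0.15]  # independent 1-sigma variations

nominal = sum(delays_ns)
worst_case = nominal + sum(tolerances_ns)                     # all corners align
rss = nominal + math.sqrt(sum(t**2 for t in tolerances_ns))   # statistical sum

print(f"nominal {nominal:.2f} ns, worst-case {worst_case:.2f} ns, "
      f"RSS {rss:.2f} ns")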
As many water utilities are seeking new and innovative rehabilitation technologies to extend the life of their water distribution systems, information on the capabilities and applicability of new technologies is not always readily available from an independent source. The U.S. E...
Personal Computing and Academic Library Design.
ERIC Educational Resources Information Center
Bazillion, Richard J.
1992-01-01
Notebook computers of increasing power and portability offer unique advantages to library users. Connecting easily to a campus data network, they are small, silent workstations capable of drawing information from a variety of sources. Designers of new library buildings may assume that users in growing numbers will carry these multipurpose…
The role of 3-D interactive visualization in blind surveys of H I in galaxies
NASA Astrophysics Data System (ADS)
Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.
2015-09-01
Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control of an automated source-finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.
Landing Force Organizational Systems Study (LFOSS).
1979-01-01
moving, and field fortifications. A heavy crawler tractor is required for large earth-moving missions in tough soil conditions where the smaller tractor is...current T/O's and T/E's. The primary sources for each item of equipment were the work directive, the required operational capability (ROC), and the...developments have not been used as a source of information because of the inability to determine the specific project impact prior to its entrance
The Mock LISA Data Challenge Round 3: New and Improved Sources
NASA Technical Reports Server (NTRS)
Baker, John
2008-01-01
The Mock LISA Data Challenges are a program to demonstrate and encourage the development of data-analysis capabilities for LISA. Each round of challenges consists of several data sets containing simulated instrument noise and gravitational waves from sources of undisclosed parameters. Participants are asked to analyze the data sets and report the maximum information they can infer about the source parameters. The challenges are being released in rounds of increasing complexity and realism. Challenge 3, currently in progress, brings new source classes, now including cosmic-string cusps and primordial stochastic backgrounds, and more realistic signal models for supermassive black-hole inspirals and galactic double white-dwarf binaries.
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
2015-05-01
effort on an unsound acquisition footing and pursuing a kill vehicle that may not be the best solution to meet the warfighter’s needs within cost...
2008-03-15
information ensures timely payment of entitlements and forgoes receipt of mutually exclusive payments. This depth of information supplies visibility and...reporting capability, and integration with authoritative data sources such as FPDS-NG, CCR, and contractor companies to improve data quality and reduce...manual entry requirements in Q2 FY09. • Continue to implement in theater, focusing on contingency contracts for private security companies and
NASA Astrophysics Data System (ADS)
Turner, Alexander J.; Jacob, Daniel J.; Benmergui, Joshua; Brandman, Jeremy; White, Laurent; Randles, Cynthia A.
2018-06-01
Anthropogenic methane emissions originate from a large number of fine-scale and often transient point sources. Satellite observations of atmospheric methane columns are an attractive approach for monitoring these emissions but have limitations from instrument precision, pixel resolution, and measurement frequency. Dense observations will soon be available in both low-Earth and geostationary orbits, but the extent to which they can provide fine-scale information on methane sources has yet to be explored. Here we present an observation system simulation experiment (OSSE) to assess the capabilities of different satellite observing system configurations. We conduct a 1-week WRF-STILT simulation to generate methane column footprints at 1.3 × 1.3 km² spatial resolution and hourly temporal resolution over a 290 × 235 km² domain in the Barnett Shale, a major oil and gas field in Texas with a large number of point sources. We sub-sample these footprints to match the observing characteristics of the recently launched TROPOMI instrument (7 × 7 km² pixels, 11 ppb precision, daily frequency), the planned GeoCARB instrument (2.7 × 3.0 km² pixels, 4 ppb precision, nominally twice-daily frequency), and other proposed observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its eigenvalues. We find that a week of TROPOMI observations should provide information on temporally invariant emissions at ˜30 km spatial resolution. GeoCARB should provide information on temporally invariant emissions at ˜2-7 km spatial resolution, depending on sampling frequency (hourly to daily). Improvements to instrument precision yield greater increases in information content than improved sampling frequency. A precision better than 6 ppb is critical for GeoCARB to achieve fine resolution of emissions. Transient emissions would be missed by either TROPOMI or GeoCARB. An aspirational high-resolution geostationary instrument with 1.3 × 1.3 km² pixel resolution, hourly return time, and 1 ppb precision would effectively constrain the temporally invariant emissions in the Barnett Shale at the kilometer scale and provide some information on the hourly variability of sources.
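The information metric described above can be sketched as follows; the Jacobian and noise level here are synthetic placeholders rather than the actual WRF-STILT footprints, and a unit prior covariance is assumed so that eigenvalues above 1 indicate modes constrained beyond the prior:

import numpy as np

rng = np.random.default_rng(42)
n_obs, n_state = 500, 100
K = rng.random((n_obs, n_state))   # synthetic footprint sensitivities
sigma_obs = 11.0                   # ppb, TROPOMI-like column precision

F = K.T @ K / sigma_obs**2         # Fisher information matrix for y = Kx + noise
eigvals = np.linalg.eigvalsh(F)
resolved = int(np.sum(eigvals > 1.0))
print(f"{resolved} of {n_state} state-vector modes constrained")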
Definition, Capabilities, and Components of a Terrestrial Carbon Monitoring System
NASA Technical Reports Server (NTRS)
West, Tristram O.; Brown, Molly E.; Duren, Riley M.; Ogle, Stephen M.; Moss, Richard H.
2013-01-01
Research efforts for effectively and consistently monitoring terrestrial carbon are increasing in number. As such, there is a need to define carbon monitoring and how it relates to carbon cycle science and carbon management. There is also a need to identify capabilities of a carbon monitoring system and the system components needed to develop the capabilities. Capabilities that enable the effective application of a carbon monitoring system for monitoring and management purposes may include: reconciling carbon stocks and fluxes, developing consistency across spatial and temporal scales, tracking horizontal movement of carbon, attribution of emissions to originating sources, cross-sectoral accounting, uncertainty quantification, redundancy and policy relevance. Focused research is needed to integrate these capabilities for sustained estimates of carbon stocks and fluxes. Additionally, if monitoring is intended to inform management decisions, management priorities should be considered prior to development of a monitoring system.
Setting priorities for research on pollution reduction functions of agricultural buffers.
Dosskey, Michael G
2002-11-01
The success of buffer installation initiatives and programs to reduce nonpoint-source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty for predicting level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.
Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna
2018-01-01
Background Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolis separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. Results Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
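The ARGO step can be pictured as a penalized regression on lagged flu activity plus exogenous Internet signals. The sketch below, with synthetic series and arbitrary lag and penalty choices, is an assumption-laden illustration rather than the authors' configuration.

```python
# Hedged sketch of an ARGO-style nowcast: LASSO regression on autoregressive
# flu-activity lags plus exogenous Internet-based predictors (all synthetic).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
T = 200
flu = np.sin(np.linspace(0, 12, T)) + 0.1 * rng.standard_normal(T)  # BPHC-like signal
searches = flu + 0.3 * rng.standard_normal(T)   # noisy search-volume proxy
tweets = flu + 0.5 * rng.standard_normal(T)     # noisier Twitter proxy

n_lags = 4
rows, targets = [], []
for t in range(n_lags, T - 1):
    rows.append(np.r_[flu[t - n_lags:t], searches[t], tweets[t]])
    targets.append(flu[t + 1])                  # forecast 1 week ahead
X, y = np.array(rows), np.array(targets)

model = Lasso(alpha=0.01).fit(X[:-20], y[:-20])  # hold out the last 20 weeks
print("holdout correlation:",
      np.corrcoef(model.predict(X[-20:]), y[-20:])[0, 1].round(3))
```

The L1 penalty keeps only the predictors that carry signal, which is what lets a single model absorb heterogeneous data sources without overfitting.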
Intersectoral interagency partnerships to promote financial capability in older people.
Hean, Sarah; Fenge, Lee Ann; Worswick, Louise; Wilkinson, Charlie; Fearnley, Stella
2012-09-01
From the second quarter of 2008, the UK economy entered a period of economic decline. Older people are particularly vulnerable during these times. To promote ways in which older people can be better supported to maintain their financial well-being, this study explored the sources older people utilize to keep themselves financially informed. Interviews with older people (n = 28) showed that older people access trusted sources of information (e.g. healthcare professionals) rather than specialist financial information providers (e.g. financial advisors) which highlighted the need for interagency working between financial services in the private, public and voluntary sectors. An example of how such interagency partnerships might be achieved in practice is presented with some recommendations on directions for future research into interagency working that spans public, private and voluntary sectors.
Data Shaping in the Cultural Simulation Modeler Integrated Behavioral Assessment Capability. Phase I
2007-07-01
articles that appeared in global media in the years 1999-2006. The articles were all open source information and were obtained in part through an...agreement between Factiva Dow Jones and the NRL for this project, and in part collected by IndaSea from the Open Source Center database and a variety of...This view implied that a system geared to assist analysts should be open and completely dynamic. It is IndaSea’s perspective that there are advantages
Insects and Spiders. Environmental Education Curriculum.
ERIC Educational Resources Information Center
Topeka Public Schools, KS.
This unit is designed to provide information on insects and spiders that special education students are capable of understanding. The activities are aimed at level 2 and level 3 educable mentally retarded classes. There are four topics: (1) Characteristics and Life Cycles of Insects; (2) Characteristics of Spiders; (3) Habitats and Food Sources of…
Research Use by Cooperative Extension Educators in New York State
ERIC Educational Resources Information Center
Hamilton, Stephen F.; Chen, Emily K.; Pillemer, Karl; Meador, Rhoda H.
2013-01-01
A Web-based survey of 388 off-campus Cornell Extension educators in New York State examined their attitudes toward research, sources of research-based information, knowledge and beliefs about evidence-based programs, and involvement in research activities. Strong consensus emerged that research is central and that educators are capable of reading…
2007-03-01
features Federated Search (providing services to find and aggregate information across GIG enterprise data sources); Enterprise Catalog (providing... Associated deliverables include the NCES Content Discovery Federated Search Portlet Users Guide v0.4.3 (M16, 25-Apr-05) and the NCES Mediation Core Enterprise Services SDK v0.5.0 (M17, 25-Apr-05).
White-Light Optical Information Processing and Holography.
1983-05-03
Processing, White-Light Holography, Image Subtraction, Image Deblurring, Coherence Requirement, Apparent Transfer Function, Source Encoding, Signal...in this period, also demonstrated several color image processing capabilities. Among those are broadband color image deblurring and color image subtraction. Report sections cover Broadband Image Deblurring, Color Image Subtraction, and Rainbow Holographic Aberrations.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
... covered apparatus may convey from the source device to the consumer equipment the information necessary to... using Internet protocol ("IP") and apparatus used by consumers to view video programming. The action...-delivered video programming and rules governing the closed captioning capabilities of certain apparatus on...
Quantity and unit extraction for scientific and technical intelligence analysis
NASA Astrophysics Data System (ADS)
David, Peter; Hawes, Timothy
2017-05-01
Scientific and Technical (S&T) intelligence analysts consume huge amounts of data to understand how scientific progress and engineering efforts affect current and future military capabilities. One of the most important types of information S&T analysts exploit is the quantities discussed in their source material. Frequencies, ranges, size, weight, power, and numerous other properties and measurements describing the performance characteristics of systems and the engineering constraints that define them must be culled from source documents before quantified analysis can begin. Automating the process of finding and extracting the relevant quantities from a wide range of S&T documents is difficult because information about quantities and their units is often contained in unstructured text with ad hoc conventions used to convey their meaning. Currently, even a simple task such as searching for documents discussing RF frequencies in a band of interest is a labor-intensive and error-prone process. This research addresses the challenges facing development of a document processing capability that extracts quantities and units from S&T data, and describes how Natural Language Processing algorithms can be used to overcome these challenges.
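As a concrete illustration of the extraction task, the following sketch pulls frequency quantities and ranges out of free text with a regular expression and normalizes them to a common unit. A production S&T pipeline would add tokenization, a fuller unit ontology, and context handling, so treat the pattern and unit table as simplified assumptions.

```python
# Minimal quantity-and-unit extraction with regular expressions.
import re

UNIT_SCALE = {"hz": 1.0, "khz": 1e3, "mhz": 1e6, "ghz": 1e9}
PATTERN = re.compile(
    r"(?P<value>\d+(?:\.\d+)?)\s*(?:-|to)?\s*(?P<value2>\d+(?:\.\d+)?)?\s*"
    r"(?P<unit>GHz|MHz|kHz|Hz)", re.IGNORECASE)

def extract_frequencies(text):
    """Return frequencies (or ranges) normalized to Hz."""
    out = []
    for m in PATTERN.finditer(text):
        scale = UNIT_SCALE[m.group("unit").lower()]
        lo = float(m.group("value")) * scale
        hi = float(m.group("value2")) * scale if m.group("value2") else lo
        out.append((lo, hi))
    return out

print(extract_frequencies("The radar operates at 2.7 to 3.1 GHz with a 455 kHz IF."))
# -> [(2700000000.0, 3100000000.0), (455000.0, 455000.0)]
```

Normalizing everything to Hz is what makes band-of-interest searches a numeric comparison instead of a string match.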
NASA Astrophysics Data System (ADS)
Erickson, M.; Olaguer, J.; Wijesinghe, A.; Colvin, J.; Neish, B.; Williams, J.
2014-12-01
It is becoming increasingly important to understand the emissions and health effects of industrial facilities. Many areas have no or limited sustained monitoring capabilities, making it difficult to quantify the major pollution sources affecting human health, especially in fence line communities. Developments in real-time monitoring and micro-scale modeling offer unique ways to tackle these complex issues. This presentation will demonstrate the capability of coupling real-time observations with micro-scale modeling to provide real-time information and near real-time source attribution. The Houston Advanced Research Center constructed the Mobile Acquisition of Real-time Concentrations (MARC) laboratory. MARC consists of a Ford E-350 passenger van outfitted with a Proton Transfer Reaction Mass Spectrometer (PTR-MS) and meteorological equipment. This allows for the fast measurement of various VOCs important to air quality. The data recorded from the van is uploaded to an off-site database and the information is broadcast to a website in real-time. This provides for off-site monitoring of MARC's observations, which allows off-site personnel to provide immediate input to the MARC operators on how to best achieve project objectives. The information stored in the database can also be used to provide near real-time source attribution. An inverse model has been used to ascertain the amount, location, and timing of emissions based on MARC measurements in the vicinity of industrial sites. The inverse model is based on a 3D micro-scale Eulerian forward and adjoint air quality model known as the HARC model. The HARC model uses output from the Quick Urban and Industrial Complex (QUIC) wind model and requires a 3D digital model of the monitored facility based on lidar or industrial permit data. MARC is one of the instrument platforms deployed during the 2014 Benzene and other Toxics Exposure Study (BEE-TEX) in Houston, TX. The main goal of the study is to quantify and explain the origin of ambient exposure to hazardous air pollutants in an industrial fence line community near the Houston Ship Channel. Preliminary results derived from analysis of MARC observations during the BEE-TEX experiment will be presented.
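The source-attribution step can be caricatured as a linear inversion: given a source-receptor matrix relating candidate emission rates to concentrations along the van's track, solve for the non-negative rates that best explain the measurements. This stand-in uses ordinary non-negative least squares rather than the HARC adjoint machinery, and all matrices are synthetic.

```python
# Simplified analogue (an assumption, not the HARC model) of near real-time
# source attribution from mobile concentration measurements.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_receptors, n_sources = 120, 6
H = rng.exponential(0.2, size=(n_receptors, n_sources))  # toy transport operator
true_rates = np.array([0.0, 4.0, 0.0, 1.5, 0.0, 0.0])    # only two active sources
y = H @ true_rates + 0.05 * rng.standard_normal(n_receptors)  # PTR-MS-like data

rates, residual = nnls(H, y)   # non-negativity mimics physical emission rates
print("retrieved rates:", rates.round(2))
```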
MISSE in the Materials and Processes Technical Information System (MAPTIS )
NASA Technical Reports Server (NTRS)
Burns, DeWitt; Finckenor, Miria; Henrie, Ben
2013-01-01
Materials International Space Station Experiment (MISSE) data is now being collected and distributed through the Materials and Processes Technical Information System (MAPTIS) at Marshall Space Flight Center in Huntsville, Alabama. MISSE data has been instrumental in many programs and continues to be an important source of data for the space community. To facilitate greater access to the MISSE data, the International Space Station (ISS) program office and MAPTIS are working to gather this data into a central location. The MISSE database contains information about materials, samples, and flights, along with pictures, PDFs, Excel files, Word documents, and other file types. Major capabilities of the system are access control, browsing, searching, reports, and record comparison. The search capability looks within any searchable file, so data can be retrieved even if the desired metadata has not been associated with it. Other functionality will continue to be added to the MISSE database as the Athena Platform is expanded.
Spatial DBMS Architecture for a Free and Open Source BIM
NASA Astrophysics Data System (ADS)
Logothetis, S.; Valari, E.; Karachaliou, E.; Stylianidis, E.
2017-08-01
Recent research in the field of Building Information Modelling (BIM) technology revealed that, except for a few accessible and free BIM viewers, there is a lack of Free & Open Source Software (FOSS) covering the complete BIM process. With this in mind, and considering BIM as the technological advancement of Computer-Aided Design (CAD) systems, the current work proposes the use of a FOSS CAD software in order to extend its capabilities and transform it gradually into a FOSS BIM platform. Towards this undertaking, a first approach to developing a spatial Database Management System (DBMS) able to store, organize and manage the overall amount of information within a single application is presented.
Empirical study of fuzzy compatibility measures and aggregation operators
NASA Astrophysics Data System (ADS)
Cross, Valerie V.; Sudkamp, Thomas A.
1992-02-01
Two fundamental requirements for the generation of support using incomplete and imprecise information are the ability to measure the compatibility of discriminatory information with domain knowledge and the ability to fuse information obtained from disparate sources. A generic architecture utilizing the generalized fuzzy relational database model has been developed to empirically investigate the support generation capabilities of various compatibility measures and aggregation operators. This paper examines the effectiveness of combinations of compatibility measures from the set-theoretic, geometric distance, and logic-based classes paired with t-norm and generalized mean families of aggregation operators.
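To make the two ingredients concrete, the sketch below pairs one set-theoretic compatibility measure with a product t-norm and a generalized mean. The function names and parameter choices are illustrative assumptions, not the paper's specific operator families.

```python
# Illustrative pairing of a fuzzy compatibility measure with two aggregation
# operators used to fuse scores from disparate sources.
import numpy as np

def set_theoretic_compatibility(a, b):
    """Jaccard-style ratio of intersection to union of two fuzzy sets."""
    return np.minimum(a, b).sum() / np.maximum(a, b).sum()

def t_norm_product(scores):
    """Product t-norm: conjunctive (pessimistic) fusion."""
    return float(np.prod(scores))

def generalized_mean(scores, p=0.5):
    """Generalized mean; p sweeps between min-like and max-like behavior."""
    s = np.asarray(scores, dtype=float)
    return float((np.mean(s**p))**(1.0 / p))

# Membership functions sampled on a common domain:
a = np.array([0.0, 0.3, 0.8, 1.0, 0.6, 0.1])
b = np.array([0.1, 0.4, 0.7, 0.9, 0.4, 0.0])
c = set_theoretic_compatibility(a, b)
print("compatibility:", round(c, 3))
print("fused:", t_norm_product([c, 0.9]), generalized_mean([c, 0.9]))
```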
Krakow conference on low emissions sources: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, B.L.; Butcher, T.A.
1995-12-31
The Krakow Conference on Low Emission Sources presented the information produced and analytical tools developed in the first phase of the Krakow Clean Fossil Fuels and Energy Efficiency Program. This phase included: field testing to provide quantitative data on emissions and efficiencies as well as on opportunities for building energy conservation; engineering analysis to determine the costs of implementing pollution control; and incentives analysis to identify actions required to create a market for equipment, fuels, and services needed to reduce pollution. Collectively, these Proceedings contain reports that summarize the above phase one information, present the status of energy system management in Krakow, provide information on financing pollution control projects in Krakow and elsewhere, and highlight the capabilities and technologies of Polish and American companies that are working to reduce pollution from low emission sources. It is intended that the US reader will find in these Proceedings useful results and plans for control of pollution from low emission sources that are representative of heating systems in central and Eastern Europe. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
2003-02-01
and highly intelligent aircraft. A major tenet of this discussion will be that robust information sources provided by the PHM system can and will be...and maintainable (R+M) designed intelligent aircraft which encompasses a comprehensive Prognostics and Health Management (PHM) capability to enhance...hydraulic pump has a 90% chance of failing within the next 10 flight hours. This way, maintenance personnel will be able to make intelligent, informed
Advanced techniques for the storage and use of very large, heterogeneous spatial databases
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1987-01-01
Progress is reported in the development of a prototype knowledge-based geographic information system. The overall purpose of this project is to investigate and demonstrate the use of advanced methods in order to greatly improve the capabilities of geographic information system technology in the handling of large, multi-source collections of spatial data in an efficient manner, and to make these collections of data more accessible and usable for the Earth scientist.
Blood Irradiator Interactive Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howington, John; Potter, Charles; DeGroff, Tavias
The “Blood Irradiator Interactive Tool” compares a typical Cs-137 blood irradiator with the capabilities of an average X-ray irradiator. It is designed to inform the user about the potential capabilities that an average X-ray irradiator could offer them. Specifically, the tool compares the number of blood bags that can be irradiated by the user's machine with that of the average X-ray capability. It also forecasts the amount of blood that can be irradiated on a yearly basis for both the user's machine and an average X-ray device. The average X-ray capabilities are taken from the three X-ray devices currently on the market: the RS 3400 Rad Source X-ray Blood Irradiator and both the 2.0 L and 3.5 L versions of the Best Theratronics Raycell MK2.
Distributed Data Integration Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Critchlow, T; Ludaescher, B; Vouk, M
The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that could enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. To address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure, utilizing (1) abstract workflow definition, construction, and automatic deployment, (2) complex agent-based workflow execution, and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.
Fast massive preventive security and information communication systems
NASA Astrophysics Data System (ADS)
Akopian, David; Chen, Philip; Miryakar, Susheel; Kumar, Abhinav
2008-04-01
We present a fast, massive information communication system for data collection from distributed sources such as cell phone users. One important application is preventive notification systems, where timely notification and evidence communication may help to improve safety and security through wide public involvement, by ensuring easy-to-access and easy-to-communicate information systems. The technology significantly simplifies the response to events and will help, for example, special agencies to gather crucial information in time and respond as quickly as possible. Cellular phones are nowadays affordable for most residents and have become a common personal accessory. The paper describes several ways to design such systems, including the existing internet access capabilities of cell phones and downloadable specialized software, and provides examples of such designs. The main idea is to structure information in a predetermined way and communicate data through a centralized gate-server, which automatically processes the information and forwards it to the proper destination. The gate-server eliminates the need to know contact data and the specific local community infrastructure. All cell phones will have self-localizing capability under the FCC E911 mandate, so the communicated information can be automatically tagged with location and time.
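A minimal sketch of the "predetermined structure plus gate-server" idea: the handset packages a report as fixed key-value fields and posts it to a central gateway, which routes on the category field. The endpoint URL and field names below are hypothetical.

```python
# Hypothetical structured report posted to a central gate-server.
import json
import urllib.request

def send_report(category, description, lat, lon, timestamp):
    report = {
        "category": category,        # routing key used by the gate-server
        "description": description,
        "location": {"lat": lat, "lon": lon},   # E911-style auto-localization
        "timestamp": timestamp,
    }
    req = urllib.request.Request(
        "https://gateserver.example.org/report",  # hypothetical endpoint
        data=json.dumps(report).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:   # gateway forwards to an agency
        return resp.status

# send_report("suspicious_package", "unattended bag, platform 2",
#             29.76, -95.37, "2008-04-01T12:30:00Z")
```

Because the structure is fixed, the gateway can parse and forward reports without human triage, which is the source of the claimed speed-up.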
Conceptualizing and assessing improvement capability: a review
Boaden, Ruth; Walshe, Kieran
2017-01-01
Abstract Purpose The literature is reviewed to examine how ‘improvement capability’ is conceptualized and assessed and to identify future areas for research. Data sources An iterative and systematic search of the literature was carried out across all sectors including healthcare. The search was limited to literature written in English. Data extraction The study identifies and analyses 70 instruments and frameworks for assessing or measuring improvement capability. Information about the source of the instruments, the sectors in which they were developed or used, the measurement constructs or domains they employ, and how they were tested was extracted. Results of data synthesis The instruments and framework constructs are very heterogeneous, demonstrating the ambiguity of improvement capability as a concept, and the difficulties involved in its operationalisation. Two-thirds of the instruments and frameworks have been subject to tests of reliability and half to tests of validity. Many instruments have little apparent theoretical basis and do not seem to have been used widely. Conclusion The assessment and development of improvement capability needs clearer and more consistent conceptual and terminological definition, used consistently across disciplines and sectors. There is scope to learn from existing instruments and frameworks, and this study proposes a synthetic framework of eight dimensions of improvement capability. Future instruments need robust testing for reliability and validity. This study contributes to practice and research by presenting the first review of the literature on improvement capability across all sectors including healthcare. PMID:28992146
Common world model for unmanned systems
NASA Astrophysics Data System (ADS)
Dean, Robert Michael S.
2013-05-01
The Robotic Collaborative Technology Alliance (RCTA) seeks to provide adaptive robot capabilities which move beyond traditional metric algorithms to include cognitive capabilities. Key to this effort is the Common World Model, which moves beyond the state of the art by representing the world using metric, semantic, and symbolic information. It joins these layers of information to define objects in the world. These objects may be reasoned upon jointly using traditional geometric and symbolic cognitive algorithms, and new computational nodes formed by the combination of these disciplines. The Common World Model must understand how these objects relate to each other. Our world model includes the concept of Self-Information about the robot. By encoding current capability, component status, task execution state, and histories, we track information which enables the robot to reason and adapt its performance using Meta-Cognition and Machine Learning principles. The world model includes models of how aspects of the environment behave, which enable prediction of future world states. To manage complexity, we adopted a phased implementation approach to the world model. We discuss the design of "Phase 1" of this world model and its interfaces, tracing perception data through the system from the source to the meta-cognitive layers provided by ACT-R and SS-RICS. We close with lessons learned from implementation and how the design relates to Open Architecture.
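A toy data-structure sketch of a world model joining the three layers plus self-information; the class and field names are assumptions for illustration, not the RCTA implementation.

```python
# Toy world model: metric, semantic, and symbolic layers plus self-information.
from dataclasses import dataclass, field

@dataclass
class WorldObject:
    object_id: str
    position: tuple            # metric layer: (x, y, z) in meters
    semantic_label: str        # semantic layer: e.g. "doorway", "vehicle"
    symbols: dict = field(default_factory=dict)  # symbolic layer: predicates

@dataclass
class SelfInformation:
    battery_level: float
    task_state: str
    component_status: dict = field(default_factory=dict)

@dataclass
class CommonWorldModel:
    objects: dict = field(default_factory=dict)
    self_info: SelfInformation = None

    def relate(self, a_id, b_id, predicate):
        """Record a symbolic relation between two known objects."""
        self.objects[a_id].symbols[predicate] = b_id

wm = CommonWorldModel(self_info=SelfInformation(0.82, "navigate_to_door"))
wm.objects["door1"] = WorldObject("door1", (4.0, 2.5, 0.0), "doorway")
wm.objects["robot"] = WorldObject("robot", (0.0, 0.0, 0.0), "self")
wm.relate("robot", "door1", "heading_toward")
```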
Code of Federal Regulations, 2014 CFR
2014-10-01
... (2) The capability to perform the following tasks with the frequency and in the manner required under... business days after receipt of notice of income, and the income source subject to withholding from a court... orders through an automated information network in meeting paragraph (e)(2)(ii) of this section provided...
Teaching & Learning in a Hybrid World. An Interview with Carol Twigg
ERIC Educational Resources Information Center
Veronikas, Susan Walsh; Shaughnessy, Michael F.
2004-01-01
Dr. Carol A. Twigg is Executive Director of the Center for Academic Transformation at Rensselaer Polytechnic Institute. The mission of the center is to serve as a source of expertise and support for those in higher education who want to take advantage of the capabilities of information technology to transform their academic practices. The center…
Automatic generation of Web mining environments
NASA Astrophysics Data System (ADS)
Cibelli, Maurizio; Costagliola, Gennaro
1999-02-01
The main problem related to the retrieval of information from the world wide web is the enormous number of unstructured documents and resources, i.e., the difficulty of locating and tracking appropriate sources. This paper presents a web mining environment (WME), which is capable of finding, extracting and structuring information related to a particular domain from web documents, using general purpose indices. The WME architecture includes a web engine filter (WEF), to sort and reduce the answer set returned by a web engine, a data source pre-processor (DSP), which processes html layout cues in order to collect and qualify page segments, and a heuristic-based information extraction system (HIES), to finally retrieve the required data. Furthermore, we present a web mining environment generator, WMEG, that allows naive users to generate a WME specific to a given domain by providing a set of specifications.
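The WEF stage can be pictured as scoring and pruning the engine's answer set against domain terms before the later extraction stages run; the heuristic below is an illustrative assumption, not the system's actual filter.

```python
# Toy web-engine filter: re-rank and prune results by domain keyword evidence.
def wef_filter(results, domain_terms, keep=5):
    """results: list of dicts with 'url' and 'snippet'; score by term hits."""
    def score(r):
        text = (r["url"] + " " + r["snippet"]).lower()
        return sum(text.count(t) for t in domain_terms)
    ranked = sorted(results, key=score, reverse=True)
    return [r for r in ranked if score(r) > 0][:keep]

hits = [
    {"url": "example.org/recipes", "snippet": "pasta and sauce"},
    {"url": "example.edu/ml-course", "snippet": "gradient descent, loss functions"},
]
print(wef_filter(hits, domain_terms=["gradient", "loss"]))
```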
NASA Astrophysics Data System (ADS)
Pilone, D.; Cechini, M. F.; Mitchell, A.
2011-12-01
Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large-scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial-grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturation ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions...and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions to visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development but several critical commercial components remain (or have been reinserted) to meet our operational demands.
Agent-oriented privacy-based information brokering architecture for healthcare environments.
Masaud-Wahaishi, Abdulmutalib; Ghenniwa, Hamada
2009-01-01
The healthcare industry is facing major reform at all levels: locally, regionally, nationally, and internationally. Healthcare services and systems have become very complex, comprising a vast number of components (software systems, doctors, patients, etc.) that are characterized by shared, distributed and heterogeneous information sources within a variety of clinical and other settings. The challenge now facing decision making and management of care is to operate effectively in order to meet the information needs of healthcare personnel. Currently, researchers, developers, and systems engineers are working toward achieving better efficiency and quality of service in various sectors of healthcare, such as hospital management, patient care, and treatment. This paper presents a novel information brokering architecture that supports privacy-based information gathering in healthcare. Architecturally, the brokering is viewed as a layer of services where a brokering service is modeled as an agent with a specific architecture and interaction protocol that are appropriate to serve various requests. Within the context of brokering, we model privacy in terms of an entity's ability to hide or reveal information related to its identities, requests, and/or capabilities. A prototype of the proposed architecture has been implemented to support information-gathering capabilities in healthcare environments using the FIPA-compliant platform JADE.
NASA Astrophysics Data System (ADS)
Galvao, Diogo
2013-04-01
As a result of various economic, social and environmental factors, we can all experience the increase in importance of water resources at a global scale. As a consequence, we can also notice the increasing need for methods and systems capable of efficiently managing and combining the rich and heterogeneous data available that concerns, directly or indirectly, these water resources, such as in-situ monitoring station data, Earth Observation images and measurements, meteorological modeling forecasts and hydrological modeling. Under the scope of the MyWater project, we developed a water management system capable of satisfying just such needs, under a flexible platform capable of accommodating future challenges, not only in terms of sources of data but also in the applicable models to extract information from them. From a methodological point of view, the MyWater platform obtains data from distinct sources, and in distinct formats, be they satellite images or meteorological model forecasts, and transforms and combines them in ways that allow them to be fed to a variety of hydrological models (such as MOHID Land, SIMGRO, etc.), which themselves can also be combined, using such approaches as those advocated by the OpenMI standard, to extract information in an automated and time-efficient manner. Such an approach brings its own deal of challenges, and further research was developed under this project on the best ways to combine such data and on novel approaches to hydrological modeling (like the PriceXD model). From a technical point of view, the MyWater platform is structured according to a classical SOA architecture, with a flexible, object-oriented, modular backend service responsible for all the model process management and data treatment, while the information extracted can be interacted with using a variety of frontends, from a web portal and a desktop client down to mobile phone and tablet applications. From an operational point of view, a user can not only see these model results on graphically rich user interfaces, but also interact with them in ways that allow them to extract their own information. This platform was then applied to a variety of case studies in the Netherlands, Greece, Portugal, Brazil and Africa, to verify the practicality, accuracy and value that it brings to end users and stakeholders.
Open Source Clinical NLP – More than Any Single System
Masanz, James; Pakhomov, Serguei V.; Xu, Hua; Wu, Stephen T.; Chute, Christopher G.; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP’s mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice. PMID:25954581
Mobile Care (Moca) for Remote Diagnosis and Screening
Celi, Leo Anthony; Sarmenta, Luis; Rotberg, Jhonathan; Marcelo, Alvin; Clifford, Gari
2010-01-01
Moca is a cell phone-facilitated clinical information system to improve diagnostic, screening and therapeutic capabilities in remote resource-poor settings. The software allows transmission of any medical file, whether a photo, x-ray, audio or video file, through a cell phone to (1) a central server for archiving and incorporation into an electronic medical record (to facilitate longitudinal care, quality control, and data mining), and (2) a remote specialist for real-time decision support (to leverage expertise). The open source software is designed as an end-to-end clinical information system that seamlessly connects health care workers to medical professionals. It is integrated with OpenMRS, an existing open source medical records system commonly used in developing countries. PMID:21822397
Modeling the Volcanic Source at Long Valley, CA, Using a Genetic Algorithm Technique
NASA Technical Reports Server (NTRS)
Tiampo, Kristy F.
1999-01-01
In this project, we attempted to model the deformation pattern due to the magmatic source at Long Valley caldera using a real-value coded genetic algorithm (GA) inversion similar to that found in Michalewicz, 1992. The project has been both successful and rewarding. The genetic algorithm, coded in the C programming language, performs stable inversions over repeated trials, with varying initial and boundary conditions. The original model used a GA in which the geophysical information was coded into the fitness function through the computation of surface displacements for a Mogi point source in an elastic half-space. The program was designed to invert for a spherical magmatic source - its depth, horizontal location and volume - using the known surface deformations. It also included the capability of inverting for multiple sources.
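A compact sketch of this style of inversion, pairing the standard Mogi elastic half-space displacement formula (volume-change form, Poisson ratio 0.25) with a simple real-coded GA (truncation selection, blend crossover, Gaussian mutation). The GA settings are illustrative, and the original was written in C rather than Python.

```python
# Real-coded GA recovering a Mogi source from synthetic vertical displacements.
import numpy as np

def mogi_uz(x0, y0, depth, dV, x, y, nu=0.25):
    """Vertical surface displacement of a Mogi point source (volume-change form)."""
    r2 = (x - x0)**2 + (y - y0)**2
    R3 = (r2 + depth**2)**1.5
    return (1 - nu) * dV * depth / (np.pi * R3)

def ga_invert(x, y, uz_obs, bounds, pop=100, gens=200, rng=None):
    rng = rng or np.random.default_rng(0)
    lo, hi = np.array(bounds).T
    P = rng.uniform(lo, hi, size=(pop, 4))            # [x0, y0, depth, dV]
    for _ in range(gens):
        misfit = np.array([np.sum((mogi_uz(*p, x, y) - uz_obs)**2) for p in P])
        elite = P[np.argsort(misfit)[:pop // 2]]      # truncation selection
        mates = elite[rng.integers(0, len(elite), size=(pop // 2, 2))]
        w = rng.uniform(size=(pop // 2, 1))
        children = w * mates[:, 0] + (1 - w) * mates[:, 1]   # blend crossover
        children += rng.normal(0, 0.01, children.shape) * (hi - lo)  # mutation
        P = np.vstack([elite, np.clip(children, lo, hi)])
    return P[np.argmin([np.sum((mogi_uz(*p, x, y) - uz_obs)**2) for p in P])]

# Synthetic test: recover a source at (2 km, -1 km), 5 km deep
x = np.linspace(-10e3, 10e3, 25)
X, Y = np.meshgrid(x, x)
uz = mogi_uz(2e3, -1e3, 5e3, 1e6, X, Y)
best = ga_invert(X, Y, uz, bounds=[(-10e3, 10e3), (-10e3, 10e3),
                                   (1e3, 10e3), (1e5, 1e7)])
print("best [x0, y0, depth, dV]:", best.round(0))
```

Keeping the elite half of each generation makes the inversion stable over repeated trials, matching the behavior the report describes for varying initial conditions.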
Context-based electronic health record: toward patient specific healthcare.
Hsu, William; Taira, Ricky K; El-Saden, Suzie; Kangarloo, Hooshang; Bui, Alex A T
2012-03-01
Due to the increasingly data-intensive clinical environment, physicians now have unprecedented access to detailed clinical information from a multitude of sources. However, applying this information to guide medical decisions for a specific patient case remains challenging. One issue is related to presenting information to the practitioner: displaying a large (irrelevant) amount of information often leads to information overload. Next-generation interfaces for the electronic health record (EHR) should not only make patient data easily searchable and accessible, but also synthesize fragments of evidence documented in the entire record to understand the etiology of a disease and its clinical manifestation in individual patients. In this paper, we describe our efforts toward creating a context-based EHR, which employs biomedical ontologies and (graphical) disease models as sources of domain knowledge to identify relevant parts of the record to display. We hypothesize that knowledge (e.g., variables, relationships) from these sources can be used to standardize, annotate, and contextualize information from the patient record, improving access to relevant parts of the record and informing medical decision making. To achieve this goal, we describe a framework that aggregates and extracts findings and attributes from free-text clinical reports, maps findings to concepts in available knowledge sources, and generates a tailored presentation of the record based on the information needs of the user. We have implemented this framework in a system called Adaptive EHR, demonstrating its capabilities to present and synthesize information from neurooncology patients. This paper highlights the challenges and potential applications of leveraging disease models to improve the access, integration, and interpretation of clinical patient data. © 2012 IEEE
NASA Technical Reports Server (NTRS)
Martinko, E. A. (Principal Investigator); Caron, L. M.; Stewart, D. S.
1984-01-01
Data bases and information systems developed and maintained by state agencies to support planning and management of environmental and natural resources were inventoried for all 50 states, Puerto Rico, and the U.S. Virgin Islands. The information obtained is assembled into a computerized data base catalog which is thoroughly cross-referenced. Retrieval is possible by code, state, data base name, data base acronym, agency, computer, GIS capability, language, specialized software, data category name, geographic reference, data sources, and level of reliability. The 324 automated data bases identified are described.
Noncritical generation of nonclassical frequency combs via spontaneous rotational symmetry breaking
NASA Astrophysics Data System (ADS)
Navarrete-Benlloch, Carlos; Patera, Giuseppe; de Valcárcel, Germán J.
2017-10-01
Synchronously pumped optical parametric oscillators (SPOPOs) are optical cavities driven by mode-locked lasers, and containing a nonlinear crystal capable of down-converting a frequency comb to lower frequencies. SPOPOs have received a lot of attention lately because their intrinsic multimode nature makes them compact sources of quantum correlated light with promising applications in modern quantum information technologies. In this work we show that SPOPOs are also capable of accessing the challenging and interesting regime where spontaneous symmetry breaking confers strong nonclassical properties to the emitted light, which has eluded experimental observation so far. Apart from opening the possibility of studying experimentally this elusive regime of dissipative phase transitions, our predictions will have a practical impact, since we show that spontaneous symmetry breaking provides a specific spatiotemporal mode with large quadrature squeezing for any value of the system parameters, turning SPOPOs into robust sources of highly nonclassical light above threshold.
Application of accelerator sources for pulsed neutron logging of oil and gas wells
NASA Astrophysics Data System (ADS)
Randall, R. R.
1985-05-01
Dresser Atlas introduced the first commercial pulsed neutron oil well log in the early 1960s. This log had the capability of differentiating oil from salt water in a completed well. In the late 1970s the first continuous carbon/oxygen (C/O) log capable of differentiating oil from fresh water was introduced. The sources used in these commercial logs are radial-geometry deuterium-tritium reaction devices with Cockcroft-Walton voltage multipliers providing the accelerator voltage. The commercial logging tools using these accelerators comprise scintillation detectors, power supplies, line drivers and receivers, and various timing and communications electronics. They are used to measure either the time decay or energy spectra of neutron-induced gamma events. The time decay information is useful in determining the neutron capture cross section, and the energy spectra are used to characterize inelastic neutron events.
Developing a GIS for CO2 analysis using lightweight, open source components
NASA Astrophysics Data System (ADS)
Verma, R.; Goodale, C. E.; Hart, A. F.; Kulawik, S. S.; Law, E.; Osterman, G. B.; Braverman, A.; Nguyen, H. M.; Mattmann, C. A.; Crichton, D. J.; Eldering, A.; Castano, R.; Gunson, M. R.
2012-12-01
There are advantages to approaching the realm of geographic information systems (GIS) using lightweight, open source components in place of a more traditional web map service (WMS) solution. Rapid prototyping, schema-less data storage, the flexible interchange of components, and open source community support are just some of the benefits. In our effort to develop an application supporting the geospatial and temporal rendering of remote sensing carbon-dioxide (CO2) data for the CO2 Virtual Science Data Environment project, we have connected heterogeneous open source components together to form a GIS. Utilizing widely popular open source components including the schema-less database MongoDB, Leaflet interactive maps, the HighCharts JavaScript graphing library, and Python Bottle web-services, we have constructed a system for rapidly visualizing CO2 data with reduced up-front development costs. These components can be aggregated together, resulting in a configurable stack capable of replicating features provided by more standard GIS technologies. The approach we have taken is not meant to replace the more established GIS solutions, but to instead offer a rapid way to provide GIS features early in the development of an application and to offer a path towards utilizing more capable GIS technology in the future.
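A minimal sketch of one slice of such a stack: a Bottle web service that queries a schema-less MongoDB collection of soundings by bounding box and returns GeoJSON that a Leaflet map can render. The database, collection, and field names are assumptions.

```python
# Bottle + MongoDB endpoint returning CO2 soundings as GeoJSON.
from bottle import Bottle, request, response
from pymongo import MongoClient
import json

app = Bottle()
soundings = MongoClient("localhost", 27017)["co2vsde"]["soundings"]

@app.route("/co2")
def co2_in_bbox():
    """Return soundings inside ?west=&south=&east=&north= as GeoJSON."""
    q = {k: float(request.query[k]) for k in ("west", "south", "east", "north")}
    cursor = soundings.find({
        "lon": {"$gte": q["west"], "$lte": q["east"]},
        "lat": {"$gte": q["south"], "$lte": q["north"]},
    })
    features = [{
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [d["lon"], d["lat"]]},
        "properties": {"xco2_ppm": d.get("xco2"), "time": str(d.get("time"))},
    } for d in cursor]
    response.content_type = "application/json"
    return json.dumps({"type": "FeatureCollection", "features": features})

# app.run(host="localhost", port=8080)
```

Because MongoDB imposes no schema, new sounding attributes can be stored immediately and surfaced later, which is the rapid-prototyping advantage the abstract describes.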
ERIC Educational Resources Information Center
Hiebert, Elfrieda H.
2011-01-01
A focus of the Common Core State Standards/English Language Arts (CCSS/ELA) is that students become increasingly more capable with complex text over their school careers. This focus has redirected attention to the measurement of text complexity. Although CCSS/ELA suggests multiple criteria for this task, the standards offer a single measure of…
ERIC Educational Resources Information Center
National Academies Press, 2016
2016-01-01
Research universities are critical contributors to our national research enterprise. They are the principal source of a world-class labor force and fundamental discoveries that enhance our lives and the lives of others around the world. These institutions help to create an educated citizenry capable of making informed and crucial choices as…
Semantic Likelihood Models for Bayesian Inference in Human-Robot Interaction
NASA Astrophysics Data System (ADS)
Sweet, Nicholas
Autonomous systems, particularly unmanned aerial systems (UAS), remain limited in autonomous capabilities largely due to a poor understanding of their environment. Current sensors simply do not match human perceptive capabilities, impeding progress towards full autonomy. Recent work has shown the value of humans as sources of information within a human-robot team; in target applications, communicating human-generated 'soft data' to autonomous systems enables higher levels of autonomy through large, efficient information gains. This requires development of a 'human sensor model' that allows soft data fusion through Bayesian inference to update the probabilistic belief representations maintained by autonomous systems. Current human sensor models that capture linguistic inputs as semantic information are limited in their ability to generalize likelihood functions for semantic statements: they may be learned from dense data; they do not exploit the contextual information embedded within groundings; and they often limit human input to restrictive and simplistic interfaces. This work provides mechanisms to synthesize human sensor models from constraints based on easily attainable a priori knowledge, develops compression techniques to capture information-dense semantics, and investigates the problem of capturing and fusing semantic information contained within unstructured natural language. A robotic experimental testbed is also developed to validate the above contributions.
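As a worked illustration of fusing a semantic statement through Bayes' rule, the sketch below turns "the target is near the gate" into a likelihood surface over a 1-D grid and updates a uniform prior. The likelihood shape and scale are assumptions, not a learned human sensor model.

```python
# Grid-based Bayesian fusion of a human's semantic statement.
import numpy as np

x = np.linspace(0, 20, 200)                      # 1-D world, meters
prior = np.ones_like(x) / x.size                 # uninformed robot belief

def semantic_likelihood(grounding, statement, x, scale=2.0):
    """P(statement | target at x) for 'near'/'far from' a known landmark."""
    d = np.abs(x - grounding)
    if statement == "near":
        return np.exp(-0.5 * (d / scale)**2)     # high where distance is small
    if statement == "far from":
        return 1.0 - np.exp(-0.5 * (d / scale)**2)
    raise ValueError(statement)

gate = 14.0                                      # landmark position
like = semantic_likelihood(gate, "near", x)
posterior = like * prior
posterior /= posterior.sum()                     # Bayes' rule, normalized
print("MAP target position:", x[np.argmax(posterior)].round(2))
```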
Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D
2012-10-01
The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
NASA Astrophysics Data System (ADS)
Trejos, Tatiana; Corzo, Ruthmara; Subedi, Kiran; Almirall, José
2014-02-01
Detection and sourcing of counterfeit currency, examination of counterfeit security documents and determination of authenticity of medical records are examples of common forensic document investigations. In these cases, the physical and chemical composition of the ink entries can provide important information for the assessment of the authenticity of the document or for making inferences about common source. Previous results reported by our group have demonstrated that elemental analysis, using either Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry (LA-ICP-MS) or Laser Ablation Induced Breakdown Spectroscopy (LIBS), provides an effective, practical and robust technique for the discrimination of document substrates and writing inks with minimal damage to the document. In this study, laser-based methods and Scanning Electron Microscopy-Energy Dispersive X-Ray Spectroscopy (SEM-EDS) methods were developed, optimized and validated for the forensic analysis of more complex inks such as toners and inkjets, to determine if their elemental composition can differentiate documents printed from different sources and to associate documents that originated from the same printing source. Comparison of the performance of each of these methods is presented, including the analytical figures of merit, discrimination capability and error rates. Different calibration strategies resulting in semi-quantitative and qualitative analysis, comparison methods (match criteria) and data analysis and interpretation tools were also developed. A total of 27 black laser toners originating from different manufacturing sources and/or batches were examined to evaluate the discrimination capability of each method. The results suggest that SEM-EDS offers relatively poor discrimination capability for this set (~ 70.7% discrimination of all the possible comparison pairs or a 29.3% type II error rate). Nonetheless, SEM-EDS can still be used as a complementary method of analysis since it has the advantage of being non-destructive to the sample in addition to providing imaging capabilities to further characterize toner samples by their particle morphology. Laser sampling methods resulted in an improvement of the discrimination between different sources with LIBS producing 89% discrimination and LA-ICP-MS resulting in 100% discrimination. In addition, a set of 21 black inkjet samples was examined by each method. The results show that SEM-EDS is not appropriate for inkjet examinations since their elemental composition is typically below the detection capabilities with only sulfur detected in this set, providing only 47.4% discrimination between possible comparison pairs. Laser sampling methods were shown to provide discrimination greater than 94% for this same inkjet set with false exclusion and false inclusion rates lower than 4.1% and 5.7%, for LA-ICP-MS and LIBS respectively. Overall these results confirmed the utility of the examination of printed documents by laser-based micro-spectrochemical methods. SEM-EDS analysis of toners produced a limited utility for discrimination within sources but was not an effective tool for inkjet ink discrimination. Both LA-ICP-MS and LIBS can be used in forensic laboratories to chemically characterize inks on documents and to complement the information obtained by conventional methods and enhance their evidential value.
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
Status of the CDS Services, SIMBAD, VizieR and Aladin
NASA Astrophysics Data System (ADS)
Genova, Francoise; Allen, M. G.; Bienayme, O.; Boch, T.; Bonnarel, F.; Cambresy, L.; Derriere, S.; Dubois, P.; Fernique, P.; Landais, G.; Lesteven, S.; Loup, C.; Oberto, A.; Ochsenbein, F.; Schaaff, A.; Vollmer, B.; Wenger, M.; Louys, M.; Davoust, E.; Jasniewicz, G.
2006-12-01
Major evolutions have been implemented in the three main CDS databases in 2006. SIMBAD 4, a new version of SIMBAD developed with Java and PostgreSQL, has been released. It is much more flexible than the previous version and offers in particular full search capabilities on all parameters. Wild cards can also be used in object names, which should ease searching for a given object in the frequent case of 'fuzzy' nomenclature. New information is progressively added, in particular a set of multiwavelength magnitudes (in progress), and other information from the Dictionary of Nomenclature such as the list of object types attached to each object name (available), or hierarchy and associations (in progress). A new version of VizieR, also in the open source PostgreSQL DBMS, has been completed, in order to simplify mirroring. The master database at CDS currently remains in the present Sybase implementation. A new simplified interface will be demonstrated, providing more user-friendly navigation while retaining the multiple browsing capabilities. A new release of the Aladin Sky Atlas offers new capabilities, like the management of multipart FITS files and of data cubes, construction and execution of macros for processing a list of targets, and improved navigation within an image plane. This new version also allows easy and efficient manipulation of very large (>10⁸ pixels) images, support for solar image display, and direct access to SExtractor to perform source extraction on displayed images.
McEntire, Robin; Szalkowski, Debbie; Butler, James; Kuo, Michelle S; Chang, Meiping; Chang, Man; Freeman, Darren; McQuay, Sarah; Patel, Jagruti; McGlashen, Michael; Cornell, Wendy D; Xu, Jinghai James
2016-05-01
External content sources such as MEDLINE(®), National Institutes of Health (NIH) grants and conference websites provide access to the latest breaking biomedical information, which can inform pharmaceutical and biotechnology company pipeline decisions. The value of the sites for industry, however, is limited by the use of the public internet, the limited synonyms, the rarity of batch searching capability and the disconnected nature of the sites. Fortunately, many sites now offer their content for download and we have developed an automated internal workflow that uses text mining and tailored ontologies for programmatic search and knowledge extraction. We believe such an efficient and secure approach provides a competitive advantage to companies needing access to the latest information for a range of use cases and complements manually curated commercial sources. Copyright © 2016. Published by Elsevier Ltd.
PMAnalyzer: a new web interface for bacterial growth curve analysis.
Cuevas, Daniel A; Edwards, Robert A
2017-06-15
Bacterial growth curves are essential representations for characterizing bacterial metabolism within a variety of media compositions. Using high-throughput spectrophotometers capable of processing tens of 96-well plates, quantitative phenotypic information can be easily integrated into the current data structures that describe a bacterial organism. The PMAnalyzer pipeline performs a growth curve analysis to parameterize the unique features occurring within microtiter wells containing specific growth media sources. We have expanded the pipeline capabilities and provide a user-friendly, online implementation of this automated pipeline. PMAnalyzer version 2.0 provides fast, automatic growth curve parameter analysis, growth identification, high-resolution figures of sample-replicate growth curves, and several statistical analyses. PMAnalyzer v2.0 can be found at https://edwards.sdsu.edu/pmanalyzer/ . Source code for the pipeline can be found on GitHub at https://github.com/dacuevas/PMAnalyzer . Source code for the online implementation can be found on GitHub at https://github.com/dacuevas/PMAnalyzerWeb . dcuevas08@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
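To illustrate the kind of parameterization such a pipeline performs, here is a hedged sketch fitting a logistic growth model to one well's optical-density readings; PMAnalyzer's actual model and parameter set may differ.

```python
# Logistic fit to one microtiter well's optical-density time series.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, A, mu, y0):
    """A: carrying capacity (max OD), mu: growth rate, y0: initial OD."""
    return A / (1 + (A / y0 - 1) * np.exp(-mu * t))

t = np.arange(0, 24, 0.5)                               # hours
rng = np.random.default_rng(3)
od = logistic(t, 1.2, 0.6, 0.05) + 0.01 * rng.standard_normal(t.size)

(A, mu, y0), _ = curve_fit(logistic, t, od, p0=[1.0, 0.5, 0.05])
print(f"max OD={A:.2f}, growth rate={mu:.2f}/h, initial OD={y0:.3f}")
```

Reducing each well to a few fitted parameters is what makes thousands of curves comparable across media sources and replicates.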
Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm
2016-06-03
Electronic portfolios (ePortfolios) are used to document and support learning activities. E-portfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a commercially free and open source software solution. We created an online ePortfolio environment using the blogging software WordPress based on reported capability features of such software by a qualitative weight and sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - like collection of multimedia evidences, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %) and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.
Buzzelli, Michelle M; Morgan, Paula; Muschek, Alexander G; Macgregor-Skinner, Gavin
2014-01-01
Lack of success in disaster recovery occurs for many reasons, with one predominant catalyst for catastrophic failure being flawed and inefficient communication systems. Devastating environmental hazards and human-caused disasters will continue to proliferate throughout the United States and around the globe as continuous, intensive urbanization forces human populations into more concentrated and interconnected societies. With the rapid evolution of technology and the advent of information and communication technology (ICT) interfaces such as Facebook, Twitter, Flickr, Myspace, and smartphone technology, communication is no longer a unidirectional source of information traveling from the newsroom to the public. In the event of a disaster, time-critical information can be exchanged to and from any person or organization simultaneously, with the capability to receive feedback. A literature review of current information regarding the use of ICT as information infrastructure in disaster management during human-caused and natural disasters will be conducted. This article asserts that the integrated use of ICTs as multidirectional information-sharing tools throughout the disaster cycle will increase a community's resiliency and supplement the capabilities of first responders and emergency management officials by providing real-time updates and information needed to assist and recover from a disaster.
Multiple description distributed image coding with side information for mobile wireless transmission
NASA Astrophysics Data System (ADS)
Wu, Min; Song, Daewon; Chen, Chang Wen
2005-03-01
Multiple description coding (MDC) is a source coding technique that codes the source information into multiple descriptions and then transmits them over different channels in a packet network or error-prone wireless environment to achieve graceful degradation if parts of the descriptions are lost at the receiver. In this paper, we propose a multiple description distributed wavelet zero-tree image coding system for mobile wireless transmission. We provide two innovations to achieve an excellent error-resilient capability. First, when MDC is applied to wavelet subband based image coding, it is possible to introduce correlation between the descriptions in each subband. We use such correlation, together with a potentially error-corrupted description, as side information in the decoding, formulating the MDC decoding as a Wyner-Ziv decoding problem. If part of the descriptions is lost but their correlation information is still available, the proposed Wyner-Ziv decoder can recover the lost description by using the correlation information and the error-corrupted description as side information. Second, within each description, a single-bitstream wavelet zero-tree coder is very vulnerable to channel errors: the first bit error may cause the decoder to discard all subsequent bits, whether or not those bits are correctly received. Therefore, we integrate multiple description scalar quantization (MDSQ) with a multiple wavelet tree image coding method to reduce error propagation. We first group wavelet coefficients into multiple trees according to parent-child relationships and then code them separately with the SPIHT algorithm to form multiple bitstreams. Such decomposition reduces error propagation and therefore improves the error-correcting capability of the Wyner-Ziv decoder. Experimental results show that the proposed scheme not only exhibits excellent error-resilient performance but also demonstrates graceful degradation over the packet loss rate.
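As a hedged illustration of the MDSQ building block the paper integrates (a toy two-diagonal index assignment, not the authors' coder; the step size and assignment pattern are assumptions):

import numpy as np

DELTA = 0.5  # central quantizer step size (illustrative)

def mdsq_encode(x):
    # Central uniform quantizer followed by a two-diagonal index assignment:
    # even central indices sit on the diagonal, odd ones just above it.
    k = int(np.round(x / DELTA))
    if k % 2 == 0:
        i, j = k // 2, k // 2
    else:
        i, j = (k - 1) // 2, (k + 1) // 2
    return i, j  # description 1 and description 2, sent on separate channels

def mdsq_decode(i=None, j=None):
    # Central reconstruction when both descriptions arrive (k = i + j);
    # otherwise average the central cells consistent with the survivor.
    if i is not None and j is not None:
        return (i + j) * DELTA
    if i is not None:                  # possible central indices: {2i, 2i+1}
        return (2 * i + 0.5) * DELTA
    return (2 * j - 0.5) * DELTA       # possible central indices: {2j-1, 2j}

x = 1.37
i, j = mdsq_encode(x)
print(mdsq_decode(i, j), mdsq_decode(i=i), mdsq_decode(j=j))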
Capability for Integrated Systems Risk-Reduction Analysis
NASA Technical Reports Server (NTRS)
Mindock, J.; Lumpkins, S.; Shelhamer, M.
2016-01-01
NASA's Human Research Program (HRP) is working to increase the likelihood of human health and performance success during long-duration missions, and of subsequent crew long-term health. To achieve these goals, there is a need to develop an integrated understanding of how the complex human physiological-socio-technical mission system behaves in spaceflight. This understanding will allow HRP to provide cross-disciplinary spaceflight countermeasures while minimizing resources such as mass, power, and volume. This understanding will also allow development of tools to assess the state of and enhance the resilience of individual crewmembers, teams, and the integrated mission system. We will discuss a set of risk-reduction questions that has been identified to guide the systems approach necessary to meet these needs. In addition, a framework of factors influencing human health and performance in space, called the Contributing Factor Map (CFM), is being applied as the backbone for incorporating information addressing these questions from sources throughout HRP. Using the common language of the CFM, information from sources such as the Human System Risk Board summaries, Integrated Research Plan, and HRP-funded publications has been combined and visualized in ways that allow insight into cross-disciplinary interconnections in a systematic, standardized fashion. We will show examples of these visualizations. We will also discuss applications of the resulting analysis capability that can inform science portfolio decisions, such as areas in which cross-disciplinary solicitations or countermeasure development will potentially be fruitful.
Building a base map with AutoCAD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flarity, S.J.
1989-12-01
The fundamental step in the exploration process is building a base map. Consequently, any serious computer exploration program should be capable of providing base maps. Data used in constructing base maps are available from commercial sources such as Tobin and Petroleum Information. These data sets include line and well data, the line data being latitude-longitude vectors and the well data identifying text information for wells and their locations. AutoCAD is a commercial program useful in building base maps. Its features include infinite zoom and pan capability, layering, block definition, text dialog boxes, and a command language, AutoLisp. AutoLisp provides more power by allowing the geologist to modify the way the program works. Three AutoLisp routines presented here allow geologists to construct a geologic base map from raw Tobin data. The first program, WELLS.LSP, sets up the map environment for the subsequent programs, WELLADD.LSP and LINEADD.LSP. Welladd.lsp reads the Tobin data and spots the well symbols and the identifying information. Lineadd.lsp performs the same task on line and textual information contained within the data set.
Pure sources and efficient detectors for optical quantum information processing
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 +/- 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.
Preston, Stephen D.; Alexander, Richard B.; Woodside, Michael D.
2011-01-01
The U.S. Geological Survey (USGS) recently completed assessments of stream nutrients in six major regions extending over much of the conterminous United States. SPARROW (SPAtially Referenced Regressions On Watershed attributes) models were developed for each region to explain spatial patterns in monitored stream nutrient loads in relation to human activities and natural resources and processes. The model information, reported by stream reach and catchment, provides contrasting views of the spatial patterns of nutrient source contributions, including those from urban (wastewater effluent and diffuse runoff from developed land), agricultural (farm fertilizers and animal manure), and specific background sources (atmospheric nitrogen deposition, soil phosphorus, forest nitrogen fixation, and channel erosion).
Parameter estimation accuracies of Galactic binaries with eLISA
NASA Astrophysics Data System (ADS)
Błaut, Arkadiusz
2018-09-01
We study the parameter estimation accuracy of nearly monochromatic sources of gravitational waves with future eLISA-like detectors. eLISA will be capable of observing millions of such signals generated by orbiting pairs of compact binaries consisting of white dwarfs, neutron stars, or black holes, and of resolving and estimating the parameters of several thousand of them, providing crucial information regarding their orbital dynamics, formation rates, and evolutionary paths. Using the Fisher matrix analysis, we compare the accuracies of the estimated parameters for different mission designs defined by the GOAT advisory team established to assess the scientific capabilities and the technological issues of the eLISA-like missions.
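A minimal sketch of the Fisher-matrix machinery for a single nearly monochromatic signal in white noise follows; the toy signal model, sampling, and noise level are assumptions and do not represent the eLISA response. Parameter covariances are estimated as the inverse Fisher matrix.

import numpy as np

def h(t, theta):
    # Toy monochromatic strain model: amplitude, frequency [Hz], phase
    A, f, phi = theta
    return A * np.sin(2 * np.pi * f * t + phi)

t = np.linspace(0, 1e6, 20000)           # seconds (toy observation span)
theta0 = np.array([1e-22, 2e-3, 0.3])    # assumed "true" parameters
sigma = 5e-22                            # per-sample noise std (toy value)

# Numerical partial derivatives dh/dtheta_i (central differences)
eps = theta0 * 1e-6
D = np.stack([(h(t, theta0 + np.eye(3)[i] * eps[i]) -
               h(t, theta0 - np.eye(3)[i] * eps[i])) / (2 * eps[i])
              for i in range(3)])

F = D @ D.T / sigma**2                   # Fisher information matrix
errs = np.sqrt(np.diag(np.linalg.inv(F)))
print("1-sigma errors (A, f, phi):", errs)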
Static telescope aberration measurement using lucky imaging techniques
NASA Astrophysics Data System (ADS)
López-Marrero, Marcos; Rodríguez-Ramos, Luis Fernando; Marichal-Hernández, José Gil; Rodríguez-Ramos, José Manuel
2012-07-01
A procedure has been developed to compute static aberrations once the telescope PSF has been measured with the lucky imaging technique, using a star close to the object of interest as the point source to probe the optical system. This PSF is iteratively turned into a phase map at the pupil using the Gerchberg-Saxton algorithm and then converted to the appropriate actuation information for a deformable mirror having a low actuator count but large stroke capability. The main advantage of this procedure is the capability of correcting static aberration in the specific pointing direction without the need for a wavefront sensor.
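A minimal Gerchberg-Saxton iteration on synthetic data illustrates the PSF-to-pupil-phase step; the circular pupil, toy aberration, and iteration count are assumptions, not the authors' implementation.

import numpy as np

N = 128
y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
pupil_amp = (x**2 + y**2 < (N//4)**2).astype(float)   # circular aperture

true_phase = 0.5 * (x / (N / 4.0))**2 * pupil_amp     # toy static aberration
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil_amp * np.exp(1j * true_phase))))**2
meas_amp = np.sqrt(psf)                               # "measured" focal amplitude

field = pupil_amp * np.exp(1j * np.zeros_like(pupil_amp))
for _ in range(200):
    focal = np.fft.fftshift(np.fft.fft2(field))
    focal = meas_amp * np.exp(1j * np.angle(focal))   # impose measured modulus
    field = np.fft.ifft2(np.fft.ifftshift(focal))
    field = pupil_amp * np.exp(1j * np.angle(field))  # impose pupil support

phase_map = np.angle(field) * pupil_amp               # recovered aberration estimate
print(phase_map.shape)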
NASA Astrophysics Data System (ADS)
Naruse, Makoto; Berthel, Martin; Drezet, Aurélien; Huant, Serge; Aono, Masashi; Hori, Hirokazu; Kim, Song-Ju
2015-08-01
Decision making is critical in our daily lives and for society in general, and is finding ever more practical applications in information and communication technologies. Herein, we demonstrate experimentally that single photons can be used to make decisions in uncertain, dynamically changing environments. Using a nitrogen-vacancy center in a nanodiamond as a single-photon source, we demonstrate the decision-making capability by solving the multi-armed bandit problem. This capability is directly and immediately associated with single-photon detection in the proposed architecture, leading to adequate and adaptive autonomous decision making. This study makes it possible to create systems that benefit from the quantum nature of light to perform practical and vital intelligent functions.
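For intuition, here is a purely classical stand-in for the decision loop: an epsilon-greedy two-armed bandit in which a pseudorandom draw plays the role that single-photon detection plays in the experiment. The reward probabilities and epsilon are illustrative, and this is not the paper's photon-based policy.

import numpy as np

rng = np.random.default_rng(1)
p_win = [0.4, 0.6]        # hidden reward probabilities of the two arms
counts = np.zeros(2)      # pulls per arm
wins = np.zeros(2)        # rewards per arm
eps = 0.1

for trial in range(2000):
    if rng.random() < eps:                 # explore: random arm choice
        arm = int(rng.integers(2))
    else:                                  # exploit current estimates
        est = np.divide(wins, counts, out=np.full(2, 0.5), where=counts > 0)
        arm = int(np.argmax(est))
    reward = rng.random() < p_win[arm]
    counts[arm] += 1
    wins[arm] += reward

print("pull fractions:", counts / counts.sum())   # should favor the better arm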
NASA Astrophysics Data System (ADS)
Di Stefano, M.; Fox, P. A.; Beaulieu, S. E.; Maffei, A. R.; West, P.; Hare, J. A.
2012-12-01
Integrated assessments of large marine ecosystems require an understanding of the interactions between environmental, ecological, and socio-economic factors that affect production and utilization of marine natural resources. Assessing the functioning of complex coupled natural-human systems calls for collaboration between natural and social scientists across disciplinary and national boundaries. We are developing a platform to implement and sustain informatics solutions for these applications, providing interoperability among very diverse and heterogeneous data and information sources, as well as multi-disciplinary organizations and people. We have partnered with NOAA NMFS scientists to facilitate the deployment of an integrated ecosystem approach to management in the Northeast U.S. (NES) and California Current Large Marine Ecosystems (LMEs). Our platform will facilitate collaboration and knowledge sharing among NMFS natural and social scientists, promoting community participation in integrating data, models, and knowledge. Here, we present collaborative software tools developed to aid the production of the Ecosystem Status Report (ESR) for the NES LME. The ESR addresses the D-P-S portion of the DPSIR (Driver-Pressure-State-Impact-Response) management framework: reporting data, indicators, and information products for climate drivers, physical and human (fisheries) pressures, and ecosystem state (primary and secondary production and higher trophic levels). We are developing our tools in open-source software, with the main tool based on a web application capable of working with multiple data types from a variety of sources, providing an effective way to share the source code used to generate data products and associated metadata, as well as to track workflow provenance to allow reproducibility of data products. Our platform retrieves data, conducts standard analyses, reports data quality and other standardized metadata, provides iterative and interactive visualization, and enables the download of data plotted in the ESR. Data, indicators, and information products include time series, geographic maps, and uni-variate and multi-variate analyses. Also central to the success of this initiative is the commitment to accommodate and train scientists of multiple disciplines who will learn to interact effectively with this new integrated and interoperable ecosystem assessment capability. Traceability, repeatability, explanation, verification, and validation of data, indicators, and information products are important for cross-disciplinary understanding and sharing with managers, policymakers, and the public. We are also developing an ontology to support the implementation of the DPSIR framework. These new capabilities will serve as the essential foundation for the formal synthesis and quantitative analysis of information on relevant natural and socio-economic factors in relation to specified ecosystem management goals, which can be applied in other LMEs.
NASA Astrophysics Data System (ADS)
Yudono, Adipandang
2017-06-01
Recently, crowd-sourced information has been used to produce and improve collective knowledge and community capacity building. Triggered by broadening and expanding access to the Internet and cellular telephones, the utilisation of crowd-sourcing for policy advocacy, e-government and e-participation has increased globally [1]. Crowd-sourced information can conceivably support government or general social initiatives to inform, consult, and cooperate, by engaging citizens and empowering decentralisation and democratisation [2]. Crowd-sourcing has turned into a major technique for interactive mapping initiatives by urban and rural communities because of its capability to incorporate a wide range of data. Continuously accumulated spatial data can be sorted, layered, and visualised in ways that even beginners can comprehend with ease. Interactive spatial visualisation has the potential to be a useful democratic planning tool that empowers citizens to participate in spatial data provision and sharing in government programmes. Since the global emergence of World Wide Web (WWW) technology, the interaction between information providers and users has increased. Local communities are able to produce and share spatial data and build web interfaces with territorial information using public mapping application programming interfaces (APIs) such as Google Maps, OSM and Wikimapia [3][4][5]. In terms of democratic spatial planning action, Volunteered Geographic Information (VGI) is considered an effective voluntary method of helping people feel comfortable with the technology and with other co-participants in order to shape coalitions of local knowledge. This paper aims to investigate how spatial data created by citizens are used in Indonesia, by discussing the characteristics of spatial data usage by citizens to support spatial policy formulation, starting with the history of participatory mapping up to current VGI development in Indonesia.
Aerothermodynamic Flight Simulation Capabilities for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Miller, Charles G.
1998-01-01
Aerothermodynamics, encompassing aerodynamics, aeroheating, and fluid dynamics and physical processes, is the genesis for the design and development of advanced space transportation vehicles and provides crucial information to other disciplines such as structures, materials, propulsion, avionics, and guidance, navigation and control. Sources of aerothermodynamic information are ground-based facilities, Computational Fluid Dynamic (CFD) and engineering computer codes, and flight experiments. Utilization of this aerothermodynamic triad provides the optimum aerothermodynamic design to safely satisfy mission requirements while reducing design conservatism, risk and cost. The iterative aerothermodynamic process for initial screening/assessment of aerospace vehicle concepts, optimization of aerolines to achieve/exceed mission requirements, and benchmark studies for final design and establishment of the flight data book are reviewed. Aerothermodynamic methodology centered on synergism between ground-based testing and CFD predictions is discussed for the various flow regimes encountered by a vehicle entering the Earth's atmosphere from low Earth orbit. An overview of the resources/infrastructure required to provide accurate/credible aerothermodynamic information in a timely manner is presented. Impacts on Langley's aerothermodynamic capabilities due to recent programmatic changes such as Center reorganization, downsizing, outsourcing, industry (as opposed to NASA) led programs, and so forth are discussed. Sample applications of these capabilities to high Agency priority, fast-paced programs such as Reusable Launch Vehicle (RLV)/X-33 Phases I and II, X-34, Hyper-X and X-38 are presented and lessons learned discussed. Lastly, enhancements in ground-based testing/CFD capabilities necessary to partially/fully satisfy future requirements are addressed.
Trusted Defense Microelectronics: Future Access and Capabilities Are Uncertain
2015-10-28
Board Task Force on High Performance Microchip Supply and documentation and discussions with industry and DOD officials in September and October...the defense and microelectronics industry. DOD's review of this report deemed some of this information as sensitive but unclassified. What GAO...increased specialization and industry consolidation. • Once dominated by domestic sources, the supply chain for microelectronics manufacturing is a global one
Naval Sea Systems Command On Watch 2010
2010-01-01
surface targets, such as zodiacs and fast patrol boats found in the littoral environment. As for future capabilities and goals for the program, An...
A New Mathematical Framework for Design Under Uncertainty
2016-05-05
blending multiple information sources via auto-regressive stochastic modeling. A computationally efficient machine learning framework is developed based on...sion and machine learning approaches; see Fig. 1. This will lead to a comprehensive description of system performance with less uncertainty than in the...Bayesian optimization of super-cavitating hydrofoils. The goal of this study is to demonstrate the capabilities of statistical learning and
Stream temperature investigations: field and analytic methods
Bartholow, J.M.
1989-01-01
Alternative public domain stream and reservoir temperature models are contrasted with SNTEMP. A distinction is made between steady-flow and dynamic-flow models and their respective capabilities. Regression models are offered as an alternative approach for some situations, with appropriate mathematical formulas suggested. Appendices provide information on State and Federal agencies that are good data sources, vendors for field instrumentation, and small computer programs useful in data reduction.
A survey of tools and resources for the next generation analyst
NASA Astrophysics Data System (ADS)
Hall, David L.; Graham, Jake; Catherman, Emily
2015-05-01
We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
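A generic sketch of the entity-extraction step using the open-source spaCy library follows; this is not DAC's proprietary analytics. It assumes the en_core_web_sm model has been downloaded, and the sample text, label set, and sentence co-occurrence heuristic for candidate relationships are all illustrative.

# Assumes: pip install spacy && python -m spacy download en_core_web_sm
from itertools import combinations
import spacy

nlp = spacy.load("en_core_web_sm")
text = ("Analysts at Acme Corp met Jane Doe in Kabul. "
        "Jane Doe later contacted the Horizon Group from Karachi.")

doc = nlp(text)
edges = set()
for sent in doc.sents:
    # Keep person/organization/location entities in this sentence
    ents = [e for e in sent.ents if e.label_ in ("PERSON", "ORG", "GPE")]
    for a, b in combinations(ents, 2):     # co-occurrence = candidate relation
        edges.add((a.text, b.text))

for ent in doc.ents:
    print(ent.text, ent.label_)
print("candidate relations:", edges)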
Going beyond the NASA Earthdata website: Reaching out to new audiences via social media and webinars
NASA Astrophysics Data System (ADS)
Bagwell, R.; Wong, M. M.; Brennan, J.; Murphy, K. J.; Behnke, J.
2014-12-01
This poster will introduce and explore the various social media efforts and monthly webinar series recently established by the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. Some of the capabilities include twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), a data discovery and service access client (Reverb), dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative, and a host of other discipline specific data discovery, data access, data subsetting and visualization tools and services. We have embarked on these efforts to reach out to new audiences and potential new users and to engage our diverse end user communities world-wide. One of the key objectives is to increase awareness of the breadth of Earth science data information, services, and tools that are publicly available while also highlighting how these data and technologies enable scientific research.
Heterogeneity and Cooperation: The Role of Capability and Valuation on Public Goods Provision
Kolle, Felix
2018-01-01
We experimentally investigate the effects of two different sources of heterogeneity - capability and valuation - on the provision of public goods when punishment is possible or not. We find that compared to homogeneous groups, asymmetric valuations for the public good have negative effects on cooperation and its enforcement through informal sanctions. Asymmetric capabilities in providing the public good, in contrast, have a positive and stabilizing effect on voluntary contributions. The main reason for these results is the different externalities that contributions have on the other group members' payoffs, which affect individuals' willingness to cooperate. We thus provide evidence that it is not the asymmetric nature of groups per se that facilitates or impedes collective action, but rather the nature of the asymmetry that determines the degree of cooperation and the level of public good provision. PMID:29367794
Stout, N; Bell, C
1991-01-01
BACKGROUND: The complete and accurate identification of fatal occupational injuries among the US work force is an important first step in developing work injury prevention efforts. Numerous sources of information, such as death certificates, Workers' Compensation files, Occupational Safety and Health Administration (OSHA) files, medical examiner records, state health and labor department reports, and various combinations of these, have been used to identify cases of work-related fatal injuries. Recent studies have questioned the effectiveness of these sources for identifying such cases. METHODS: At least 10 studies have used multiple sources to define the universe of fatal work injuries within a state and to determine the capture rates, or proportion of the universe identified, by each source. Results of these studies, which are not all available in published literature, are summarized here in a format that allows researchers to readily compare the ascertainment capabilities of the sources. RESULTS: The overall average capture rates of sources were as follows: death certificates, 81%; medical examiner records, 61%; Workers' Compensation reports, 57%; and OSHA reports 32%. Variations by state and value added through the use of multiple sources are presented and discussed. CONCLUSIONS: This meta-analysis of 10 state-based studies summarizes the effectiveness of various source documents for capturing cases of fatal occupational injuries to help researchers make informed decisions when designing occupational injury surveillance systems. PMID:1827569
The virtual library: Coming of age
NASA Technical Reports Server (NTRS)
Hunter, Judy F.; Cotter, Gladys A.
1994-01-01
With the high-speed networking capabilities, multiple media options, and massive amounts of information that exist in electronic format today, the concept of a 'virtual' library or 'library without walls' is becoming viable. In a virtual library environment, the information processed goes beyond the traditional definition of documents to include the results of scientific and technical research and development (reports, software, data) recorded in any format or medium: electronic, audio, video, or scanned images. Network access to information must include tools to help locate information sources and navigate the networks to connect to the sources, as well as methods to extract the relevant information. Graphical user interfaces (GUIs) that are intuitive, and navigational tools such as Intelligent Gateway Processors (IGPs), will provide users with seamless and transparent use of high-speed networks to access, organize, and manage information. Traditional libraries will become points of electronic access to information on multiple media. The emphasis will be on unique collections of information at each library rather than entire collections at every library. It is no longer a question of whether there is enough information available; it is more a question of how to manage the vast volumes of information. The future equation will involve being able to organize knowledge, manage information, and provide access at the point of origin.
Joint Source-Channel Decoding of Variable-Length Codes with Soft Information: A Survey
NASA Astrophysics Data System (ADS)
Guillemot, Christine; Siohan, Pierre
2005-12-01
Multimedia transmission over time-varying wireless channels presents a number of challenges beyond existing capabilities conceived so far for third-generation networks. Efficient quality-of-service (QoS) provisioning for multimedia on these channels may in particular require a loosening and a rethinking of the layer separation principle. In that context, joint source-channel decoding (JSCD) strategies have gained attention as viable alternatives to separate decoding of source and channel codes. A statistical framework based on hidden Markov models (HMM) capturing dependencies between the source and channel coding components sets the foundation for optimal design of techniques of joint decoding of source and channel codes. The problem has been largely addressed in the research community, by considering both fixed-length codes (FLC) and variable-length source codes (VLC) widely used in compression standards. Joint source-channel decoding of VLC raises specific difficulties due to the fact that the segmentation of the received bitstream into source symbols is random. This paper makes a survey of recent theoretical and practical advances in the area of JSCD with soft information of VLC-encoded sources. It first describes the main paths followed for designing efficient estimators for VLC-encoded sources, the key component of the JSCD iterative structure. It then presents the main issues involved in the application of the turbo principle to JSCD of VLC-encoded sources as well as the main approaches to source-controlled channel decoding. This survey terminates by performance illustrations with real image and video decoding systems.
Journey toward a patient-centered medical home: readiness for change in primary care practices.
Wise, Christopher G; Alexander, Jeffrey A; Green, Lee A; Cohen, Genna R; Koster, Christina R
2011-09-01
Information is limited regarding the readiness of primary care practices to make the transformational changes necessary to implement the patient-centered medical home (PCMH) model. Using comparative, qualitative data, we provide practical guidelines for assessing and increasing readiness for PCMH implementation. We used a comparative case study design to assess primary care practices' readiness for PCMH implementation in sixteen practices from twelve different physician organizations in Michigan. Two major components of organizational readiness, motivation and capability, were assessed. We interviewed eight practice teams with higher PCMH scores and eight with lower PCMH scores, along with the leaders of the physician organizations of these practices, yielding sixty-six semistructured interviews. The respondents from the higher and lower PCMH scoring practices reported different motivations and capabilities for pursuing PCMH. Their motivations pertained to the perceived value of PCMH, financial incentives, understanding of specific PCMH requirements, and overall commitment to change. Capabilities that were discussed included the time demands of implementation, the difficulty of changing patients' behavior, and the challenges of adopting health information technology. Enhancing the implementation of PCMH within practices included taking an incremental approach, using data, building a team and defining roles of its members, and meeting regularly to discuss the implementation. The respondents valued external organizational support, regardless of its source. The respondents from the higher and lower PCMH scoring practices commented on similar aspects of readiness-motivation and capability-but offered very different views of them. Our findings suggest the importance of understanding practice perceptions of the motivations for PCMH and the capability to undertake change. While this study identified some initial approaches that physician organizations and practices have used to prepare for practice redesign, we need much more information about their effectiveness. © 2011 Milbank Memorial Fund. Published by Wiley Periodicals Inc.
Visualization tool for human-machine interface designers
NASA Astrophysics Data System (ADS)
Prevost, Michael P.; Banda, Carolyn P.
1991-06-01
As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
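The spring/attractor/repeller metaphor can be made concrete with a toy force-directed layout loop; all constants and the relatedness table below are invented for illustration, and this is not the ISLE implementation.

import numpy as np

rng = np.random.default_rng(2)
n = 6
pos = rng.random((n, 2)) * 10.0                    # initial display positions
springs = {(0, 1): 2.0, (1, 2): 2.0, (3, 4): 1.0}  # related pairs: rest lengths

for step in range(500):
    force = np.zeros_like(pos)
    for (i, j), rest in springs.items():           # attraction between related displays
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d) + 1e-9
        f = 0.05 * (dist - rest) * d / dist
        force[i] += f
        force[j] -= f
    for i in range(n):                             # repulsion keeps displays apart
        for j in range(i + 1, n):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d) + 1e-9
            f = (0.5 / dist**3) * d
            force[i] -= f
            force[j] += f
    pos += np.clip(force, -0.5, 0.5)               # bounded step for stability

print(np.round(pos, 2))                            # candidate spatial layout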
Development of guidelines for the definition of the relevant information content in data classes
NASA Technical Reports Server (NTRS)
Schmitt, E.
1973-01-01
The problem of experiment design is defined as an information system consisting of an information source, a measurement unit, environmental disturbances, data handling and storage, and the mathematical analysis and usage of data. Based on today's concept of effective computability, general guidelines for the definition of the relevant information content in data classes are derived. The lack of a universally applicable information theory and corresponding mathematical or system structure restricts the solvable problem classes to a small set. It is expected that a new relativity theory of information, generally described by a universal algebra of relations, will lead to new mathematical models and system structures capable of modeling any well-defined practical problem isomorphic to an equivalence relation at any corresponding level of abstractness.
NASA Astrophysics Data System (ADS)
Ames, D. P.
2013-12-01
As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology, resulting in the development of many useful community software components including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface to allow 3rd party developers to create extensions and add new functionality without requiring recompiling of the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities, and an OpenMI plugin provides the ability to link models. Current revisions and updates to HydroDesktop include migration of core business logic to cross-platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.
Assessing COSMO-SkyMed capability for crops identification and monitoring
NASA Astrophysics Data System (ADS)
Guarini, R.; Dini, L.
2015-12-01
In the last decade, it has become possible to better understand the impact of agricultural practices on global environmental change at different spatial (from local to global) and temporal (from seasonal to decadal) scales. This has been achieved thanks to: the large datasets continuously acquired by Earth Observation (EO) satellites; the improved capabilities of remote sensing techniques in extracting valuable information from EO datasets; new EO data policies that allow unrestricted data usage; network technologies that allow national, international and market-derived information to be shared quickly and easily; and increasingly powerful computing technology that allows large amounts of data to be processed more easily and at decreasing cost. To better understand the environmental impacts of agriculture and to monitor the consequences of human agricultural activities on the biosphere, scientists need to better identify crops and monitor crop conditions over time and space. Traditionally, NDVI time series maps derived from optical sensors have been used for this purpose. As is well known, this important source of information is limited by cloud cover. Unlike passive systems, synthetic aperture radar (SAR) systems are almost insensitive to atmospheric influences; thus, they are especially suitable for crop identification and condition monitoring. Among the SAR systems currently in orbit, the Italian Space Agency (ASI) COSMO Sky-Med® (CSK®) constellation (X-band, frequency 9.6 GHz, wavelength 3.1 cm), especially for its peculiar high revisit capability (up to four images in 16 days with the same acquisition geometry), seems particularly suitable for providing information in addition and/or as an alternative to other optical EO systems. To assess the capability of the CSK® constellation in identifying crops and monitoring crop conditions, in 2013 ASI started the "AGRICIDOT" project. Some of the main project achievements will be presented at the congress.
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF), and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City, and validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
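As a hedged, much-simplified illustration of source term estimation (a grid search over a steady Gaussian plume, not VIRSA's variational adjoint method; the dispersion coefficients, receptor network, and noise level are all assumptions):

import numpy as np

def plume(q, xs, ys, xr, yr, u=3.0):
    # Ground-level concentration from a steady point source of rate q at
    # (xs, ys) seen at receptor (xr, yr); crude dispersion growth with distance.
    dx, dy = xr - xs, yr - ys
    if dx <= 0:                       # receptor upwind: no contribution
        return 0.0
    sy, sz = 0.08 * dx, 0.06 * dx
    return (q / (np.pi * u * sy * sz)) * np.exp(-dy**2 / (2 * sy**2))

# Synthetic "observations" from a hidden source, with 5% sensor noise
rng = np.random.default_rng(3)
receptors = [(x, y) for x in (200, 400, 600) for y in (-100, 0, 100)]
truth = (5.0, 50.0, 20.0)             # q, xs, ys
obs = np.array([plume(*truth, xr, yr) for xr, yr in receptors])
obs *= 1 + 0.05 * rng.standard_normal(obs.size)

best, best_cost = None, np.inf
for xs in range(0, 151, 10):          # exhaustive search over source location
    for ys in range(-60, 61, 10):
        g = np.array([plume(1.0, xs, ys, xr, yr) for xr, yr in receptors])
        if not g.any():
            continue
        q = g @ obs / (g @ g)         # least-squares strength given location
        cost = np.sum((obs - q * g) ** 2)
        if cost < best_cost:
            best, best_cost = (q, xs, ys), cost
print("estimated (q, xs, ys):", best, " truth:", truth)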
NASA Astrophysics Data System (ADS)
Koltunov, A.; Quayle, B.; Prins, E. M.; Ambrosia, V. G.; Ustin, S.
2014-12-01
Fire managers at various levels require near-real-time, low-cost, systematic, and reliable early detection capabilities with minimal latency to effectively respond to wildfire ignitions and minimize the risk of catastrophic development. The GOES satellite images collected over vast territories at high temporal frequency provide a consistent and reliable source for operational active fire mapping realized by the WF-ABBA algorithm. However, their potential to provide early warning or rapid confirmation of initial fire ignition reports from conventional sources remains underutilized, partly because operational wildfire detection has been optimized for users and applications for which timeliness of initial detection is a low priority, in contrast to the needs of first responders. We present our progress in developing the GOES Early Fire Detection (GOES-EFD) system, a collaborative effort led by the University of California-Davis and the USDA Forest Service. GOES-EFD focuses specifically on first-detection timeliness for wildfire incidents. It is automatically trained for a monitored scene and capitalizes on multiyear cross-disciplinary algorithm research. Initial retrospective tests in the Western US demonstrate significantly earlier detection of new ignitions than existing operational capabilities, with prospects for further improvement. The GOES-EFD-β prototype will be initially deployed for the Western US region to process imagery from GOES-NOP and the more rapid, four-times-higher spatial resolution imagery from GOES-R, the upcoming next generation of GOES satellites. These and other enhanced capabilities of GOES-R are expected to significantly improve the timeliness of fire ignition information from GOES-EFD.
NASA Astrophysics Data System (ADS)
Gibson, J. Murray
2009-05-01
Probably the most prolific use of large accelerators today is in the creation of bright beams of x-ray photons or neutrons. The number of scientific users of such sources in the US alone is approaching 10,000. I will describe some of the major applications of synchrotron and neutron radiation and their impact on society. If you have AIDS, need a better iPod or a more efficient car, or want to clean up a Superfund site, you are benefiting from these accelerators. The design of new materials is becoming more and more dependent on structural information from these sources. I will identify the trends in applications that are demanding new sources with greater capabilities.
The Need for Vendor Source Code at NAS. Revised
NASA Technical Reports Server (NTRS)
Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
The Numerical Aerodynamic Simulation (NAS) Facility has a long-standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS needs to maintain support capabilities when vendors are not able, to diagnose and remedy hardware or software problems where applicable, and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.
Strategic Sourcing of R&D: The Determinants of Success
NASA Astrophysics Data System (ADS)
Brook, Jacques W.; Plugge, Albert
The outsourcing of the R&D function is an emerging practice of corporate firms. In their attempt to reduce the increasing cost of research and technology development, firms are strategically outsourcing the R&D function or repositioning their internal R&D organisation. By doing so, they are able to benefit from other technology sources around the world. So far, there is only limited research on how firms develop their R&D sourcing strategies and how these strategies are implemented. This study aims to identify which determinants contribute to the success of R&D sourcing strategies. The results of our empirical research indicate that a clear vision of how to manage innovation strategically on a corporate level is a determinant of an effective R&D strategy. Moreover, our findings revealed that the R&D sourcing strategy influences a firm's sourcing capabilities. These sourcing capabilities need to be developed to manage the demand as well as the supply of R&D services. The alignment between the demand capabilities and the supply capabilities contributes to the success of R&D sourcing.
Sense and Respond Logistics: Integrating Prediction, Responsiveness, and Control Capabilities
2006-01-01
logistics; SAR: sense and respond; SCM: Supply Chain Management; SCN: Supply Chain Network; SIDA: sense, interpret, decide, act; SOS: source of supply; TCN: ...commodity supply chain management (SCM), will have WS-SCMs that focus on integrating information for a particular MDS. In the remainder of this...developed applications of ABMs for SCM. Applications of Agents and Agent-Based Modeling: Agents have been used in telecommunications, e-commerce
Building Software Agents for Planning, Monitoring, and Optimizing Travel
2004-01-01
defined as plans in the Theseus Agent Execution language (Barish et al. 2002). In the Web environment, sources can be quite slow and the latencies of...executor is based on a dataflow paradigm, actions are executed as soon as the data becomes available. Second, Theseus performs the actions in a...while Theseus provides an expressive language for defining information gathering and monitoring plans. The Theseus language supports capabilities
Articulation Management for Intelligent Integration of Information
NASA Technical Reports Server (NTRS)
Maluf, David A.; Tran, Peter B.; Clancy, Daniel (Technical Monitor)
2001-01-01
When combining data from distinct sources, there is a need to share meta-data and other knowledge about the various source domains. Due to semantic inconsistencies and heterogeneity of representations, problems arise when multiple domains are merged: knowledge that is irrelevant to the task of interoperation is included, making the result unnecessarily complex. This heterogeneity problem can be eliminated by mediating the conflicts and managing the intersections of the domains. For interoperation and intelligent access to heterogeneous information, the focus is on the intersection of the knowledge, since the intersection defines the required articulation rules. An algebra over domains has been proposed that uses articulation rules to support disciplined manipulation of domain knowledge resources. The objective of a domain algebra is to provide the capability for interrogating many domain knowledge resources, which are largely semantically disjoint. The algebra formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper presents a domain algebra and demonstrates the use of articulation rules to link declarative interfaces for Internet and enterprise applications. In particular, it discusses the articulation implementation as part of a production system capable of operating over the domain described by the IDL (interface description language) of objects registered in multiple CORBA servers.
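A toy rendering of articulation-based operations over two domains follows; the term sets and rules are hypothetical, and this only gestures at the formal algebra.

# Two domain vocabularies that share knowledge only through explicit
# articulation rules declared by a domain expert.
geo = {"well", "borehole", "seismic-line", "basin"}
eng = {"wellbore", "casing", "drill-bit", "platform"}
rules = {("well", "wellbore")}        # declared cross-domain equivalences

def intersect(d1, d2, rules):
    # Keep only the term pairs the articulation rules declare equivalent.
    return {(a, b) for (a, b) in rules if a in d1 and b in d2}

def union(d1, d2, rules):
    # Merge the domains, collapsing articulated pairs onto one term
    # (here the first domain's vocabulary wins for shared concepts).
    merged = set(d1) | set(d2)
    for a, b in intersect(d1, d2, rules):
        merged.discard(b)
    return merged

print(intersect(geo, eng, rules))     # {('well', 'wellbore')}
print(sorted(union(geo, eng, rules)))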
Stratospheric Aerosol and Gas Experiment (SAGE 3)
NASA Technical Reports Server (NTRS)
Mccormick, M. P.
1993-01-01
The proposed SAGE III instrument would be the principal source of data for global changes of stratospheric aerosols, stratospheric water vapor, and ozone profiles, and a contributing source of data for upper tropospheric water vapor, aerosols, and clouds. The ability to obtain such data has been demonstrated by the predecessor instrument, SAGE II, but SAGE III will be substantially more capable, as discussed below. The capabilities for monitoring the profiles of atmospheric constituents have been verified in detail, including ground-based validations, for aerosol, ozone, and water vapor. Indeed, because of its self-calibrating characteristics, SAGE II was an essential component of the international ozone trend assessments, and SAGE II is now proving to be invaluable in tracking the aerosols from Mt. Pinatubo. Although SAGE profiles generally terminate at the height of the first tropospheric cloud layer, it has been found that the measurements extend down to 3 km altitude more than 40 percent of the time at most latitudes. Thus, useful information can also be obtained on upper tropospheric aerosols, water vapor, and ozone.
JAMSS: proteomics mass spectrometry simulation in Java.
Smith, Rob; Prince, John T
2015-03-01
Countless proteomics data processing algorithms have been proposed, yet few have been critically evaluated due to a lack of labeled data (data with known identities and quantities). Although labeling techniques exist, they are limited in terms of confidence and accuracy. In silico simulators have recently been used to create complex data with known identities and quantities. We propose the Java Mass Spectrometry Simulator (JAMSS): a fast, self-contained in silico simulator capable of generating simulated MS and LC-MS runs while providing meta-information on the provenance of each generated signal. JAMSS improves upon previous in silico simulators in terms of its ease of installation, minimal parameters, graphical user interface, multithreading capability, retention time shift model, and reproducibility. The simulator outputs mzML 1.1.0. It is open source software licensed under the GPLv3. The software and source are available at https://github.com/optimusmoose/JAMSS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
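A drastically simplified taste of in silico LC-MS simulation: one analyte as a Gaussian elution profile crossed with a Gaussian isotope envelope, with the provenance of every signal known by construction. JAMSS models far more, and every constant here is an assumption.

import numpy as np

mz = np.arange(499.0, 504.0, 0.01)           # m/z grid
rts = np.arange(0.0, 60.0, 1.0)              # retention time, seconds
mono, z = 500.0, 1                           # monoisotopic m/z, charge (toy)
iso_int = np.array([1.0, 0.6, 0.25, 0.08])   # crude isotope abundances

def spectrum(rt, rt_apex=30.0, rt_sigma=5.0, mz_sigma=0.02):
    # Chromatographic intensity at this retention time
    elution = np.exp(-0.5 * ((rt - rt_apex) / rt_sigma) ** 2)
    s = np.zeros_like(mz)
    for k, a in enumerate(iso_int):          # isotopes spaced by ~1.003/z
        s += a * np.exp(-0.5 * ((mz - (mono + 1.003 * k / z)) / mz_sigma) ** 2)
    return elution * s

run = np.stack([spectrum(rt) for rt in rts])  # (time, m/z) LC-MS map
print(run.shape, run.max())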
Granero, Luis; Zalevsky, Zeev; Micó, Vicente
2011-04-01
We present a new implementation capable of producing two-dimensional (2D) superresolution (SR) imaging in a single exposure by aperture synthesis in digital lensless Fourier holography when using angular multiplexing provided by a vertical cavity surface-emitting laser source array. The system performs the recording in a single CCD snapshot of a multiplexed hologram coming from the incoherent addition of multiple subholograms, where each contains information about a different 2D spatial frequency band of the object's spectrum. Thus, a set of nonoverlapping bandpass images of the input object can be recovered by Fourier transformation (FT) of the multiplexed hologram. The SR is obtained by coherent addition of the information contained in each bandpass image while generating an enlarged synthetic aperture. Experimental results demonstrate improvement in resolution and image quality.
Materials identification using a small-scale pixellated x-ray diffraction system
NASA Astrophysics Data System (ADS)
O'Flynn, D.; Crews, C.; Drakos, I.; Christodoulou, C.; Wilson, M. D.; Veale, M. C.; Seller, P.; Speller, R. D.
2016-05-01
A transmission x-ray diffraction system has been developed using a pixellated, energy-resolving detector (HEXITEC) and a small-scale, mains operated x-ray source (Amptek Mini-X). HEXITEC enables diffraction to be measured without the requirement of incident spectrum filtration, or collimation of the scatter from the sample, preserving a large proportion of the useful signal compared with other diffraction techniques. Due to this efficiency, sufficient molecular information for material identification can be obtained within 5 s despite the relatively low x-ray source power. Diffraction data are presented from caffeine, hexamine, paracetamol, plastic explosives and narcotics. The capability to determine molecular information from aspirin tablets inside their packaging is demonstrated. Material selectivity and the potential for a sample classification model is shown with principal component analysis, through which each different material can be clearly resolved.
Oxygen isotopes as a tracer of phosphate sources and cycling in aquatic systems (Invited)
NASA Astrophysics Data System (ADS)
Young, M. B.; Kendall, C.; Paytan, A.
2013-12-01
The oxygen isotopic composition of phosphate can provide valuable information about sources and processes affecting phosphorus as it moves through hydrologic systems. Applications of this technique in soil and water have become more common in recent years due to improvements in extraction methods and instrument capabilities, and studies in multiple aquatic environments have demonstrated that some phosphorus sources may have distinct isotopic compositions within a given system. Under normal environmental conditions, the oxygen-phosphorus bonds in dissolved inorganic phosphate (DIP) can only be broken by enzymatic activity. Biological cycling of DIP will bring the phosphate oxygen into a temperature-dependent equilibrium with the surrounding water, overprinting any existing isotopic source signals. However, studies conducted in a wide range of estuarine, freshwater, and groundwater systems have found that the phosphate oxygen is often out of biological equilibrium with the water, suggesting that it is common for at least a partial isotopic source signal to be retained in aquatic systems. Oxygen isotope analysis on various potential phosphate sources such as synthetic and organic fertilizers, animal waste, detergents, and septic/wastewater treatment plant effluents show that these sources span a wide range of isotopic compositions, and although there is considerable overlap between the source groups, sources may be isotopically distinct within a given study area. Recent soil studies have shown that isotopic analysis of phosphate oxygen is also useful for understanding microbial cycling across different phosphorus pools, and may provide insights into controls on phosphorus leaching. Combining stable isotope information from soil and water studies will greatly improve our understanding of complex phosphate cycling, and the increasing use of this isotopic technique across different environments will provide new information regarding anthropogenic phosphate inputs and controls on biological cycling within hydrologic systems.
Mideksa, K G; Singh, A; Hoogenboom, N; Hellriegel, H; Krause, H; Schnitzler, A; Deuschl, G; Raethjen, J; Schmidt, G; Muthuraman, M
2016-08-01
One of the most commonly used therapies for patients with Parkinson's disease (PD) is deep brain stimulation (DBS) of the subthalamic nucleus (STN). Identifying the most optimal target area for the placement of the DBS electrodes has become an intensive research area. In this study, the first aim is to investigate the capabilities of different source-analysis techniques in detecting deep sources located at the sub-cortical level, validating them using the a priori information about the location of the source, that is, the STN. Secondly, we aim to investigate whether EEG or MEG is best suited to mapping the DBS-induced brain activity. To do this, simultaneous EEG and MEG measurements were used to record the DBS-induced electromagnetic potentials and fields. The boundary-element method (BEM) was used to solve the forward problem. The position of the DBS electrodes was then estimated using dipole (moving, rotating, and fixed MUSIC) and current-density-reconstruction (CDR) (minimum-norm and sLORETA) approaches. The source-localization results from the dipole approaches demonstrated that the fixed MUSIC algorithm best localizes deep focal sources, whereas the moving dipole detects not only the region of interest but also neighboring regions that are affected by stimulating the STN. The results from the CDR approaches validated the capability of sLORETA in detecting the STN compared to minimum-norm. Moreover, the source-localization results using the EEG modality outperformed those of the MEG by locating the DBS-induced activity in the STN.
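The contrast between a plain minimum-norm estimate and sLORETA can be shown numerically. Below is a minimal sketch assuming a random lead-field matrix and one deep active source; sLORETA is approximated by standardizing the minimum-norm estimate with the diagonal of the resolution matrix, which is a textbook construction rather than the authors' full pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 500
L = rng.standard_normal((n_sensors, n_sources))   # toy lead-field matrix

# Simulate one deep active source plus sensor noise.
j_true = np.zeros(n_sources)
j_true[123] = 1.0
b = L @ j_true + 0.05 * rng.standard_normal(n_sensors)

lam = 1e-2 * np.trace(L @ L.T) / n_sensors        # regularization level
G = L @ L.T + lam * np.eye(n_sensors)             # regularized Gram matrix
W = L.T @ np.linalg.inv(G)                        # minimum-norm inverse operator

j_mne = W @ b                                     # minimum-norm estimate
R = W @ L                                         # resolution matrix
j_slo = j_mne / np.sqrt(np.diag(R))               # sLORETA-style standardization

print(np.argmax(np.abs(j_mne)), np.argmax(np.abs(j_slo)))  # recovered indices
```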
NASA Astrophysics Data System (ADS)
Yu, Xiaojun; Liu, Xinyu; Chen, Si; Wang, Xianghong; Liu, Linbo
2016-03-01
High-resolution optical coherence tomography (OCT) is of critical importance to disease diagnosis because it is capable of providing detailed microstructural information about biological tissues. However, a compromise usually has to be made between its spatial resolutions and sensitivity due to the suboptimal spectral response of the system components, such as the linear camera, the dispersion grating, and the focusing lenses. In this study, we demonstrate an OCT system that achieves both high spatial resolutions and enhanced sensitivity by utilizing a spectrally encoded source. The system achieves a lateral resolution of 3.1 μm and an axial resolution of 2.3 μm in air; when a simple dispersive prism is placed in the infinity space of the sample arm optics, the illumination beam on the sample is transformed into a line source with a visual angle of 10.3 mrad. Such an extended source technique allows a ~4 times larger maximum permissible exposure (MPE) than its point source counterpart, which improves the system sensitivity by ~6 dB. In addition, the dispersive prism can be conveniently switched to a reflector. Such flexibility helps increase the penetration depth of the system without increasing the complexity of current point source devices. We conducted experiments to characterize the system's imaging capability using the human fingertip in vivo and the swine eye optic nerve disc ex vivo. The higher penetration depth of such a system over the conventional point source OCT system is also demonstrated in these two tissues.
Information fusion: telling the story (or threat narrative)
NASA Astrophysics Data System (ADS)
Fenstermacher, Laurie
2014-06-01
Today's operators face a "double whammy": the need to process increasing amounts of information, including "Twitter-INT" [1] (social information such as Facebook, YouTube videos, blogs, Twitter), as well as the need to discern threat signatures in new security environments, including those in which the airspace is contested. To do this will require the Air Force to "fuse and leverage its vast capabilities in new ways" [2]. For starters, the integration of quantitative and qualitative information must be done in a way that preserves important contextual information, since the goal increasingly is to identify and mitigate violence before it occurs. To do so requires a more nuanced understanding of the environment being sensed, including the human environment, ideally from the "emic" perspective; that is, from the perspective of that individual or group. This requires not only data and information that inform the understanding of how the individuals and/or groups see themselves and others (social identity) but also information on how that identity filters information in their environment which, in turn, shapes their behaviors [3]. The goal is to piece together the individual and/or collective narratives regarding threat, the threat narrative, from various sources of information. Is there a threat? If so, what is it? What is motivating the threat? What is the intent of those who pose the threat, and what are their capabilities and their vulnerabilities? [4] This paper will describe preliminary investigations regarding the application of a prototype hybrid information fusion method based on the threat narrative framework.
Ortega-Martorell, Sandra; Ruiz, Héctor; Vellido, Alfredo; Olier, Iván; Romero, Enrique; Julià-Sapé, Margarida; Martín, José D.; Jarman, Ian H.; Arús, Carles; Lisboa, Paulo J. G.
2013-01-01
Background: The clinical investigation of human brain tumors often starts with a non-invasive imaging study, providing information about the tumor extent and location, but little insight into the biochemistry of the analyzed tissue. Magnetic Resonance Spectroscopy can complement imaging by supplying a metabolic fingerprint of the tissue. This study analyzes single-voxel magnetic resonance spectra, which represent signal information in the frequency domain. Given that a single voxel may contain a heterogeneous mix of tissues, signal source identification is a relevant challenge for the problem of tumor type classification from the spectroscopic signal. Methodology/Principal Findings: Non-negative matrix factorization techniques have recently shown their potential for the identification of meaningful sources from brain tissue spectroscopy data. In this study, we use a convex variant of these methods that is capable of handling negatively-valued data and generating sources that can be interpreted as tumor class prototypes. A novel approach to convex non-negative matrix factorization is proposed, in which prior knowledge about class information is utilized in model optimization. Class-specific information is integrated into this semi-supervised process by setting the metric of a latent variable space where the matrix factorization is carried out. The reported experimental study comprises 196 cases from different tumor types drawn from two international, multi-center databases. The results indicate that the proposed approach outperforms a purely unsupervised process by achieving near perfect correlation of the extracted sources with the mean spectra of the tumor types. It also improves tissue type classification. Conclusions/Significance: We show that source extraction by unsupervised matrix factorization benefits from the integration of the available class information, so operating in a semi-supervised learning manner, for discriminative source identification and brain tumor labeling from single-voxel spectroscopy data. We are confident that the proposed methodology has wider applicability for biomedical signal processing. PMID:24376744
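For illustration, a generic convex NMF can be written as a projected-gradient factorization X ≈ XWGᵀ with nonnegative W and G, so each extracted source (a column of XW) is built from actual spectra and can be read as a class prototype while X itself may contain negative values. This sketch is a plain unsupervised variant with synthetic data, not the semi-supervised, metric-based algorithm proposed in the paper.

```python
import numpy as np

def convex_nmf(X, k, n_iter=2000, lr=1e-4, seed=0):
    """Minimize ||X - X W G^T||^2 over W, G >= 0 by projected gradient."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = np.abs(rng.standard_normal((n, k))) / n
    G = np.abs(rng.standard_normal((n, k))) / n
    A = X.T @ X                                   # data Gram matrix
    for _ in range(n_iter):                       # crude fixed step; tune per dataset
        grad_G = -2 * A @ W + 2 * G @ (W.T @ A @ W)
        G = np.clip(G - lr * grad_G, 0, None)
        grad_W = -2 * A @ G + 2 * A @ W @ (G.T @ G)
        W = np.clip(W - lr * grad_W, 0, None)
    return W, G                                   # prototypes: X @ W; weights: rows of G

X = np.random.default_rng(1).standard_normal((200, 60))  # 60 spectra, 200 bins
W, G = convex_nmf(X, k=3)
print(((X - X @ W @ G.T) ** 2).mean())            # reconstruction error
```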
A Survey of Knowledge Management Research & Development at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Clancy, Daniel (Technical Monitor)
2002-01-01
This chapter catalogs knowledge management research and development activities at NASA Ames Research Center as of April 2002. A general categorization scheme for knowledge management systems is first introduced. This categorization scheme divides knowledge management capabilities into five broad categories: knowledge capture, knowledge preservation, knowledge augmentation, knowledge dissemination, and knowledge infrastructure. Each of nearly 30 knowledge management systems developed at Ames is then classified according to this system. Finally, a capsule description of each system is presented along with information on deployment status, funding sources, contact information, and both published and internet-based references.
Observability-Based Guidance and Sensor Placement
NASA Astrophysics Data System (ADS)
Hinson, Brian T.
Control system performance is highly dependent on the quality of sensor information available. In a growing number of applications, however, the control task must be accomplished with limited sensing capabilities. This thesis addresses these types of problems from a control-theoretic point-of-view, leveraging system nonlinearities to improve sensing performance. Using measures of observability as an information quality metric, guidance trajectories and sensor distributions are designed to improve the quality of sensor information. An observability-based sensor placement algorithm is developed to compute optimal sensor configurations for a general nonlinear system. The algorithm utilizes a simulation of the nonlinear system as the source of input data, and convex optimization provides a scalable solution method. The sensor placement algorithm is applied to a study of gyroscopic sensing in insect wings. The sensor placement algorithm reveals information-rich areas on flexible insect wings, and a comparison to biological data suggests that insect wings are capable of acting as gyroscopic sensors. An observability-based guidance framework is developed for robotic navigation with limited inertial sensing. Guidance trajectories and algorithms are developed for range-only and bearing-only navigation that improve navigation accuracy. Simulations and experiments with an underwater vehicle demonstrate that the observability measure allows tuning of the navigation uncertainty.
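As an illustration of an observability measure of the kind this work builds on, the sketch below computes an empirical observability Gramian by perturbing each initial state and accumulating output differences along simulated trajectories. The dynamics, measurement model, and discretization are assumptions chosen for the example (a planar vehicle with a range-only measurement, echoing the range-only navigation problem), not the thesis's models.

```python
import numpy as np

def empirical_obs_gramian(f, h, x0, eps=1e-4, T=200, dt=0.01):
    """Empirical observability Gramian via central differences on initial states."""
    n = len(x0)
    def simulate(x):
        x, ys = np.array(x, dtype=float), []
        for _ in range(T):
            ys.append(h(x))
            x = x + dt * f(x)                 # forward-Euler integration
        return np.array(ys)
    dY = []
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        dY.append((simulate(x0 + e) - simulate(x0 - e)) / (2 * eps))
    dY = np.stack(dY)                         # shape (n, T, n_outputs)
    return np.einsum('itk,jtk->ij', dY, dY) * dt

# Example: unicycle with slow turn rate, range-only measurement to the origin.
f = lambda x: np.array([np.cos(x[2]), np.sin(x[2]), 0.1])
h = lambda x: np.array([np.hypot(x[0], x[1])])
W = empirical_obs_gramian(f, h, x0=np.array([2.0, 0.0, 0.0]))
print(np.linalg.cond(W))                      # conditioning = information quality
```

A well-conditioned Gramian along a candidate trajectory indicates that all states are distinguishable from the available measurements, which is the criterion the guidance and sensor-placement designs optimize.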
Interactive visual comparison of multimedia data through type-specific views
NASA Astrophysics Data System (ADS)
Burtner, Russ; Bohn, Shawn; Payne, Debbie
2013-01-01
Analysts who work with collections of multimedia to perform information foraging understand how difficult it is to connect information across diverse sets of mixed media. The wealth of information from blogs, social media, and news sites often can provide actionable intelligence; however, many of the tools used on these sources of content are not capable of multimedia analysis because they only analyze a single media type. As such, analysts are taxed to keep a mental model of the relationships among each of the media types when generating the broader content picture. To address this need, we have developed Canopy, a novel visual analytic tool for analyzing multimedia. Canopy provides insight into the multimedia data relationships by exploiting the linkages found in text, images, and video co-occurring in the same document and across the collection. Canopy connects derived and explicit linkages and relationships through multiple connected visualizations to aid analysts in quickly summarizing, searching, and browsing collected information to explore relationships and align content. In this paper, we will discuss the features and capabilities of the Canopy system and walk through a scenario illustrating how this system might be used in an operational environment.
Results and Analysis of the Infrastructure Request for Information (DE-SOL-0008318)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heidrich, Brenden John
2015-07-01
The Department of Energy (DOE) Office of Nuclear Energy (NE) released a request for information (RFI) (DE-SOL-0008318) for “University, National Laboratory, Industry and International Input on Potential Office of Nuclear Energy Infrastructure Investments” on April 13, 2015. DOE-NE solicited information on five specific types of capabilities as well as any others suggested by the community. The RFI proposal period closed on June 19, 2015. From the 26 responses, 34 individual proposals were extracted. Eighteen were associated with a DOE national laboratory, including Argonne National Laboratory (ANL), Brookhaven National Laboratory (BNL), Idaho National Laboratory (INL), Los Alamos National Laboratory (LANL), Pacific Northwest National Laboratory (PNNL) and Sandia National Laboratory (SNL). Oak Ridge National Laboratory (ORNL) was referenced in a proposal as a proposed capability location, although the proposal did not originate with ORNL. Five US universities submitted proposals (Massachusetts Institute of Technology, Pennsylvania State University, Rensselaer Polytechnic Institute, University of Houston and the University of Michigan). Three industrial/commercial institutions submitted proposals (AREVA NP, Babcock and Wilcox (B&W) and the Electric Power Research Institute (EPRI)). Eight major themes emerged from the submissions as areas needing additional capability or support for existing capabilities. Two submissions supported multiple areas. The major themes are: Advanced Manufacturing (AM), High Performance Computing (HPC), Ion Irradiation with X-Ray Diagnostics (IIX), Ion Irradiation with TEM Visualization (IIT), Radiochemistry Laboratories (RCL), Test Reactors, Neutron Sources and Critical Facilities (RX), Sample Preparation and Post-Irradiation Examination (PIE), and Thermal-Hydraulics Test Facilities (THF).
Configuration of electro-optic fire source detection system
NASA Astrophysics Data System (ADS)
Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir
2007-04-01
The recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on one hand and fast "sensor to shooter cycle" capabilities on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage hostile fire sources with a minimum of casualties to friendly forces and innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and offers excellent future growth and upgrade potential. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) long-range, fast, and accurate fire source detection capability; II) detection and classification capability for different threats; III) threat investigation capability; IV) fire source data distribution capability (location, direction, video image, voice); V) man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) Electro-Optical Unit, including FLIR camera, CCD camera, laser range finder and marker; II) Electronic Unit, including system computer and electronics; III) Controller Station Unit, including the HMI of the system. This article discusses the system's component definition and optimization processes, and also shows how SPOTLITE's designers successfully managed to introduce excellent solutions for other system parameters.
Astrophysics Source Code Library Enhancements
NASA Astrophysics Data System (ADS)
Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.
2015-09-01
The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.
3-D interactive visualisation tools for HI spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
Shaping the future through innovations: From medical imaging to precision medicine.
Comaniciu, Dorin; Engel, Klaus; Georgescu, Bogdan; Mansi, Tommaso
2016-10-01
Medical images constitute a source of information essential for disease diagnosis, treatment and follow-up. In addition, due to its patient-specific nature, imaging information represents a critical component required for advancing precision medicine into clinical practice. This manuscript describes recently developed technologies for better handling of image information: photorealistic visualization of medical images with Cinematic Rendering, artificial agents for in-depth image understanding, support for minimally invasive procedures, and patient-specific computational models with enhanced predictive power. Throughout the manuscript we will analyze the capabilities of such technologies and extrapolate on their potential impact to advance the quality of medical care, while reducing its cost.
Wildlife habitat evaluation demonstration project. [Michigan
NASA Technical Reports Server (NTRS)
Burgoyne, G. E., Jr.; Visser, L. G.
1981-01-01
To support the deer range improvement project in Michigan, the capability of LANDSAT data to assess deer habitat in terms of areas and mixes of species and age classes of vegetation is being examined to determine whether such data could substitute for traditional cover type information sources. A second goal of the demonstration project is to determine whether LANDSAT data can be used to supplement and improve the information normally used for making deer habitat management decisions, either by providing vegetative cover information for private land or by providing information about the interspersion and juxtaposition of valuable vegetative cover types. The procedure to be used for evaluating LANDSAT data of the Lake County test site is described.
HIghZ: A search for HI absorption in high-redshift radio galaxies
NASA Astrophysics Data System (ADS)
Allison, J.; Callingham, J.; Sadler, E.; Wayth, R.; Curran, S.; Mahoney, E.
2017-01-01
We will use the unique low-frequency spectral capability of the MWA to carry out a pilot survey for neutral gas in the interstellar medium of the most distant (z>5) radio galaxies in the Universe. Through detection of the HI 21-cm line in absorption we aim to place stringent lower limits on the source redshift, confirming its location in the early Universe. Our sample makes use of the excellent wide-band spectral information available from the recently completed MWA GLEAM survey, from which we have selected a sample of ultra-steep peaked-spectrum radio sources that have a spectral turnover below 300 MHz. These sources should be ideal candidates for high-redshift compact radio galaxies since they have (a) spectral peaks that turn over below 1 GHz and (b) very steep (alpha < -1.0) spectral indices that are consistent with the high-density environments expected for radio galaxies in the early Universe. Using the MWA, we aim to verify this hypothesis through the detection of significant column densities of cold HI. This pathfinder project will provide important technical information that will inform future absorption surveys both with the MWA and, ultimately, the SKA-LOW telescope.
2011-05-01
Space... space operations depend. GAO was asked to (1) review key systems being planned and acquired to provide SSA, and their progress meeting cost...
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
1998-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter's column will include some announcements and some recent radiation test results and evaluations of interest. Specifically, the following topics will be covered: the Military and Aerospace Applications of Programmable Devices and Technologies Conference to be held at GSFC in September, 1998, proton test results, and some total dose results.
Health Security Intelligence: Assessing the Nascent Public Health Capability
2012-03-01
...and one that, although dedicated as an HSI analyst, did not work a full 40-hour workweek. Of the three jurisdictions that answered yes to Question 6... during the standard 40-hour workweek is well established. In her book Out of Bounds: Innovation and Change in Law Enforcement Intelligence Analysis...
Manning the Next Unmanned Air Force: Developing RPA Pilots of the Future
2013-08-01
...is essential since "natural human capacities are becoming mismatched to the enormous data volumes, processing capabilities, and decision speeds that..." ...screening criteria, the tests "are a rich source of information on the attributes of the candidate and have been used to construct a composite..." ...against terrorism than any manned aircraft. From a recruiting point, it is also critical to reach out to this generation of millennials that have a...
SmartR: an open-source platform for interactive visual analytics for translational research data
Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard
2017-01-01
Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data combined with strong visual analytical capabilities will significantly accelerate the scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART, that equips the platform not only with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291
JEFX 10 demonstration of Cooperative Hunter Killer UAS and upstream data fusion
NASA Astrophysics Data System (ADS)
Funk, Brian K.; Castelli, Jonathan C.; Watkins, Adam S.; McCubbin, Christopher B.; Marshall, Steven J.; Barton, Jeffrey D.; Newman, Andrew J.; Peterson, Cammy K.; DeSena, Jonathan T.; Dutrow, Daniel A.; Rodriguez, Pedro A.
2011-05-01
The Johns Hopkins University Applied Physics Laboratory deployed and demonstrated a prototype Cooperative Hunter Killer (CHK) Unmanned Aerial System (UAS) capability and a prototype Upstream Data Fusion (UDF) capability as participants in the Joint Expeditionary Force Experiment 2010 in April 2010. The CHK capability was deployed at the Nevada Test and Training Range to prosecute a convoy protection operational thread. It used mission-level autonomy (MLA) software applied to a networked swarm of three Raven hunter UAS and a Procerus Miracle surrogate killer UAS, all equipped with full motion video (FMV). The MLA software provides the capability for the hunter-killer swarm to autonomously search an area or road network, divide the search area, deconflict flight paths, and maintain line of sight communications with mobile ground stations. It also provides an interface for an operator to designate a threat and initiate automatic engagement of the target by the killer UAS. The UDF prototype was deployed at the Maritime Operations Center at Commander Second Fleet, Naval Station Norfolk to provide intelligence analysts and the ISR commander with a common fused track picture from the available FMV sources. It consisted of a video exploitation component that automatically detected moving objects, a multiple hypothesis tracker that fused all of the detection data to produce a common track picture, and a display and user interface component that visualized the common track picture along with appropriate geospatial information such as maps and terrain as well as target coordinates and the source video.
Securing internet by eliminating DDOS attacks
NASA Astrophysics Data System (ADS)
Niranchana, R.; Gayathri Devi, N.; Santhi, H.; Gayathri, P.
2017-11-01
The major threat to the authorised usage of the Internet is the Distributed Denial of Service (DDoS) attack. Mechanisms for preventing DDoS attacks must overcome the attacker's ability to spoof the source addresses of IP packets. By utilising IP spoofing, attackers impose a consequential load on the destination network and complicate the policing of attack packets. To reduce the level of IP spoofing on the Internet, we propose an Inter-domain Packet Filter (IPF) architecture. The proposed scheme is not based on global routing information. The IPF framework operates such that packets with reliable source addresses are not rejected. IPF confines the spoofing capability of attackers, and the filter also localizes the source of an attack packet to a minimal number of candidate networks.
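A toy model can make the filtering idea concrete. In the sketch below, a node records, per neighbor, the source prefixes whose routes could feasibly arrive through that neighbor and drops only packets failing this feasibility test, so packets with reliable source addresses are never rejected. The class and method names are hypothetical; a real inter-domain filter would derive feasibility from BGP routing announcements rather than a manually populated table.

```python
from ipaddress import ip_address, ip_network

class InterDomainPacketFilter:
    def __init__(self):
        self.feasible = {}   # neighbor AS number -> list of feasible source prefixes

    def learn_route(self, neighbor, prefix):
        """Record that `prefix` can feasibly be sourced via `neighbor`."""
        self.feasible.setdefault(neighbor, []).append(ip_network(prefix))

    def accept(self, neighbor, src):
        """Accept a packet only if its source address is feasible on this link."""
        src = ip_address(src)
        return any(src in p for p in self.feasible.get(neighbor, []))

idpf = InterDomainPacketFilter()
idpf.learn_route(neighbor=64500, prefix="203.0.113.0/24")
print(idpf.accept(64500, "203.0.113.7"))   # True: feasible source address
print(idpf.accept(64500, "198.51.100.9"))  # False: likely spoofed
```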
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
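As a small illustration of the seamless-API idea, a client can query such a catalog over HTTP and iterate through results. The endpoint and parameter names below are assumptions for the example, not a documented contract.

```python
import requests

# Hedged sketch of querying a catalog-style REST API such as the one
# ScienceBase exposes; endpoint and parameters are assumed for illustration.
resp = requests.get(
    "https://www.sciencebase.gov/catalog/items",    # assumed endpoint
    params={"q": "water quality", "format": "json", "max": 5},
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("items", []):           # assumed response shape
    print(item.get("id"), "-", item.get("title"))
```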
Distribution of immunodeficiency fact files with XML--from Web to WAP.
Väliaho, Jouni; Riikonen, Pentti; Vihinen, Mauno
2005-06-26
Although biomedical information is growing rapidly, it is difficult to find and retrieve validated data, especially for rare hereditary diseases. There is an increased need for services capable of integrating and validating information as well as providing it in a logically organized structure. An XML-based language enables the creation of open source databases for storage, maintenance and delivery across different platforms. Here we present a new data model called the fact file and an XML-based specification, Inherited Disease Markup Language (IDML), that were developed to facilitate disease information integration, storage and exchange. The data model was applied to primary immunodeficiencies, but it can be used for any hereditary disease. Fact files integrate biomedical, genetic and clinical information related to hereditary diseases. IDML and fact files were used to build a comprehensive Web- and WAP-accessible knowledge base, the ImmunoDeficiency Resource (IDR), available at http://bioinf.uta.fi/idr/. A fact file is a user-oriented interface that serves as a starting point for exploring information on hereditary diseases. IDML enables the seamless integration and presentation of genetic and disease information resources on the Internet. IDML can be used to build information services for all kinds of inherited diseases. The open source specification and related programs are available at http://bioinf.uta.fi/idml/.
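A minimal sketch of what a fact file might look like, built with Python's standard XML tooling; the element names are hypothetical stand-ins for illustration, not the published IDML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical fact-file structure integrating genetic and clinical
# information for one hereditary disease (element names are illustrative).
fact = ET.Element("factfile", disease="X-linked agammaglobulinemia")
gene = ET.SubElement(fact, "gene", symbol="BTK")
ET.SubElement(gene, "location").text = "Xq22.1"
clinical = ET.SubElement(fact, "clinical")
ET.SubElement(clinical, "feature").text = "recurrent bacterial infections"
ET.SubElement(fact, "resource", href="http://bioinf.uta.fi/idr/")

print(ET.tostring(fact, encoding="unicode"))
```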
[Meaningful words? Cancer screening communication in Italy].
Cogo, Carla; Petrella, Marco
2012-01-01
Over the last ten years, Italian work groups on communication within the National Centre for Screening Monitoring have been working on various aspects of communication in screening: quality surveys, information materials, guidelines, websites, and training. This has been done taking into account that good quality information must be clear, accessible, up to date, evidence based, clear about its limitations, and capable of indicating further sources of information. Whenever possible, information has been developed in collaboration with the target groups: citizens, but also health professionals. However, if good quality information must be clear about benefits and harms, the communication of quantitative information is particularly complex in cancer screening. Moreover, receiving more information on risks and benefits does not seem to modify participation. In addition, more balanced information does not entail that a person will include it in the decision process. In several focus groups, citizens made it clear that the information received from the programmes was only one part of the decisional process, in which other elements were just as, if not more, important: trust in doctors, family and friends, perception of health authority efficiency, personal experiences, inconsistencies in information, or public disagreements with other credible sources. Such elements can be seen as an opportunity to strengthen partnerships with professional and advocacy groups and to cooperate more efficiently with media and specialists from different fields.
NASA Technical Reports Server (NTRS)
Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig
2018-01-01
This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.
WEB-GIS Decision Support System for CO2 storage
NASA Astrophysics Data System (ADS)
Gaitanaru, Dragos; Leonard, Anghel; Radu Gogu, Constantin; Le Guen, Yvi; Scradeanu, Daniel; Pagnejer, Mihaela
2013-04-01
The environmental decision support system (DSS) paradigm evolves and changes as more knowledge and technology become available to the environmental community. Geographic Information Systems (GIS) can be used to extract, assess and disseminate some types of information which are otherwise difficult to access by traditional methods. At the same time, with the help of the Internet and accompanying tools, creating and publishing online interactive maps has become easier and rich with options. The Decision Support System (MDSS) developed for the MUSTANG (A MUltiple Space and Time scale Approach for the quaNtification of deep saline formations for CO2 storaGe) project is a user-friendly, web-based application that uses GIS capabilities. The MDSS can be exploited by experts for CO2 injection and storage in deep saline aquifers. Its main objective is to help experts make decisions based on large volumes of structured data and information. To achieve this objective, the MDSS has a geospatial, object-oriented database structure for a wide variety of data and information. The entire application is based on several principles leading to a series of capabilities and specific characteristics: (i) open source - the entire platform is based on open-source technologies: (1) database engine, (2) application server, (3) geospatial server, (4) user interfaces, (5) add-ons, etc.; (ii) multiple database connections - the MDSS is capable of connecting to different databases located on different server machines; (iii) desktop user experience - the MDSS architecture and design follow the structure of desktop software; (iv) communication - the server side and the desktop are bound together by a series of functions that allow the user to upload, use, modify and download data within the application. The architecture of the system involves one database and a modular application composed of: (1) a visualization module, (2) an analysis module, (3) a guidelines module, and (4) a risk assessment module. The database component is built using the PostgreSQL and PostGIS open source technologies. The visualization module allows the user to view data from CO2 injection sites in different ways: (1) geospatial visualization, (2) table view, (3) 3D visualization. The analysis module allows the user to perform analyses such as injectivity, containment and capacity analysis. The risk assessment module focuses on the site risk matrix approach. The guidelines module contains methodologies and guidelines for CO2 injection and storage in deep saline aquifers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, D. B.
2015-01-30
The Adversary & Interdiction Methods (AIM) program provides training and capability assessment services to government agencies around the country. Interdisciplinary teams equipped with gear and radioactive sources are repeatedly fielded to offsite events to collaborate with law enforcement agencies at all levels of government. AIM has grown rapidly over the past three years. A knowledge management system has evolved along with the program, but it has failed to keep pace. A new system is needed. The new system must comply with cybersecurity and information technology solutions already in place at an institutional level. The offsite nature of AIM activities must also be accommodated. Cost and schedule preclude the commissioning of new software and the procurement of expensive hardware. The new system must exploit in-house capabilities and be established quickly. A novel system is proposed. This solution centers on a recently introduced institutional file sharing capability called Syncplicity. AIM-authored software will be combined with a dedicated institutional account to vastly extend the capability of this resource. The new knowledge management system will reduce error and increase efficiency through automation and be accessible offsite via mobile devices.
NASA Astrophysics Data System (ADS)
Pilone, D.; Gilman, J.; Baynes, K.; Shum, D.
2015-12-01
This talk introduces a new NASA Earth Observing System Data and Information System (EOSDIS) capability to automatically generate and maintain derived, Virtual Product information, allowing DAACs and Data Providers to create tailored and more discoverable variations of their products. After this talk the audience will be aware of the new EOSDIS Virtual Product capability, applications of it, and how to take advantage of it. Much of the data made available in the EOSDIS are organized for generation and archival rather than for discovery and use. The EOSDIS Common Metadata Repository (CMR) is launching a new capability providing automated generation and maintenance of user-oriented Virtual Product information. DAACs can easily surface variations on established data products tailored to specific use cases and users, leveraging DAAC-exposed services such as custom ordering or access services like OPeNDAP for on-demand product generation and distribution. Virtual Data Products enjoy support for spatial and temporal information, keyword discovery, association with imagery, and are fully discoverable by tools such as NASA Earthdata Search, Worldview, and Reverb. Virtual Product generation has applicability across many use cases:
- Describing derived products such as Surface Kinetic Temperature information (AST_08) from source products (ASTER L1A)
- Providing streamlined access to data products (e.g. AIRS) containing many (>800) data variables covering an enormous variety of physical measurements
- Attaching additional EOSDIS offerings such as Visual Metadata, external services, and documentation metadata
- Publishing alternate formats for a product (e.g. netCDF for HDF products) with the actual conversion happening on request
- Publishing granules to be modified by on-the-fly services, like GES-DISC's Data Quality Screening Service
- Publishing "bundled" products where granules from one product correspond to granules from one or more other related products
NASA Astrophysics Data System (ADS)
Liu, Dongyan; Pickering, Amy; Sun, Jun
2004-04-01
An experiment was designed to select economically valuable macroalga species with high nutrient uptake rates. Such species, cultured on a large scale, could be a potential solution to eutrophication. Three macroalgae species, Ulva pertusa (Chlorophyta), Gelidium amansii (Rhodophyta) and Sargassum enerve (Phaeophyta), were chosen for the experiment because of their economic value and availability. A control and four nitrogen concentrations were achieved by adding NH4+ and NO3-. The results indicate that the fresh weights of all species increased faster than the control after 5 d of culture. The fresh weight of Ulva pertusa increased fastest among the three species. However, different species showed different responses to the nitrogen source and its availability; they also showed an advantage in using NH4+ over NO3-. U. pertusa grew best and showed a higher capability of removing nitrogen at 200 µmol L-1, but it has lower economic value. G. amansii has higher economic value but lower capability of removing nitrogen at 200 µmol L-1. The nitrogen assimilation capability of S. enerve was higher than that of G. amansii at 200 µmol L-1, but its increase in fresh weight was lower than those of the other two species. The present preliminary study demonstrates that it is possible to use macroalgae as biofilters, and further development of this approach could provide biologically valuable information on the source, fate, and transport of N in marine ecosystems. Caution is needed should we extrapolate these findings to natural environments.
Magsamen-Conrad, Kate; Dillon, Jeanette M; Billotte Verhoff, China; Faulkner, Sandra L
2018-02-23
There are myriad technological devices, computer programs, and online information sources available for people to manage their health and the health of others. However, people must be technologically and health literate and capable of accessing, analyzing, and sharing the information they encounter. The authors interviewed middle-aged and older adults about their online health information seeking behavior and discovered that technology and health literacy are influenced by a collective ability to manage the health and technological needs of a family. We used information management theory to frame participants' experiences of their self-efficacy using technology to manage the health of loved ones. Findings suggest that health can be co-managed if at least one person in a family unit is technologically "savvy" and able to effectively share health information. However, individuals' confidence in their own literacy often depends on others, usually family members who tend to "do" instead of "teach."
Broadband Processing in a Noisy Shallow Ocean Environment: A Particle Filtering Approach
Candy, J. V.
2016-04-14
Here we report that when a broadband source propagates sound in a shallow ocean, the received data can become quite complicated due to temperature-related sound-speed variations and therefore a highly dispersive environment. Noise and uncertainties disrupt this already chaotic environment even further because disturbances propagate through the same inherent acoustic channel. The broadband (signal) estimation/detection problem can be decomposed into a set of narrowband solutions that are processed separately and then combined to achieve more enhancement of signal levels than that available from a single frequency, thereby allowing more information to be extracted, leading to more reliable source detection. A Bayesian solution to the broadband modal function tracking, pressure-field enhancement, and source detection problem is developed that leads to nonparametric estimates of desired posterior distributions, enabling the estimation of useful statistics and an improved processor/detector. To investigate the processor capabilities, we synthesize an ensemble of noisy, broadband, shallow-ocean measurements to evaluate its overall performance, using an information-theoretical metric for the preprocessor and the receiver operating characteristic curve for the detector.
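The nonparametric posterior estimates mentioned above are what a bootstrap particle filter produces. Below is a minimal generic sketch on a scalar random-walk state with noisy measurements, not the paper's modal-function tracker: predict by sampling the state model, weight by the measurement likelihood, estimate from the weighted particles, and resample.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 500, 100

# Simulate a slowly drifting state and noisy measurements of it.
x_true, xs, ys = 0.0, [], []
for _ in range(T):
    x_true += 0.05 * rng.standard_normal()
    xs.append(x_true)
    ys.append(x_true + 0.2 * rng.standard_normal())

particles = rng.standard_normal(N)
estimates = []
for y in ys:
    particles = particles + 0.05 * rng.standard_normal(N)   # predict step
    w = np.exp(-0.5 * ((y - particles) / 0.2) ** 2)          # likelihood weights
    w /= w.sum()
    estimates.append(np.sum(w * particles))                  # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]        # resample

print(np.mean((np.array(estimates) - np.array(xs)) ** 2))    # tracking error
```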
Alvarsson, Jonathan; Andersson, Claes; Spjuth, Ola; Larsson, Rolf; Wikberg, Jarl E S
2011-05-20
Compound profiling and drug screening generate large amounts of data and are generally based on microplate assays. Current information systems used for handling this are mainly commercial, closed source, expensive, and heavyweight, and there is a need for a flexible, lightweight, open system for handling plate design, data validation, and data preparation. A Bioclipse plugin consisting of a client part and a relational database was constructed. A multiple-step plate layout point-and-click interface was implemented inside Bioclipse. The system contains a data validation step, where outliers can be removed, and finally a plate report with all relevant calculated data, including dose-response curves. Brunn is capable of handling the data from microplate assays. It can create dose-response curves and calculate IC50 values. Using a system of this sort facilitates work in the laboratory. Being able to reuse already constructed plates and plate layouts by starting out from an earlier step in the plate layout design process saves time and cuts down on error sources.
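Dose-response fitting of the kind Brunn performs is conventionally done with a four-parameter logistic model; the abstract does not specify Brunn's exact method, so the following is a generic SciPy sketch with made-up assay numbers.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1 + (conc / ic50) ** hill)

conc = np.array([0.01, 0.1, 1, 10, 100, 1000])       # concentrations, µM
resp = np.array([98, 95, 80, 45, 12, 5], float)      # % viability (synthetic)

p0 = [resp.min(), resp.max(), 10.0, 1.0]             # initial parameter guess
params, _ = curve_fit(four_pl, conc, resp, p0=p0)
print(f"IC50 ≈ {params[2]:.2f} µM")
```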
Susceptibility of ground water to surface and shallow sources of contamination in Mississippi
O'Hara, Charles G.
1996-01-01
Ground water, because of its extensive use in agriculture, industry, and public-water supply, is one of Mississippi's most important natural resources. Ground water is the source for about 80 percent of the total freshwater used by the State's population (Solley and others, 1993). About 2,600 Mgal/d of freshwater is withdrawn from aquifers in Mississippi (D.E. Burt, Jr., U.S. Geological Survey, oral commun., 1995). Wells capable of yielding 200 gal/min of water with quality suitable for most uses can be developed nearly anywhere in the State (Bednar, 1988). The U.S. Geological Survey (USGS), in cooperation with the Mississippi Department of Environmental Quality, Office of Pollution Control, and the Mississippi Department of Agriculture and Commerce, Bureau of Plant Industry, conducted an investigation to evaluate the susceptibility of ground water to contamination from surface and shallow sources in Mississippi. A geographic information system (GIS) was used to develop and analyze statewide spatial data layers that contain geologic, hydrologic, physiographic, and cultural information.
Validation of Satellite-based Rainfall Estimates for Severe Storms (Hurricanes & Tornados)
NASA Astrophysics Data System (ADS)
Nourozi, N.; Mahani, S.; Khanbilvardi, R.
2005-12-01
Severe storms such as hurricanes and tornadoes cause devastating damage, almost every year, over large sections of the United States. More accurate forecasting of the intensity and track of a heavy storm can help to reduce, if not prevent, its damage to lives, infrastructure, and the economy. Estimating accurate, high-resolution quantitative precipitation estimates (QPE) for a hurricane, required to improve forecasting and warning capabilities, is still a challenging problem because of the physical characteristics of a hurricane, even while it is still over the ocean. Satellite imagery is a valuable source of information for estimating and forecasting heavy precipitation and flash floods, particularly over the oceans, where traditional ground-based gauge and radar sources cannot provide any information. To improve the capability of a rainfall retrieval algorithm for estimating QPE of severe storms, its product is evaluated in this study. High-resolution (hourly, 4 km x 4 km) satellite infrared-based rainfall products from the NESDIS Hydro-Estimator (HE) and PERSIANN (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks) algorithms have been tested against NEXRAD stage-IV and rain gauge observations in this project. Three strong hurricanes, Charley (category 4), Jeanne (category 3), and Ivan (category 3), which caused devastating damage over Florida in summer 2004, were investigated. Preliminary results demonstrate that HE tends to underestimate rain rates when NEXRAD shows a heavy storm (rain rates greater than 25 mm/hr) and to overestimate when NEXRAD gives low rainfall amounts, whereas PERSIANN tends to underestimate rain rates in general.
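Validation of this kind typically reduces to comparing co-located satellite and NEXRAD rain rates with simple skill measures. The sketch below uses synthetic numbers purely to illustrate the computation of bias and RMSE, including a separate bias for heavy-rain cells (taken here as NEXRAD > 25 mm/hr, the threshold mentioned above).

```python
import numpy as np

sat = np.array([12.0, 30.0, 2.0, 18.0, 0.5])    # mm/hr, satellite estimates (synthetic)
nex = np.array([15.0, 42.0, 1.0, 25.0, 0.2])    # mm/hr, NEXRAD stage-IV (synthetic)

bias = np.mean(sat - nex)                       # mean error
rmse = np.sqrt(np.mean((sat - nex) ** 2))       # root-mean-square error
heavy = nex > 25.0                              # heavy-rain cells
print(f"bias={bias:.2f} mm/hr, rmse={rmse:.2f} mm/hr, "
      f"heavy-rain bias={np.mean(sat[heavy] - nex[heavy]):.2f} mm/hr")
```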
GRIP Collaboration Portal: Information Management for a Hurricane Field Campaign
NASA Astrophysics Data System (ADS)
Conover, H.; Kulkarni, A.; Garrett, M.; Smith, T.; Goodman, H. M.
2010-12-01
NASA’s Genesis and Rapid Intensification Processes (GRIP) experiment, carried out in August and September of 2010, was a complex operation, involving three aircraft and their crews based at different airports, a dozen instrument teams, mission scientists, weather forecasters, project coordinators and a variety of other participants. In addition, GRIP was coordinated with concurrent airborne missions: NOAA’s IFEX and the NSF-funded PREDICT. The GRIP Collaboration Portal was developed to facilitate communication within and between the different teams and serve as an information repository for the field campaign, providing a single access point for project documents, plans, weather forecasts, flight reports and quicklook data. The portal was developed using the Drupal open source content management framework. This presentation will cover both technology and participation issues. Specific examples include: Drupal’s large and diverse open source developer community is an advantage in that we were able to reuse many modules rather than develop capabilities from scratch, but integrating multiple modules developed by many people adds to the overall complexity of the site. Many of the communication capabilities provided by the site, such as discussion forums and blogs, were not used. Participants were diligent about posting necessary documents, but the favored communication method remained email. Drupal’s developer-friendly nature allowed for quick development of the customized functionality needed to accommodate the rapidly changing requirements of the GRIP experiment. [Figure: DC-8 overflight of Hurricane Earl during the GRIP mission.]
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
1999-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter the focus is on some experimental data on low-voltage-dropout regulators to support mixed 5 and 3.3 volt systems. A discussion of the Small Explorer WIRE spacecraft will also be given. Lastly, we take a first look at robust state machines in hardware description languages (VHDL) and their use in critical systems. If you have information that you would like to submit or an area you would like discussed or researched, please give me a call or e-mail.
Open source OCR framework using mobile devices
NASA Astrophysics Data System (ADS)
Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan
2008-02-01
Mobile phones have evolved from passive one-to-one communication devices to powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors, and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements, we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
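On the desktop, the pipeline the authors port to mobile can be sketched in a few lines. The pytesseract and pyttsx3 packages below are assumed stand-ins for direct use of the Tesseract engine and the Flite synthesizer, and the input image name is hypothetical.

```python
from PIL import Image
import pytesseract   # Python wrapper around the Tesseract OCR engine
import pyttsx3       # generic text-to-speech; Flite plays this role on-device

# Recognize text from a captured image (placeholder filename).
text = pytesseract.image_to_string(Image.open("business_card.png"))
print(text)

# Speak the recognized text aloud.
engine = pyttsx3.init()
engine.say(text)
engine.runAndWait()
```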
Radioactive source security: the cultural challenges.
Englefield, Chris
2015-04-01
Radioactive source security is an essential part of radiation protection. Sources can be abandoned, lost or stolen. If they are stolen, they could be used to cause deliberate harm, and the risks are varied and significant. There is a need for a global security protection system and enhanced capability to achieve this. The establishment of radioactive source security requires 'cultural exchanges'. These exchanges include collaboration between: radiation protection specialists and security specialists; the nuclear industry and users of radioactive sources; training providers and regulators/users. This collaboration will facilitate knowledge and experience exchange for the various stakeholder groups, beyond those already provided. This will promote best practice in both physical and information security and heighten security awareness generally. Only if all groups involved are prepared to open their minds to listen to, and learn from, each other will a suitable global level of control be achieved.
NASA Astrophysics Data System (ADS)
Porter, Christina L.; Tanksalvala, Michael; Gerrity, Michael; Miley, Galen P.; Esashi, Yuka; Horiguchi, Naoto; Zhang, Xiaoshi; Bevis, Charles S.; Karl, Robert; Johnsen, Peter; Adams, Daniel E.; Kapteyn, Henry C.; Murnane, Margaret M.
2018-03-01
With increasingly 3D devices becoming the norm, there is a growing need in the semiconductor industry and in materials science for high spatial resolution, non-destructive metrology techniques capable of determining depth-dependent composition information on devices. We present a solution to this problem using ptychographic coherent diffractive imaging (CDI) implemented using a commercially available, tabletop 13 nm source. We present the design, simulations, and preliminary results from our new complex EUV imaging reflectometer, which uses coherent 13 nm light produced by tabletop high harmonic generation. This tool is capable of determining spatially-resolved composition vs. depth profiles for samples by recording ptychographic images at multiple incidence angles. By harnessing phase measurements, we can locally and nondestructively determine quantities such as device and thin film layer thicknesses, surface roughness, interface quality, and dopant concentration profiles. Using this advanced imaging reflectometer, we can quantitatively characterize materials-science-relevant and industry-relevant nanostructures for a wide variety of applications, spanning from defect and overlay metrology to the development and optimization of nano-enhanced thermoelectric or spintronic devices.
Overview of the NASA Wallops Flight Facility Mobile Range Control System
NASA Technical Reports Server (NTRS)
Davis, Rodney A.; Semancik, Susan K.; Smith, Donna C.; Stancil, Robert K.
1999-01-01
The NASA GSFC's Wallops Flight Facility (WFF) Mobile Range Control System (MRCS) is based on the functionality of the WFF Range Control Center at Wallops Island, Virginia. The MRCS provides real time instantaneous impact predictions, real time flight performance data, and other critical information needed by mission and range safety personnel in support of range operations at remote launch sites. The MRCS integrates a PC telemetry processing system (TELPro), a PC radar processing system (PCDQS), multiple Silicon Graphics display workstations (IRIS), and communication links within a mobile van for worldwide support of orbital, suborbital, and aircraft missions. This paper describes the MRCS configuration; the TELPro's capability to provide single/dual telemetry tracking and vehicle state data processing; the PCDQS' capability to provide real time positional data and instantaneous impact prediction for up to 8 data sources; and the IRIS' user interface for setup/display options. With portability, PC-based data processing, high resolution graphics, and flexible multiple source support, the MRCS system is proving to be responsive to the ever-changing needs of a variety of increasingly complex missions.
Building a Predictive Capability for Decision-Making that Supports MultiPEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmichael, Joshua Daniel
Multi-phenomenological explosion monitoring (multiPEM) is a developing science that uses multiple geophysical signatures of explosions to better identify and characterize their sources. MultiPEM researchers seek to integrate explosion signatures together to provide stronger detection, parameter estimation, or screening capabilities between different sources or processes. This talk will address forming a predictive capability for screening waveform explosion signatures to support multiPEM.
NASA Astrophysics Data System (ADS)
Lubow, S.; Budavári, T.
2013-10-01
We have created an initial catalog of objects observed by the WFPC2 and ACS instruments on the Hubble Space Telescope (HST). The catalog is based on observations taken on more than 6000 visits (telescope pointings) of ACS/WFC and more than 25000 visits of WFPC2. The catalog is obtained by cross-matching, by position in the sky, all Hubble Legacy Archive (HLA) Source Extractor source lists for these instruments. The source lists describe properties of source detections within a visit. The calculations are performed on a SQL Server database system. First we collect overlapping images into groups, e.g., Eta Car, and determine nearby (approximately matching) pairs of sources from different images within each group. We then apply a novel algorithm for improving the cross-matching of pairs of sources by adjusting the astrometry of the images. Next, we combine pairwise matches into maximal sets of possible multi-source matches. We apply a greedy Bayesian method to split the maximal matches into more reliable matches. We test the accuracy of the matches by comparing the fluxes of the matched sources. The result is a set of information that ties together multiple observations of the same object. A byproduct of the catalog is greatly improved relative astrometry for many of the HST images. We also provide information on nondetections that can be used to determine dropouts. With the catalog, for the first time, one can carry out time domain, multi-wavelength studies across a large set of HST data. The catalog is publicly available. Much more can be done to expand the catalog capabilities.
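The pairwise positional matching step described above can be illustrated with a short sketch. This is not the authors' SQL Server implementation; it uses astropy's catalog-matching utilities, and the coordinate arrays and the 0.3-arcsecond tolerance are invented for illustration.

```python
# Illustrative positional cross-match between two source lists, assuming
# RA/Dec arrays in degrees; the 0.3" match radius is a made-up tolerance.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

ra1, dec1 = np.array([161.2650, 161.2702]), np.array([-59.6841, -59.6855])
ra2, dec2 = np.array([161.2651, 161.2890]), np.array([-59.6840, -59.7001])

cat1 = SkyCoord(ra=ra1 * u.deg, dec=dec1 * u.deg)
cat2 = SkyCoord(ra=ra2 * u.deg, dec=dec2 * u.deg)

# For each source in list 1, find its nearest neighbor in list 2.
idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)
close_enough = sep2d < 0.3 * u.arcsec  # keep only tight pairs
for i, (j, ok) in enumerate(zip(idx, close_enough)):
    if ok:
        print(f"source {i} in list 1 matches source {j} in list 2 "
              f"({sep2d[i].to(u.arcsec):.3f} apart)")
```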
Intensity-invariant coding in the auditory system.
Barbour, Dennis L
2011-11-01
The auditory system faithfully represents sufficient details from sound sources such that downstream cognitive processes are capable of acting upon this information effectively even in the face of signal uncertainty, degradation or interference. This robust sound source representation leads to an invariance in perception vital for animals to interact effectively with their environment. Due to unique nonlinearities in the cochlea, sound representations early in the auditory system exhibit a large amount of variability as a function of stimulus intensity. In other words, changes in stimulus intensity, such as for sound sources at differing distances, create a unique challenge for the auditory system to encode sounds invariantly across the intensity dimension. This challenge and some strategies available to sensory systems to eliminate intensity as an encoding variable are discussed, with a special emphasis upon sound encoding.
NASA Astrophysics Data System (ADS)
Wong, M. M.; Brennan, J.; Bagwell, R.; Behnke, J.
2015-12-01
This poster will introduce and explore the various social media efforts, the monthly webinar series, and the redesigned website (https://earthdata.nasa.gov) established by the National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It comprises twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access clients (Reverb and Earthdata Search), a dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative, and a host of other discipline-specific data discovery, data access, data subsetting and visualization tools. We have embarked on these efforts to reach out to new audiences and potential new users and to engage our diverse end user communities world-wide. One of the key objectives is to increase awareness of the breadth of Earth science data information, services, and tools that are publicly available, while also highlighting how these data and technologies enable scientific research.
Capability of long distance 100 GHz FMCW using a single GDD lamp sensor.
Levanon, Assaf; Rozban, Daniel; Aharon Akram, Avihai; Kopeika, Natan S; Yitzhaky, Yitzhak; Abramovich, Amir
2014-12-20
Millimeter wave (MMW)-based imaging systems are required for applications in medicine, homeland security, concealed weapon detection, and space technology. The lack of inexpensive room temperature imaging sensors makes it difficult to provide a suitable MMW system for many of the above applications. A 3D MMW imaging system based on chirp radar was studied previously using a scanning imaging system with a single detector. The radar system requires that the millimeter wave detector be able to operate as a heterodyne detector. Since the source of radiation is a frequency modulated continuous wave (FMCW), the detected signal resulting from heterodyne detection gives the object's depth information according to the value of the difference frequency, in addition to the reflectance of the 2D image. New experiments show the capability of long distance FMCW detection using a large scale Cassegrain projection system, described for the first time (to our knowledge) in this paper. The system demonstrates operation over a distance of at least 20 m with a low-cost plasma-based glow discharge detector (GDD) focal plane array (FPA). Each point on the object corresponds to a point in the image and includes the distance information. This will enable relatively inexpensive 3D MMW imaging.
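The link between the difference (beat) frequency and target depth is the standard FMCW ranging relation; a back-of-the-envelope sketch follows. The chirp bandwidth, sweep time, and beat frequency below are invented numbers, not parameters of the system in the paper.

```python
# Back-of-the-envelope FMCW ranging: for a linear chirp of bandwidth B swept
# over time T, a target at range R produces a beat (difference) frequency
# f_b = 2*R*B / (c*T), so R = c*f_b*T / (2*B). All numbers are illustrative.
C = 3.0e8  # speed of light, m/s

def range_from_beat(f_beat_hz: float, bandwidth_hz: float, sweep_s: float) -> float:
    """Target range implied by a measured beat frequency."""
    return C * f_beat_hz * sweep_s / (2.0 * bandwidth_hz)

# Example: a 1 GHz chirp swept in 1 ms; a 133 kHz beat maps to ~20 m.
print(range_from_beat(133e3, 1e9, 1e-3))  # -> 19.95 m
```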
Introducing a New Capability at SSRL: Resonant Soft X-ray Scattering
NASA Astrophysics Data System (ADS)
Lee, Jun-Sik; Jang, Hoyoung; Lu, Donghui; Kao, Chi-Chang
Stanford Synchrotron Radiation Lightsource (SSRL) at SLAC recently developed a setup for resonant soft x-ray scattering (RSXS). In general, the RSXS technique uniquely probes not only structural information but also chemically specific information, because it can explore the spatial periodicities of charge, orbital, spin, and lattice with a spectroscopic aspect. Moreover, the soft x-ray range is particularly relevant for the study of soft materials, as it covers the K-edges of C, N, F, and O, as well as the L-edges of transition metals and M-edges of rare-earth elements. Hence, the RSXS capability has been regarded as a very powerful technique for investigating the intrinsic properties of materials such as quantum and energy materials. The RSXS capability at SSRL consists of an in-vacuum four-circle diffractometer with fully motorized sample-motion manipulation, and the sample can be cooled down to 25 K with liquid helium. This capability has been installed at BL 13-3, where the photon source is an elliptically polarized undulator (EPU) covering photon energies from 230 eV to 1400 eV. Furthermore, the EPU system offers additional degrees of freedom for controlling x-ray polarization (linear and circular). Using this control of x-ray polarization, we can also investigate morphology effects of local domains/grains in materials. A detailed introduction of the RSXS end-station and several results will be presented in this poster.
Medical Data Architecture (MDA) Project Status
NASA Technical Reports Server (NTRS)
Krihak, M.; Middour, C.; Gurram, M.; Wolfe, S.; Marker, N.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.
2018-01-01
The Medical Data Architecture (MDA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the ExMC MDA project addresses the technical limitations identified in ExMC Gap Med 07: We do not have the capability to comprehensively process medically-relevant information to support medical operations during exploration missions. This gap identifies that the current in-flight medical data management includes a combination of data collection and distribution methods that are minimally integrated with on-board medical devices and systems. Furthermore, there are a variety of data sources and methods of data collection. For an exploration mission, the seamless management of such data will enable a more medically autonomous crew than the current paradigm. The medical system requirements are being developed in parallel with the exploration mission architecture and vehicle design. ExMC has recognized that in order to make informed decisions about a medical data architecture framework, current methods for medical data management must not only be understood, but an architecture must also be identified that provides the crew with actionable insight to medical conditions. This medical data architecture will provide the necessary functionality to address the challenges of executing a self-contained medical system that approaches crew health care delivery without assistance from ground support. Hence, the products supported by current prototype development will directly inform exploration medical system requirements.
NASA Astrophysics Data System (ADS)
Chen, Kun; Wu, Tao; Li, Yan; Wei, Haoyun
2017-12-01
Coherent anti-Stokes Raman scattering (CARS) is a powerful nonlinear spectroscopy technique that is rapidly gaining recognition for identifying different molecules. Unfortunately, molecular concentration information is generally not immediately accessible from the raw CARS signal due to the nonresonant background. In addition, mainstream biomedical applications of CARS are currently hampered by complex and bulky excitation setups. Here, we establish a dual-soliton Stokes based CARS spectroscopy scheme capable of quantifying the molecular composition of a sample, using a single fiber laser. This dual-soliton CARS scheme takes advantage of a differential configuration to achieve efficient suppression of the nonresonant background and therefore allows extraction of quantitative composition information. Besides, our all-fiber based excitation source can probe most of the fingerprint region (1100-1800 cm-1) with a spectral resolution of 15 cm-1 under the spectral focusing mechanism; considerably more information is contained throughout an entire spectrum than at just a single frequency within that spectrum. Systematic studies of the scope of application and several fundamental aspects are discussed. Quantitative capability is further experimentally demonstrated through the determination of oleic acid concentration based on the linear dependence of the signal on different Raman vibration bands.
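The quantitative step rests on the stated linear dependence of the background-suppressed signal on concentration. A minimal calibration sketch, with made-up calibration points rather than data from the paper, might look like this:

```python
# Minimal linear-calibration sketch: fit signal vs. known concentration on a
# calibration series, then invert the fit for an unknown sample. All numbers
# are invented for illustration; they are not data from the paper.
import numpy as np

conc = np.array([0.0, 0.25, 0.50, 0.75, 1.00])     # known concentrations
signal = np.array([0.02, 0.26, 0.49, 0.77, 0.98])  # measured band intensities

slope, intercept = np.polyfit(conc, signal, deg=1)  # least-squares line

def concentration(sig: float) -> float:
    """Invert the calibration line for an unknown signal level."""
    return (sig - intercept) / slope

print(f"estimated concentration: {concentration(0.60):.3f}")
```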
Toxico-Cheminformatics: New and Expanding Public ...
High-throughput screening (HTS) technologies, along with efforts to improve public access to chemical toxicity information resources and to systematize older toxicity studies, have the potential to significantly improve information gathering efforts for chemical assessments and predictive capabilities in toxicology. Important developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. By annotating toxicity data with associated chemical structure information, these efforts link data across diverse study domains (e.g., 'omics', HTS, traditional toxicity studies), toxicity domains (carcinogenicity, developmental toxicity, neurotoxicity, immunotoxicity, etc.) and database sources (EPA, FDA, NCI, DSSTox, PubChem, GEO, ArrayExpress, etc.). Public initiatives are developing systematized data models of toxicity study areas and introducing standardized templates, controlled vocabularies, hierarchical organization, and powerful relational searching capability across capt
NASA Astrophysics Data System (ADS)
Anchordoqui, Luis A.; Barger, Vernon; Weiler, Thomas J.
2018-03-01
We argue that if ultrahigh-energy (E ≳ 10^10 GeV) cosmic rays are heavy nuclei (as indicated by existing data), then the pointing of cosmic rays to their nearest extragalactic sources is expected for 10^10.6 ≲ E/GeV ≲ 10^11. This is because for a nucleus of charge Ze and baryon number A, the bending of the cosmic ray decreases as Z/E with rising energy, so that pointing to nearby sources becomes possible in this particular energy range. In addition, the maximum acceleration energy of the sources grows linearly in Z, while the energy loss per distance traveled decreases with increasing A. Each of these two points tends to favor heavy nuclei at the highest energies. The traditional bi-dimensional analyses, which simultaneously reproduce Auger data on the spectrum and nuclear composition, may not be capable of incorporating the relative importance of all these phenomena. In this paper we propose a multi-dimensional reconstruction of the individual emission spectra (in E, direction, and cross-correlation with nearby putative sources) to study the hypothesis that primaries are heavy nuclei subject to GZK photo-disintegration, and to determine the nature of the extragalactic sources. More specifically, we propose to combine information on nuclear composition and arrival direction to associate a potential clustering of events with a 3-dimensional position in the sky. Actually, both the source distance and maximum emission energy can be obtained through a multi-parameter likelihood analysis to accommodate the observed nuclear composition of each individual event in the cluster. We show that one can track the level of GZK interactions on a statistical basis by comparing the maximum energy at the source of each cluster. We also show that nucleus-emitting sources exhibit a cepa stratis (onion-layer) structure on Earth which could be peeled off by future space missions, such as POEMMA. Finally, we demonstrate that metal-rich starburst galaxies are highly plausible candidate sources, and we use them as an explicit example of our proposed multi-dimensional analysis.
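The Z/E bending argument follows from the standard magnetic-rigidity scaling; a short sketch of the relation (textbook physics, not reproduced from the paper) is:

```latex
% Deflection of an ultrarelativistic nucleus (charge $Ze$, energy $E \simeq pc$)
% traversing a distance $D$ in a magnetic field $B$: the Larmor radius is
\[
  r_L = \frac{E}{Z e B c},
\]
% so the accumulated deflection angle scales as
\[
  \theta \approx \frac{D}{r_L} = \frac{Z e B c D}{E} \;\propto\; \frac{Z}{E},
\]
% which is why pointing to nearby sources improves with rising $E$ at fixed $Z$.
```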
Code of Federal Regulations, 2013 CFR
2013-01-01
... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...
Code of Federal Regulations, 2014 CFR
2014-01-01
... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...
Code of Federal Regulations, 2012 CFR
2012-01-01
... primary energy source. In assessing whether the unit is technically capable of using a mixture of petroleum or natural gas and coal or another alternate fuel as a primary energy source, for purposes of this... technically capable of using the mixture as a primary energy source under § 504.6(c), this certification...
Kondylakis, Haridimos; Spanakis, Emmanouil G; Sfakianakis, Stelios; Sakkalis, Vangelis; Tsiknakis, Manolis; Marias, Kostas; Xia Zhao; Hong Qing Yu; Feng Dong
2015-08-01
The advancements in healthcare practice have brought to the fore the need for flexible access to health-related information and created an ever-growing demand for the design and development of data management infrastructures for translational and personalized medicine. In this paper, we present the data management solution implemented for the MyHealthAvatar EU research project, a project that attempts to create a digital representation of a patient's health status. The platform is capable of aggregating several knowledge sources relevant for the provision of individualized personal services. To this end, state-of-the-art technologies are exploited, such as ontologies to model all available information, semantic integration to enable data and query translation, and a variety of linking services to allow connecting to external sources. All original information is stored in a NoSQL database for reasons of efficiency and fault tolerance. It is then semantically uplifted through a semantic warehouse, which enables efficient access to it. All these technologies are combined to create a novel web-based platform allowing seamless user interaction through APIs that support personalized, granular and secure access to the relevant information.
Relationships Between eHealth Literacy and Health Behaviors in Korean Adults.
Kim, Sun-Hee; Son, Youn-Jung
2017-02-01
The Internet is a useful and accessible source of health-related information for modern healthcare consumers. Individuals with adequate eHealth literacy have an incentive to use the Internet to access health-related information, and they consider themselves capable of using Web-based knowledge for health. This cross-sectional study aimed to describe the relationship between eHealth literacy and health behaviors. A total of 230 adults aged 18 to 39 years and residing in South Korea participated in the study. The mean (SD) score for eHealth literacy was 25.52 (4.35) out of a total possible score of 40. The main source of health information was the Internet. Using hierarchical linear regression, the results showed that eHealth literacy was the strongest predictor of health behaviors after adjusting for general characteristics. These findings indicate that eHealth literacy can be an important factor in promoting individual health behaviors. Further research on eHealth literacy and actual health behaviors, including intention and self-reported health behaviors, is required to explain the impact of eHealth literacy on overall health status.
Three-dimensional laser microvision.
Shimotahira, H; Iizuka, K; Chu, S C; Wah, C; Costen, F; Yoshikuni, Y
2001-04-10
A three-dimensional (3-D) optical imaging system offering high resolution in all three dimensions, requiring minimum manipulation and capable of real-time operation, is presented. The system derives its capabilities from use of the superstructure grating laser source in the implementation of a laser step frequency radar for depth information acquisition. A synthetic aperture radar technique was also used to further enhance its lateral resolution as well as extend the depth of focus. High-speed operation was made possible by a dual computer system consisting of a host and a remote microcomputer supported by a dual-channel Small Computer System Interface parallel data transfer system. The system is capable of operating near real time. The 3-D display of a tunneling diode, a microwave integrated circuit, and a see-through image taken by the system operating near real time are included. The depth resolution is 40 μm; lateral resolution with a synthetic aperture approach is a fraction of a micrometer and that without it is approximately 10 μm.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric, where each layer can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with the different uncertainty components represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The improved variance-based sensitivity analysis can thus investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that represent key model system processes (e.g., the groundwater recharge process, the reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method estimates accurate importance measurements for any uncertainty sources formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers formulating policies and strategies.
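The conventional single-parameter, variance-based analysis that this work extends can be sketched with the SALib package. This is the baseline method only, not the authors' hierarchical Bayesian-network framework; the toy model and parameter ranges are invented.

```python
# Sketch of conventional variance-based (Sobol) sensitivity analysis, the
# single-parameter baseline the abstract extends. Uses SALib; the stand-in
# "groundwater model" and parameter bounds are invented for illustration.
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["recharge", "conductivity", "porosity"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)            # Saltelli sampling design
Y = X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 2]   # stand-in model output

Si = sobol.analyze(problem, Y)                # first-order and total indices
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: S1={s1:.2f}, ST={st:.2f}")
```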
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, R.G.
This report, based solely on information available from unclassified sources, provides a coherent picture of the scope and trends of ballistic missile proliferation. The focus is on countries developing, producing, or owning ballistic missiles capable of threatening the military forces, assets, or populations of neighboring or geographically remote countries. The report also identifies other countries expected to obtain operational ballistic missile capabilities, discusses expected growth in performance, and examines the projected availability of warheads of mass destruction. The emphasis is on ballistic missiles of ranges greater than approximately 300 km, though shorter range battlefield weapons are discussed as forerunners. The assessment excludes principal U.S. allies and countries formerly in the Warsaw Pact, except where these countries have sold missiles, technology, or personnel services to developing nations in support of their missile programs.
Electronic business model for small- and medium-sized manufacturing enterprises (SME): a case study
NASA Astrophysics Data System (ADS)
Yuen, Karina; Chung, Walter W.
2001-10-01
This paper identifies three essential factors (information infrastructure, executive information system and a new manufacturing paradigm) that are used to support the development of a new business model for competitiveness. They facilitate changes in organization structure in support of business transformation. An SME can source a good manufacturing practice using a model of academic-university collaboration to gain competitive advantage in the e-business world. The collaboration enables the change agents to use information systems development as a vehicle to increase the capability of executives in using information and knowledge management to gain higher responsiveness and customer satisfaction. The case company is used to illustrate the application of a web-based executive information system to interface internal communications with external operations. It explains where a good manufacturing practice may be re-applied by other SMEs to acquire skills as a learning organization grows in an extended enterprise setting.
Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.
Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat
2007-12-01
In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
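The on-demand XSL shaping of an XML response can be illustrated with a short sketch. This is our own toy example in the spirit of the engine described above, not Pangea-LE's actual code; the XML payload and stylesheet are invented.

```python
# Illustrative on-demand XSL shaping of an XML response, using lxml; the
# payload and stylesheet below are invented examples, not Pangea-LE code.
from lxml import etree

response = etree.XML(
    "<patient><name>Doe, J.</name><lab test='glucose'>5.4</lab></patient>"
)
stylesheet = etree.XML("""\
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/patient">
    <summary>
      <xsl:value-of select="name"/>: glucose = <xsl:value-of select="lab"/>
    </summary>
  </xsl:template>
</xsl:stylesheet>""")

transform = etree.XSLT(stylesheet)  # compile the client-specific view
print(str(transform(response)))     # prints the <summary> view of the record
```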
Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.
Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat
2006-01-01
Pangea-LE is a message-oriented light-weight integration engine allowing concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and serves it to the requesting client applications in a flexible XML format. This XML response message can be formatted on demand by the appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case in which Pangea-LE collects and generates, on the fly, a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real environment has been a notable success due to the non-invasive method, which fully respects the existing information systems.
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
NASA Technical Reports Server (NTRS)
Bush, M. W.
1984-01-01
Attention is given to the development history of the Central Weather Processor (CWP) program of the Federal Aviation Administration. The CWP will interface with high speed digital communications links, accept data and information products from new sources, generate data processing products, and provide meteorologists with the capability to automate data retrieval and dissemination. The CWP's users are operational (air traffic controllers, meteorologists and pilots), institutional (logistics, maintenance, testing and evaluation personnel), and administrative.
Barnes, Ronald A; Maswadi, Saher; Glickman, Randolph; Shadaram, Mehdi
2014-01-20
The goal of this paper is to demonstrate the unique capability of measuring the vector or angular information of propagating acoustic waves using an optical sensor. Acoustic waves were generated using photoacoustic interaction and detected by the probe beam deflection technique. Experiments and simulations were performed to study the interaction of acoustic emissions with an optical sensor in a coupling medium. The simulated results predict the probe beam and wavefront interaction and produced simulated signals that are verified by experiment.
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a suite of related industry standards, assessing both the capabilities of individual products and their interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
Demonstration of a quantum controlled-NOT gate in the telecommunications band.
Chen, Jun; Altepeter, Joseph B; Medic, Milja; Lee, Kim Fook; Gokden, Burc; Hadfield, Robert H; Nam, Sae Woo; Kumar, Prem
2008-04-04
We present the first quantum controlled-NOT (CNOT) gate realized using a fiber-based indistinguishable photon-pair source in the 1.55 μm telecommunications band. Using this free-space CNOT gate, all four Bell states are produced and fully characterized by performing quantum-state tomography, demonstrating the gate's unambiguous entangling capability and high fidelity. Telecom-band operation makes this CNOT gate particularly suitable for quantum-information-processing tasks that are at the interface of quantum communication and linear optical quantum computing.
A global, open-source database of flood protection standards
NASA Astrophysics Data System (ADS)
Scussolini, Paolo; Aerts, Jeroen; Jongman, Brenden; Bouwer, Laurens; Winsemius, Hessel; de Moel, Hans; Ward, Philip
2016-04-01
Accurate flood risk estimation is pivotal in that it enables risk-informed policies in disaster risk reduction, as emphasized in the recent Sendai Framework for Disaster Risk Reduction. To improve our understanding of flood risk, models are now capable of providing actionable risk information on the (sub)global scale. Still, the accuracy of their results is greatly limited by the lack of information on the flood protection standards actually in place, and researchers are thus forced to make large assumptions about the extent of protection. With our work we propose the first global, open-source database of FLOod PROtection Standards, FLOPROS, covering a range of spatial scales. FLOPROS is structured in three layers of information, merged into one consistent database: 1) the Design layer contains empirical information about the standard of protection presently in place; 2) the Policy layer contains intended protection standards from normative documents; 3) the Model layer uses a validated numerical approach to calculate protection standards for areas not covered by the other layers. The FLOPROS database can be used for more accurate risk assessment exercises across scales. As the database should be continually updated to reflect new interventions, we invite researchers and practitioners to contribute information. Further, we look for partners within the risk community to participate in additional strategies to improve the amount and accuracy of information contained in this first version of FLOPROS.
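A minimal sketch of the three-layer merge as we read it (empirical Design information preferred, then Policy, then the modeled estimate) follows. The region names, return-period values, and precedence rule are our own illustrative assumptions, not the FLOPROS code.

```python
# Minimal sketch of a three-layer merge: for each region, prefer an empirical
# Design-layer standard, fall back to the Policy layer, then to the modeled
# estimate. Regions and return periods (years) are invented for illustration.
design = {"Rotterdam": 10000, "New Orleans": 100}
policy = {"New Orleans": 500, "Jakarta": 50}
model = {"Rotterdam": 4000, "Jakarta": 20, "Maputo": 5}

def protection_standard(region: str) -> int | None:
    for layer in (design, policy, model):  # precedence: Design > Policy > Model
        if region in layer:
            return layer[region]
    return None                            # region not covered by any layer

for r in ("Rotterdam", "New Orleans", "Jakarta", "Maputo"):
    print(r, protection_standard(r))
```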
Framework for Informed Policy Making Using Data from National Environmental Observatories
NASA Astrophysics Data System (ADS)
Wee, B.; Taylor, J. R.; Poinsatte, J.
2012-12-01
Large-scale environmental changes pose challenges that straddle environmental, economic, and social boundaries. As we design and implement climate adaptation strategies at the Federal, state, local, and tribal levels, accessible and usable data are essential for implementing actions that are informed by the best available information. Data-intensive science has been heralded as an enabler for scientific breakthroughs powered by advanced computing capabilities and interoperable data systems. Those same capabilities can be applied to data and information systems that facilitate the transformation of data into highly processed products. At the interface of scientifically informed public policy and data-intensive science lies the potential for producers of credible, integrated, multi-scalar environmental data like the National Ecological Observatory Network (NEON) and its partners to capitalize on data and informatics interoperability initiatives that enable the integration of environmental data from across credible data sources. NSF's large-scale environmental observatories such as NEON and the Ocean Observatories Initiative (OOI) are designed to provide high-quality, long-term environmental data for research. These data are also meant to be repurposed for operational needs like risk management, vulnerability assessments, resource management, and others. The proposed USDA Agricultural Research Service (ARS) Long Term Agro-ecosystem Research (LTAR) network is another example of such an environmental observatory that will produce credible data for environmental/agricultural forecasting and informing policy. To facilitate data fusion across observatories, there is a growing call for observation systems to more closely coordinate and standardize how variables are measured. Together with observation standards, cyberinfrastructure standards enable the proliferation of an ecosystem of applications that utilize diverse, high-quality, credible data. Interoperability facilitates the integration of data from multiple credible sources and enables the repurposing of data for use at different geographical scales. Metadata that capture the transformation of data into value-added products ("provenance") lend reproducibility and transparency to the entire process. This way, the datasets and model code used to create any product can be examined by other parties. This talk outlines a pathway for transforming environmental data into value-added products by various stakeholders to better inform sustainable agriculture using data from environmental observatories including NEON and LTAR.
Results of Evaluation of Solar Thermal Propulsion
NASA Technical Reports Server (NTRS)
Woodcock, Gordon; Byers, Dave
2003-01-01
The solar thermal propulsion evaluation reported here relied on prior research for all information on solar thermal propulsion technology and performance. Sources included personal contacts with experts in the field in addition to published reports and papers. Mission performance models were created based on this information in order to estimate performance and mass characteristics of solar thermal propulsion systems. Mission analysis was performed for a set of reference missions to assess the capabilities and benefits of solar thermal propulsion in comparison with alternative in-space propulsion systems such as chemical and electric propulsion. Mission analysis included estimation of delta V requirements as well as payload capabilities for a range of missions. Launch requirements and costs, and integration into launch vehicles, were also considered. The mission set included representative robotic scientific missions, and potential future NASA human missions beyond low Earth orbit. Commercial communications satellite delivery missions were also included, because if STP technology were selected for that application, frequent use is implied and this would help amortize costs for technology advancement and systems development. A C3 Topper mission was defined, calling for a relatively small STP. The application is to augment the launch energy (C3) available from launch vehicles with their built-in upper stages. Payload masses were obtained from references where available. The communications satellite masses represent the range of payload capabilities for the Delta IV Medium and/or Atlas launch vehicle family. Results indicated that STP could improve payload capability over current systems, but that this advantage cannot be realized except in a few cases because of payload fairing volume limitations on current launch vehicles. It was also found that acquiring a more capable (existing) launch vehicle, rather than adding an STP stage, is the most economical in most cases.
IT/IS plus E: exploring the need for e-integration
NASA Astrophysics Data System (ADS)
Miele, Renato; Gunasekaran, Angappa; Yusuf, Yahaya Y.
2000-10-01
The change in IT/IS strategy is about the Internet becoming a major part of the corporate environment and driving decisions more and more. Companies of all sizes and industries can fully engage employees, customers and partners to capitalize upon the new Internet economy. They can optimize supply chains, manage strategic relationships, reduce time to market, share vital information, and increase productivity and shareholder value. Remaining competitive in today's rapidly changing global marketplace requires fast action. The problem is now how much, how soon, and what kind of Internet-based components are essential for companies to be successful, and how the adoption of E-Integration can become a critical component of a company's survival in an increasingly competitive environment. How information, knowledge and innovation processes can drive business success are fundamental notions for the information-based economy, which have been extensively researched and confirmed throughout the IT revolution. The new capabilities to use the Internet to supply large amounts of relevant information from multiple internal and external sources make it possible to move from isolated information systems toward an integrated environment in every business organization. The article addresses how E-Integration must link together data from multiple sources, providing a seamless system, fully interoperable with the pre-existing IT environment, totally scalable and upgradeable.
Sources of Cryogenic Data and Information
NASA Astrophysics Data System (ADS)
Mohling, R. A.; Hufferd, W. L.; Marquardt, E. D.
It is commonly known that cryogenic data, technology, and information are applied across many military, National Aeronautics and Space Administration (NASA), and civilian product lines. Before 1950, however, there was no centralized US source of cryogenic technology data. The Cryogenic Data Center of the National Bureau of Standards (NBS) maintained a database of cryogenic technical documents that served the national need well from the mid-1950s to the early 1980s. The database, maintained on a mainframe computer, was a highly specific bibliography of cryogenic literature and thermophysical properties that covered over 100 years of data. In 1983, however, the Cryogenic Data Center was discontinued when NBS's mission and scope were redefined. In 1998, NASA contracted with the Chemical Propulsion Information Agency (CPIA) and Technology Applications, Inc. (TAI) to reconstitute and update Cryogenic Data Center information and establish a self-sufficient entity to provide technical services for the cryogenic community. The Cryogenic Information Center (CIC) provided this service until 2004, when it was discontinued due to a lack of market interest. The CIC technical assets were distributed to NASA Marshall Space Flight Center and the National Institute of Standards and Technology. Plans are under way in 2006 for CPIA to launch an e-commerce cryogenic website offering bibliographic data with the capability to download cryogenic documents.
Distributed policy based access to networked heterogeneous ISR data sources
NASA Astrophysics Data System (ADS)
Bent, G.; Vyvyan, D.; Wood, David; Zerfos, Petros; Calo, Seraphin
2010-04-01
Within a coalition environment, ad hoc Communities of Interest (CoIs) come together, perhaps for only a short time, with different sensors, sensor platforms, data fusion elements, and networks to conduct a task (or set of tasks) with different coalition members taking different roles. In such a coalition, each organization will have its own inherent restrictions on how it will interact with the others. These are usually stated as a set of policies, including security and privacy policies. The capability that we want to enable for a coalition operation is to provide access to information from any coalition partner in conformance with the policies of all. One of the challenges in supporting such ad hoc coalition operations is that of providing efficient access to distributed sources of data, where the applications requiring the data do not have knowledge of the location of the data within the network. To address this challenge the International Technology Alliance (ITA) program has been developing the concept of a Dynamic Distributed Federated Database (DDFD), also known as a Gaian Database. This type of database provides a means for accessing data across a network of distributed heterogeneous data sources, where access to the information is controlled by a mixture of local and global policies. We describe how a network of disparate ISR elements can be expressed as a DDFD and how this approach enables sensor and other information sources to be discovered autonomously or semi-autonomously and combined or fused in conformance with formally defined local and global policies.
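The interplay of local and global policies in a federated query can be illustrated with a toy sketch. This is our own reading of the pattern, not the Gaian Database API; the sources, roles, and policy rules are all invented.

```python
# Toy illustration of policy-mediated federated access: a query is answered
# from every source whose local policy, combined with a coalition-wide global
# policy, permits release to the requester. Sources and rules are invented;
# this is not the Gaian Database API.
sources = {
    "uav_video": {"data": "track 42 @ 51.50N 0.12W",
                  "local": lambda role: role in ("analyst", "commander")},
    "acoustic": {"data": "impulse event 03:14Z",
                 "local": lambda role: True},
    "partner_db": {"data": "convoy sighting",
                   "local": lambda role: role == "commander"},
}

def global_policy(role: str, topic: str) -> bool:
    return topic != "restricted"  # coalition-wide rule, applied everywhere

def federated_query(role: str, topic: str) -> list[str]:
    return [
        s["data"] for s in sources.values()
        if global_policy(role, topic) and s["local"](role)
    ]

print(federated_query("analyst", "tracking"))    # two sources release data
print(federated_query("commander", "tracking"))  # all three release data
```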
Combining data from multiple sources using the CUAHSI Hydrologic Information System
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Ames, D. P.; Horsburgh, J. S.; Goodall, J. L.
2012-12-01
The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has developed a Hydrologic Information System (HIS) to provide better access to data by enabling the publication, cataloging, discovery, retrieval, and analysis of hydrologic data using web services. The CUAHSI HIS is an Internet-based system comprised of hydrologic databases and servers connected through web services, as well as software for data publication, discovery and access. The HIS metadata catalog lists close to 100 web services registered to provide data through this system, ranging from large federal agency data sets to experimental watersheds managed by University investigators. The system's flexibility in storing and enabling public access to similarly formatted data and metadata has created a community data resource from governmental and academic data that might otherwise remain private or be analyzed only in isolation. Comprehensive understanding of hydrology requires integration of this information from multiple sources. HydroDesktop is the client application developed as part of HIS to support data discovery and access through this system. HydroDesktop is founded on an open source GIS client and has a plug-in architecture that has enabled the integration of modeling and analysis capability with the functionality for data discovery and access. Model integration is possible through a plug-in built on the OpenMI standard, and data visualization and analysis are supported by an R plug-in. This presentation will demonstrate HydroDesktop, showing how it provides an analysis environment within which data from multiple sources can be discovered, accessed and integrated.
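The web-services access pattern underlying this system can be sketched as follows. The endpoint URL, parameter names, and JSON response are hypothetical: the actual HIS publishes SOAP-based services, so this only illustrates the request/parse pattern, not the real API.

```python
# Minimal sketch of pulling a time series from a hydrologic data web service.
# The endpoint and parameter names are hypothetical; this illustrates the
# request/parse pattern, not the actual CUAHSI HIS service interface.
import requests

BASE = "https://example.org/hiserver/values"  # hypothetical service endpoint

def get_values(site: str, variable: str, start: str, end: str) -> list[dict]:
    resp = requests.get(BASE, params={
        "site": site, "variable": variable,
        "startDate": start, "endDate": end,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()  # assume the hypothetical service returns JSON

series = get_values("USGS:10109000", "discharge", "2012-01-01", "2012-01-31")
print(len(series), "observations")
```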
A ride in the time machine: information management capabilities health departments will need.
Foldy, Seth; Grannis, Shaun; Ross, David; Smith, Torney
2014-09-01
We have proposed needed information management capabilities for future US health departments predicated on trends in health care reform and health information technology. Regardless of whether health departments provide direct clinical services (and many will), they will manage unprecedented quantities of sensitive information for the public health core functions of assurance and assessment, including population-level health surveillance and metrics. Absent improved capabilities, health departments risk vestigial status, with consequences for vulnerable populations. Developments in electronic health records, interoperability and information exchange, public information sharing, decision support, and cloud technologies can support information management if health departments have appropriate capabilities. The need for national engagement in and consensus on these capabilities and their importance to health department sustainability make them appropriate for consideration in the context of accreditation.
The Role of Applied Epidemiology Methods in the Disaster Management Cycle
Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.
2014-01-01
Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748
Application of ERTS-1 data to the protection and management of New Jersey's coastal environment
NASA Technical Reports Server (NTRS)
Yunghans, R. S. (Principal Investigator); Feinberg, E. B.; Mairs, R. L.; Wobber, F. J.; Martin, K. R.; Pettinger, L. R.; Macomber, R. T.
1973-01-01
The author has identified the following significant results. Analysis of ERTS-1 imagery and complementary aircraft overflights has led to the development of seventeen information products that are being utilized within the Department of Environmental Protection as new sources of information for coastal zone management. Problem areas of significance to the State, and in which product development has contributed to date, have been identified as: the environmental effects of offshore waste disposal, the placement of ocean outfalls, the better understanding of littoral processes for shore protection, the delineation of the coastal ecozones, and determination of the flushing characteristics of the State's estuaries. Of equal importance has been the development of a capability within the State to use and understand remote sensor-derived information.
Active Laplacian electrode for the data-acquisition system of EHG
NASA Astrophysics Data System (ADS)
Li, G.; Wang, Y.; Lin, L.; Jiang, W.; Wang, L. L.; C-Y Lu, Stephen; Besio, Walter G.
2005-01-01
EHG (electrohysterogram) is the recording of the uterine electromyogram with external electrodes located on the abdomen of a pregnant woman. Derived from the electrical activity generated at the muscle fiber level, it provides complementary information from the muscle and appears to be a very promising technique for clinical or physiologic investigation of uterine activity, compared with current monitoring, which cannot give this complementary phase information of uterine activity. In this article we show the disadvantages of conventional electrodes for an EHG data-acquisition system and put forward a new type of electrode called the active Laplacian electrode. It integrates a concentric-rings electrode with a bioelectricity preamplifier and is capable of acquiring localized information. Using this new electrode, the EHG signal source can be localized more easily.
Immersion ultrasonography: simultaneous A-scan and B-scan.
Coleman, D J; Dallow, R L; Smith, M E
1979-01-01
In eyes with opaque media, ophthalmic ultrasound provides a unique source of information that can dramatically affect the course of patient management. In addition, when an ocular abnormality can be visualized, ultrasonography provides information that supplements and complements other diagnostic testing. It provides documentation and differentiation of abnormal states, such as vitreous hemorrhage and intraocular tumor, as well as differentiation of orbital tumors from inflammatory causes of exophthalmos. Additional capabilities of ultrasound are biometric determinations for calculation of intraocular lens implant powers and drug-effectiveness studies. Maximal information is derived from ultrasonography when A-scan and B-scan techniques are employed simultaneously. Flexibility of electronics, variable-frequency transducers, and the use of several different manual scanning patterns aid in detection and interpretation of results. The immersion system of ultrasonography provides these features optimally.
A taxonomy of hospitals participating in Medicare accountable care organizations.
Bazzoli, Gloria J; Harless, David W; Chukmaitov, Askar S
2017-03-03
Medicare was an early innovator of accountable care organizations (ACOs), establishing the Medicare Shared Savings Program (MSSP) and Pioneer programs in 2012-2013. Existing research has documented that ACOs bring together an array of health providers with hospitals serving as important participants. Hospitals vary markedly in their service structure and organizational capabilities, and thus, one would expect hospital ACO participants to vary in these regards. Our research identifies hospital subgroups that share certain capabilities and competencies. Such research, in conjunction with existing ACO research, provides deeper understanding of the structure and operation of these organizations. Given that Medicare was an initiator of the ACO concept, our findings provide a baseline to track the evolution of ACO hospitals over time. Hierarchical clustering methods are used in separate analyses of MSSP and Pioneer ACO hospitals. Hospitals participating in ACOs with 2012-2013 start dates are identified through multiple sources. Study data come from the Centers for Medicare and Medicaid Services, American Hospital Association, and Health Information and Management Systems Society. Five-cluster solutions were developed separately for the MSSP and Pioneer hospital samples. Both the MSSP and Pioneer taxonomies had several clusters with high levels of health information technology capabilities. Also distinct clusters with strong physician linkages were present. We examined Pioneer ACO hospitals that subsequently left the program and found that they commonly had low levels of ambulatory care services or health information technology. Distinct subgroups of hospitals exist in both the MSSP and Pioneer programs, suggesting that individual hospitals serve different roles within an ACO. Health information technology and physician linkages appear to be particularly important features in ACO hospitals. ACOs need to consider not only geographic and service mix when selecting hospital participants but also their vertical integration features and management competencies.
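The clustering step behind such a taxonomy can be sketched generically. The sketch below uses scipy's agglomerative clustering and cuts the tree at five clusters, mirroring the five-cluster solutions above; the feature matrix is random stand-in data, not the study's AHA/HIMSS variables.

```python
# Generic sketch of the hierarchical-clustering step behind a hospital
# taxonomy: cluster hospitals on capability features and cut the tree at
# five clusters. The feature matrix is random stand-in data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.random((60, 4))  # 60 hospitals x 4 standardized capability measures

Z = linkage(X, method="ward")                    # agglomerative tree
labels = fcluster(Z, t=5, criterion="maxclust")  # five-cluster solution
for k in range(1, 6):
    print(f"cluster {k}: {np.sum(labels == k)} hospitals")
```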
Kintrup, J; Wünsch, G
2001-11-01
The capability of sewer slime to accumulate heavy metals from municipal wastewater can be exploited to identify the sources of sewage sludge pollution. Previous investigations of sewer slime looked for only a few elements and therefore could not account for deviations in the enrichment efficiency of the slime or for irregularities arising from sampling. Results of ICP-MS multi-element determinations were analyzed by multivariate statistical methods. A new dimensionless characteristic, the "sewer slime impact", is proposed, which is zero for unloaded samples. Patterns expressed in this data format specifically extract the information required to identify the type of pollution and the polluter more quickly and with less effort and cost than hitherto.
Environmental benefits of chemical propulsion
NASA Technical Reports Server (NTRS)
Hayes, Joyce A.; Goldberg, Benjamin E.; Anderson, David M.
1995-01-01
This paper identifies the necessity of chemical propulsion for satellite usage and some of the benefits accrued through monitoring global resources and patterns, including the Global Climate Change Model (GCM). The paper also summarizes how satellite observations are used to shape national and international policies. Chemical propulsion, like all environmentally conscious industries, does introduce limited, controlled pollutant sources through its manufacture and usage. However, chemical propulsion is the sole means by which mankind can launch spacecraft and monitor the Earth. The information provided by remote sensing directly affects national and international policies designed to protect the environment and enhance the overall quality of life on Earth. The net result of chemical propulsion is thus the capability to reduce overall pollutant emissions to the benefit of mankind.
NASA Astrophysics Data System (ADS)
Barnes, Cris W.; Fernández, Juan; Hartsfield, Thomas; Sandberg, Richard; Sheffield, Richard; Tapia, John P.; Wang, Zhehui
2017-06-01
NNSA does not currently have a capability to understand and test the response of materials under the conditions necessary to determine the linkages between the microstructure of materials and their performance in extreme, weapons-relevant environments. What is required is an x-ray source that is coherent, to optimize imaging capability; brilliant and high repetition-rate, to address all relevant time scales; and of high enough energy to see into and through the amount of material at the middle or mesoscale, where microstructure determines materials response. The Department of Energy has determined there is a mission need for a MaRIE Project to deliver this capability. There are risks to the Project in successfully delivering all the technology needed to provide the capability for the mission need and to use those photons to control the time-dependent production and performance of materials. The present technology risk mitigation activities for the MaRIE project are: developing ultrafast high-energy x-ray detectors, combining the data from several imaging probes to obtain multi-dimensional information about the sample, and developing techniques for bulk dynamic measurements of temperature. This talk will describe these efforts and other critical technology elements requiring future investment by the project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.
While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.
Suomi satellite brings to light a unique frontier of nighttime environmental sensing capabilities
Miller, Steven D.; Mills, Stephen P.; Elvidge, Christopher D.; Lindsey, Daniel T.; Lee, Thomas F.; Hawkins, Jeffrey D.
2012-01-01
Most environmental satellite radiometers use solar reflectance information when it is available during the day but must resort at night to emission signals from infrared bands, which offer poor sensitivity to low-level clouds and surface features. A few sensors can take advantage of moonlight, but the inconsistent availability of the lunar source limits measurement utility. Here we show that the Day/Night Band (DNB) low-light visible sensor on the recently launched Suomi National Polar-orbiting Partnership (NPP) satellite has the unique ability to image cloud and surface features by way of reflected airglow, starlight, and zodiacal light illumination. Examples collected during new moon reveal not only meteorological and surface features, but also the direct emission of airglow structures in the mesosphere, including expansive regions of diffuse glow and wave patterns forced by tropospheric convection. The ability to leverage diffuse illumination sources for nocturnal environmental sensing applications extends the advantages of visible-light information to moonless nights. PMID:22984179
Chen, Kun; Wu, Tao; Wei, Haoyun; Zhou, Tian; Li, Yan
2016-01-01
Coherent anti-Stokes Raman scattering (CARS) microscopy is a quantitative, chemically specific, and label-free optical imaging technique for studying inhomogeneous systems. However, the complicating influence of the nonresonant response on the CARS signal severely limits its sensitivity and specificity, and especially limits the extent to which CARS microscopy has been used as a fully quantitative imaging technique. On the basis of the spectral focusing mechanism, we establish a dual-soliton-Stokes CARS microspectroscopy and microscopy scheme capable of quantifying the spatial distribution of densities and chemical composition within inhomogeneous samples, using a single fiber laser. The dual-soliton Stokes scheme not only removes the nonresonant background but also allows robust acquisition of multiple characteristic vibrational frequencies. This all-fiber laser source can cover the entire fingerprint region (800-2200 cm-1) with a spectral resolution of 15 cm-1. We demonstrate that quantitative determination of the degree of lipid-chain unsaturation in a fatty acid mixture can be achieved by characterizing the C=C stretching and CH2 deformation vibrations. For microscopy purposes, we show that the spatially inhomogeneous distribution of lipid droplets can be further quantitatively visualized using this quantified degree of lipid unsaturation in the acyl chain as contrast in hyperspectral CARS images. The combination of a compact excitation source and background-free capability to facilitate extraction of quantitative composition information from multiplex spectral peaks will enable wider applications of quantitative chemical imaging in studying biological and material systems. PMID:27867704
Biomimetic MEMS sensor array for navigation and water detection
NASA Astrophysics Data System (ADS)
Futterknecht, Oliver; Macqueen, Mark O.; Karman, Salmah; Diah, S. Zaleha M.; Gebeshuber, Ille C.
2013-05-01
The focus of this study is biomimetic concept development for a MEMS sensor array for navigation and water detection. The MEMS sensor array is inspired by abstractions of the respective biological functions: polarized-skylight-based navigation in honeybees (Apis mellifera) and the ability of African elephants (Loxodonta africana) to detect water. The focus lies on how to navigate to, and how to detect, water sources in desert-like or remote areas. The goal is to develop a sensor that can provide both navigation cues and help in detecting nearby water sources. We use the information provided by the natural polarization pattern produced by sunlight scattered within the atmosphere, combined with the capability of the honeybee's compound eye to extrapolate navigation information. The detection device uses light-beam-reactive MEMS, which are capable of detecting the skylight polarization based on the Rayleigh sky model. For water detection we present various possible approaches to realizing the sensor. In the first approach, polarization is used: moisture-saturated areas near the ground have a small but distinctly different effect on the scattering and polarization of light than less moist ones. Modified skylight polarization sensors (Karman, Diah and Gebeshuber, 2012) are used to visualize this small change in scattering. The second approach is inspired by the ability of elephants to detect infrasound produced by underground water reservoirs, and shall be used to determine the location of underground rivers and visualize their exact routes.
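The Rayleigh sky model invoked here has a compact closed form: the degree of linear polarization at angular distance γ from the sun is DoP = DoP_max · sin²γ / (1 + cos²γ), peaking 90° from the sun. A minimal sketch; the maximum degree of polarization of 0.75 is an assumed, site- and wavelength-dependent value:

    import numpy as np

    def rayleigh_dop(gamma_rad, dop_max=0.75):
        """Degree of linear polarization of skylight at angular distance
        gamma (radians) from the sun, single-scattering Rayleigh model."""
        s, c = np.sin(gamma_rad), np.cos(gamma_rad)
        return dop_max * s**2 / (1.0 + c**2)

    # The polarization band 90 degrees from the sun is what a sensor
    # mimicking the honeybee compound eye would track for heading cues.
    print(rayleigh_dop(np.radians([0.0, 45.0, 90.0])))   # [0.  0.25 0.75]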
Web-Based Surveillance Systems for Human, Animal, and Plant Diseases.
Madoff, Lawrence C; Li, Annie
2014-02-01
The emergence of infectious diseases, caused by novel pathogens or the spread of existing ones to new populations and regions, represents a continuous threat to humans and other species. The early detection of emerging human, animal, and plant diseases is critical to preventing the spread of infection and protecting the health of our species and environment. Today, more than 75% of emerging infectious diseases are estimated to be zoonotic and capable of crossing species barriers and diminishing food supplies. Traditionally, surveillance of diseases has relied on a hierarchy of health professionals that can be costly to build and maintain, leading to delays or interruptions in reporting. However, Internet-based surveillance systems bring another dimension to epidemiology by utilizing technology to collect, organize, and disseminate information in a more timely manner. Partially and fully automated systems allow for earlier detection of disease outbreaks by searching for information from both formal sources (e.g., World Health Organization and government ministry reports) and informal sources (e.g., blogs, online media sources, and social networks). Web-based applications display disparate information online or disperse it through e-mail to subscribers or the general public. Web-based early warning systems, such as ProMED-mail, the Global Public Health Intelligence Network (GPHIN), and HealthMap, have been able to recognize emerging infectious diseases earlier than traditional surveillance systems. These systems, which are continuing to evolve, are now widely utilized by individuals, humanitarian organizations, and government health ministries.
Liu, Hesheng; Schimpf, Paul H; Dong, Guoya; Gao, Xiaorong; Yang, Fusheng; Gao, Shangkai
2005-10-01
This paper presents a new algorithm called Standardized Shrinking LORETA-FOCUSS (SSLOFO) for solving the electroencephalogram (EEG) inverse problem. Multiple techniques are combined in a single procedure to robustly reconstruct the underlying source distribution with high spatial resolution. This algorithm uses a recursive process which takes the smooth estimate of sLORETA as initialization and then employs the re-weighted minimum norm introduced by FOCUSS. An important technique called standardization is involved in the recursive process to enhance the localization ability. The algorithm is further improved by automatically adjusting the source space according to the estimate of the previous step, and by the inclusion of temporal information. Simulation studies are carried out on both spherical and realistic head models. The algorithm achieves very good localization ability on noise-free data. It is capable of recovering complex source configurations with arbitrary shapes and can produce high quality images of extended source distributions. We also characterized the performance with noisy data in a realistic head model. An important feature of this algorithm is that the temporal waveforms are clearly reconstructed, even for closely spaced sources. This provides a convenient way to estimate neural dynamics directly from the cortical sources.
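A minimal numerical sketch of the FOCUSS-style reweighted minimum-norm iteration at the core of such algorithms, omitting the sLORETA standardization, source-space shrinking, and temporal steps that distinguish SSLOFO; the leadfield and data below are random stand-ins:

    import numpy as np

    rng = np.random.default_rng(1)
    m, n = 32, 500                        # sensors, source grid points
    L = rng.normal(size=(m, n))           # leadfield (forward) matrix
    s_true = np.zeros(n)
    s_true[[40, 220]] = [1.0, -0.8]       # two focal sources
    b = L @ s_true                        # noise-free measurements

    s = np.linalg.pinv(L) @ b             # smooth minimum-norm initialization
    for _ in range(20):                   # reweighted minimum-norm iterations
        W = np.diag(np.sqrt(np.abs(s)))   # weights from the previous estimate
        s = W @ np.linalg.pinv(L @ W) @ b
    print(np.flatnonzero(np.abs(s) > 1e-3))   # surviving (sparse) support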
Huang, Yingxiang; Lee, Junghye; Wang, Shuang; Sun, Jimeng; Liu, Hongfang; Jiang, Xiaoqian
2018-05-16
Data sharing has been a big challenge in biomedical informatics because of privacy concerns. Contextual embedding models have demonstrated a very strong representative capability to describe medical concepts (and their context), and they have shown promise as an alternative way to support deep-learning applications without the need to disclose original data. However, contextual embedding models acquired from individual hospitals cannot be directly combined because their embedding spaces are different, and naive pooling renders the combined embeddings useless. The aim of this study was to present a novel approach to address these issues and to promote the sharing of representations without sharing data. Without sacrificing privacy, we also aimed to build a global model from representations learned from local private data and to synchronize information from multiple sources. We propose a methodology that harmonizes different local contextual embeddings into a global model. We used Word2Vec to generate contextual embeddings from each source and Procrustes to fuse the different vector models into one common space, using a list of corresponding pairs as anchor points. We performed prediction analysis with the harmonized embeddings. We used sequential medical events extracted from the Medical Information Mart for Intensive Care III database to evaluate the proposed methodology in predicting the next likely diagnosis of a new patient using either structured data or unstructured data. Under different experimental scenarios, we confirmed that the global model built from harmonized local models achieves more accurate predictions than local models and global models built from naive pooling. Such aggregation of local models using our harmonization can serve as a proxy for a global model, combining information from a wide range of institutions and information sources. It allows information unique to a certain hospital to become available to other sites, increasing the fluidity of information flow in health care.
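A minimal sketch of the harmonization step, assuming two Word2Vec embedding matrices whose rows correspond to a shared list of anchor concepts (all data below are synthetic stand-ins), using scipy's orthogonal Procrustes solver:

    import numpy as np
    from scipy.linalg import orthogonal_procrustes

    rng = np.random.default_rng(2)
    dim, n_anchor = 100, 500
    global_anchors = rng.normal(size=(n_anchor, dim))    # reference space
    # A local hospital's embeddings: same concepts in a rotated, noisy space.
    R_true, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
    local_anchors = global_anchors @ R_true \
        + 0.01 * rng.normal(size=(n_anchor, dim))

    # Orthogonal map that carries the local space onto the global one.
    R, _ = orthogonal_procrustes(local_anchors, global_anchors)
    aligned = local_anchors @ R
    err = np.linalg.norm(aligned - global_anchors) / np.linalg.norm(global_anchors)
    print(f"relative alignment error: {err:.3f}")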
Extragalactic Science With Kepler
NASA Astrophysics Data System (ADS)
Fanelli, Michael N.; Marcum, P.
2012-01-01
Although designed as an exoplanet and stellar astrophysics experiment, the Kepler mission provides a unique capability to explore the essentially unknown photometric stability of galactic systems at millimag levels, using Kepler's blend of high precision and continuous monitoring. Time series observations of galaxies are sensitive to both quasi-continuous variability, driven by accretion activity from embedded active nuclei, and random, episodic events, such as supernovae. In general, galaxies lacking active nuclei are not expected to be variable with the timescales and amplitudes observed in stellar sources and are free of source motions that affect stars (e.g., parallax). These sources can serve as a population of quiescent, non-variable sources, which may be used to quantify the photometric stability and noise characteristics of the Kepler photometer. A factor limiting galaxy monitoring in the Kepler FOV is the overall lack of detailed quantitative information for the galaxy population. Despite these limitations, a significant number of galaxies are being observed, forming the Kepler Galaxy Archive. Observed sources total approximately 100, 250, and 700 in Cycles 1-3 (Cycle 3 began in June 2011). In this poster we present a set of 20 galaxies monitored during quarters 4 through 8: their associated light curves, photometric and astrometric precision, and potential variability. We describe data analysis issues relevant to extended sources and available software tools. In addition, we detail ongoing surveys that are providing new photometric and morphological information for galaxies over the entire field. These new datasets will both aid the interpretation of the time series and improve source selection, e.g., help identify candidate AGNs and starburst systems for further monitoring.
Extended X-ray Absorption Fine Structure Study of Bond Constraints in Ge-Sb-Te Alloys
2011-02-07
...X-Ray Absorption Spectroscopy, or EXAFS. Using the spectroscopic capabilities provided by the MCAT line at the Advanced Photon Source at Argonne National...
Probabilistic drug connectivity mapping
2014-01-01
Background: The aim of connectivity mapping is to match drugs using drug-treatment gene expression profiles from multiple cell lines. This can be viewed as an information retrieval task, with the goal of finding the most relevant profiles for a given query drug. We infer the relevance for retrieval by data-driven probabilistic modeling of the drug responses, resulting in probabilistic connectivity mapping, and further consider the available cell lines as different data sources. We use a special type of probabilistic model to separate what is shared and specific between the sources, in contrast to earlier connectivity mapping methods that have intentionally aggregated all available data, neglecting information about the differences between the cell lines. Results: We show that the probabilistic multi-source connectivity mapping method is superior to alternatives in finding functionally and chemically similar drugs from the Connectivity Map data set. We also demonstrate that an extension of the method is capable of retrieving combinations of drugs that match different relevant parts of the query drug response profile. Conclusions: The probabilistic modeling-based connectivity mapping method provides a promising alternative to earlier methods. Principled integration of data from different cell lines helps to identify relevant responses for specific drug repositioning applications. PMID:24742351
A computational framework for modeling targets as complex adaptive systems
NASA Astrophysics Data System (ADS)
Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh
2017-05-01
Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.
Computational toxicology using the OpenTox application programming interface and Bioclipse
2011-01-01
Background: Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings: This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions: A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173
Smart CMOS sensor for wideband laser threat detection
NASA Astrophysics Data System (ADS)
Schwarze, Craig R.; Sonkusale, Sameer
2015-09-01
The proliferation of lasers has led to their widespread use in applications ranging from short-range standoff chemical detection to long-range lidar sensing and target designation, operating across the UV to LWIR spectrum. Recent advances in high-energy lasers have renewed the development of laser weapons systems. The ability to measure and assess laser source information is important both to identify a potential threat and to determine safety and the nominal hazard zone (NHZ). Laser detection sensors are required that provide high dynamic range, wide spectral coverage, pulsed and continuous-wave detection, and a large field of view. OPTRA, Inc. and Tufts have developed a custom ROIC smart-pixel imaging sensor architecture and wavelength-encoding optics for measurement of source wavelength, pulse length, pulse repetition frequency (PRF), irradiance, and angle of arrival. The smart architecture provides dual linear and logarithmic operating modes to deliver more than eight orders of magnitude of signal dynamic range and nanosecond pulse measurement capability, and it can be hybridized with the appropriate detector array to provide UV through LWIR laser sensing. Recent advances in sputtering techniques provide the capability for post-processing CMOS dies from the foundry and patterning PbS and PbSe photoconductors directly on the chip to create a single monolithic sensor array architecture for measuring sources operating from 0.26 to 5.0 μm and 1 mW/cm² to 2 kW/cm².
An Intuitionistic Fuzzy Logic Model for Multicriteria Decision Making Under Uncertainty
NASA Astrophysics Data System (ADS)
Jana, Biswajit; Mohanty, Sachi Nandan
2017-04-01
The purpose of this paper is to enhance the applicability of fuzzy sets for developing mathematical models for decision making under uncertainty. In general, a decision-making process consists of four stages: collecting information from various sources, compiling the information, executing the information, and finally taking the decision or action. Only fuzzy set theory is capable of quantifying linguistic expressions in mathematical form in complex situations. Intuitionistic fuzzy sets (IFSs) reflect the fact that the degree of non-membership is not always equal to one minus the degree of membership; there may be some degree of hesitation. Thus, there are situations where IFS theory provides a more meaningful and applicable way to cope with the imprecise information present in multiple-criteria decision-making problems. This paper focuses on IFSs, which help solve real-world problems under uncertainty.
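A minimal sketch of the standard Atanassov operations implied here, with membership μ, non-membership ν, hesitation π = 1 − μ − ν, and the common score function μ − ν used to rank alternatives; the example ratings are invented:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class IFS:
        mu: float   # degree of membership
        nu: float   # degree of non-membership (mu + nu <= 1)

        @property
        def hesitation(self) -> float:
            return 1.0 - self.mu - self.nu

        def score(self) -> float:
            return self.mu - self.nu               # common ranking function

        def __and__(self, other: "IFS") -> "IFS":  # intersection (pessimistic)
            return IFS(min(self.mu, other.mu), max(self.nu, other.nu))

        def __or__(self, other: "IFS") -> "IFS":   # union (optimistic)
            return IFS(max(self.mu, other.mu), min(self.nu, other.nu))

    # Hypothetical ratings of one alternative on two criteria.
    a, b = IFS(0.6, 0.2), IFS(0.5, 0.3)
    print((a & b).score(), (a | b).score(), a.hesitation)  # ~0.2, 0.4, 0.2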
Information Management Systems for Monitoring and Documenting World Heritage - the Silk Roads CHRIS
NASA Astrophysics Data System (ADS)
Vileikis, O.; Serruys, E.; Dumont, B.; van Balen, K.; Santana Quinterod, M.; de Maeyer, P.; Tigny, V.
2012-07-01
This paper discusses the application of Information Management Systems (IMS) for documenting and monitoring World Heritage (WH) properties. The application of IMS in WH can support all stakeholders involved in the conservation and management of cultural heritage by making it easier to inventory, mine, and exchange information from multiple sources based on international standards. Moreover, IMS can assist in detecting damage and preparing management strategies to mitigate risks, slowing the deterioration of the integrity of WH properties. The case study of the Silk Roads Cultural Heritage Resource Information System (CHRIS), a project funded by the Belgian Federal Science Policy Office, illustrates the capabilities of IMS in the context of the nomination of the Central Asian Silk Roads to the WH List. This multi-lingual, web-based IMS will act as a collaborative platform allowing for the completion of improved transnational nomination dossiers and subsequent monitoring activities, with all necessary baseline information to easily verify the consistency and quality of the proposal. The Silk Roads CHRIS Geospatial Content Management System uses open source technologies and allows data from different scales and sources, including data from field recording methods, to be georeferenced and combined with historical and heritage features documented through various means such as textual descriptions, documents, photographs, 3D models, or videos. Moreover, tailored maps can be generated by overlaying a selection of available layers and exported to support the nomination dossier. Finally, by using this innovative information and decision support system, the States Parties and other interested stakeholders will have access to a complete nomination dossier and can therefore respond more effectively to hazards and disaster phenomena.
Development of XAFS Into a Structure Determination Technique
NASA Astrophysics Data System (ADS)
Stern, E. A.
After the detection of the diffraction of x-rays by M. Laue in 1912, the technique was applied to structure determination by Bragg within a year. On the other hand, although the edge steps in x-ray absorption were discovered even earlier by Barkla, and both the near-edge (XANES) and extended x-ray fine structure (EXAFS) past the edge were detected by 1929, it still took over 40 years to recognize the structural information contained in this x-ray absorption fine structure (XAFS). To understand this delay, a brief historical review is given of the development of the scientific ideas that transformed XAFS into the premier technique for local structure determination. The development includes advances both in theoretical understanding and calculational capabilities and in experimental facilities, especially synchrotron radiation sources. The present state of the XAFS technique and its capabilities are summarized.
Aircraft Icing Weather Data Reporting and Dissemination System
NASA Technical Reports Server (NTRS)
Bass, Ellen J.; Minsk, Brian; Lindholm, Tenny; Politovich, Marcia; Reehorst, Andrew (Technical Monitor)
2002-01-01
The long-term operational concept of this research is to develop an onboard aircraft system that assesses and reports atmospheric icing conditions automatically and in a timely manner in order to improve aviation safety and the efficiency of aircraft operations via improved real-time and forecast weather products. The idea is to use current measurement capabilities on aircraft equipped with icing sensors and in-flight data communication technologies as a reporting source. Without requiring expensive avionics upgrades, aircraft data must be processed and made available for downlink. Ideally, the data from multiple aircraft can then be integrated (along with other real-time and modeled data) on the ground such that aviation-centered icing hazard metrics for volumes of airspace can be assessed. As the effect of icing varies across aircraft types, the information must be presented in ways that allow multiple types of users to understand the icing conditions with respect to their individual concerns and aircraft capabilities. This research provides progress toward this operational concept by: identifying an aircraft platform capable of digitally capturing, processing, and downlinking icing data; identifying the required in situ icing data processing; investigating the requirements for routing the icing data for use by weather products; developing an icing case study in order to gain insight into major air carrier needs; developing and prototyping icing display concepts based on the National Center for Atmospheric Research's existing diagnostic and forecast experimental icing products; and conducting a usability study for the prototyped icing display concepts.
Fuselets: an agent based architecture for fusion of heterogeneous information and data
NASA Astrophysics Data System (ADS)
Beyerer, Jürgen; Heizmann, Michael; Sander, Jennifer
2006-04-01
A new architecture for fusing information and data from heterogeneous sources is proposed. The approach takes criminalistics as a model. In analogy to the work of detectives, who investigate crimes, software agents are initiated that pursue clues and try to consolidate or dismiss hypotheses. Like their human counterparts, they can consult expert agents if questions beyond their competences arise. Within the context of a certain task, region, and time interval, specialized operations are applied to each relevant information source, e.g. IMINT, SIGINT, ACINT, ..., HUMINT, databases, etc., in order to establish hit lists of first clues. Each clue is described by its pertaining facts, uncertainties, and dependencies in the form of a local degree-of-belief (DoB) distribution in a Bayesian sense. For each clue an agent is initiated which cooperates with other agents and experts. Expert agents, each capable of accessing certain information sources, help make use of those sources; consultations of experts result in changes to the DoB of the pertaining clue. According to how concentrated their DoB distributions are, clues are abandoned or pursued further to formulate task-specific hypotheses. Communication between the agents serves to find out whether different clues belong to the same cause and thus can be put together. At the end of the investigation process, the different hypotheses are evaluated by a jury and a final report is created that constitutes the fusion result. The approach avoids calculating global DoB distributions by adopting a local Bayesian approximation and thus substantially reduces the complexity of the exact problem. Different information sources are transformed into DoB distributions using the maximum entropy paradigm, with known facts as constraints. Nominal, ordinal, and cardinal quantities can be treated equally within this framework. The architecture is scalable by tailoring the number of agents to the available computer resources, the priority of tasks, and the maximum duration of the fusion process. Furthermore, the architecture allows cooperative work of human and automated agents and experts, as long as not all subtasks can be accomplished automatically.
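A minimal sketch of the maximum-entropy step: given a known fact expressed as a moment constraint, here an assumed mean of an ordinal score, the least-committal DoB over discrete states is the exponential-family solution p_i ∝ exp(λ·x_i), with the multiplier λ chosen to satisfy the constraint:

    import numpy as np
    from scipy.optimize import brentq

    states = np.arange(1, 7)       # e.g. an ordinal quantity scored 1..6
    target_mean = 4.2              # "known fact" used as the constraint

    def mean_given_lambda(lam):
        p = np.exp(lam * states)
        p /= p.sum()
        return p @ states

    # Solve for the Lagrange multiplier matching the constrained mean.
    lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)
    p = np.exp(lam * states)
    p /= p.sum()
    print(p, p @ states)           # maximum-entropy DoB with the desired mean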
Cost Comparison of B-1B Non-Mission-Capable Drivers Using Finite Source Queueing with Spares
2012-09-06
...step into the lineup, making large-number approximations unusable. Instead, a finite source queueing model including spares is incorporated... were reported as flying time accrued since last occurrence. Service time was given in both start-stop format and MX man-hours utilized.
NASA Astrophysics Data System (ADS)
Subedi, Kiran; Trejos, Tatiana; Almirall, José
2015-01-01
Elemental analysis, using either LA-ICP-MS or LIBS, can be used for the chemical characterization of materials of forensic interest, both to discriminate between materials originating from different sources and to associate materials known to originate from the same source. In this study, a tandem LIBS/LA-ICP-MS system that combines the benefits of both LIBS and LA-ICP-MS was evaluated for the characterization of samples of printing inks (toners, inkjet, intaglio, and offset). The performance of both laser sampling methods is presented. A subset of 9 black laser toners, 10 colored (CMYK) inkjet samples, 12 colored (CMYK) offset samples, and 12 intaglio inks originating from different manufacturing sources was analyzed to evaluate the discrimination capability of the tandem method. These samples were selected because they presented very similar elemental profiles by LA-ICP-MS. Although typical discrimination between different ink sources is found to be > 99% for a variety of inks when LA-ICP-MS alone is used, additional discrimination was achieved by combining the elemental results from the LIBS analysis with the LA-ICP-MS analysis in the tandem technique, enhancing the overall discrimination capability of the individual laser ablation methods. The LIBS measurements of the Ca, Fe, K, and Si signals, in particular, improved the discrimination for this specific set of ink samples previously shown to exhibit very similar LA-ICP-MS elemental profiles. The combination of these two techniques in a single setup resulted in better discrimination of the printing inks, with two distinct fingerprint spectra providing information from atomic/ionic emissions and isotopic composition (m/z) for each ink sample.
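As a sketch of how the tandem feature vectors might be combined and classified (the matrices below are random stand-ins for LIBS line intensities such as Ca, Fe, K, and Si concatenated with LA-ICP-MS measurements; the paper does not specify this particular classifier):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n_sources, reps = 9, 5                 # e.g. 9 toner sources, 5 replicates
    labels = np.repeat(np.arange(n_sources), reps)
    # Hypothetical per-sample features keyed weakly to the source identity.
    libs = rng.normal(labels[:, None], 0.6, size=(n_sources * reps, 4))
    laicpms = rng.normal(labels[:, None], 0.6, size=(n_sources * reps, 12))

    for name, X in [("LA-ICP-MS only", laicpms),
                    ("tandem", np.hstack([libs, laicpms]))]:
        acc = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=5)
        print(f"{name}: mean CV accuracy {acc.mean():.2f}")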
Searching Across the International Space Station Databases
NASA Technical Reports Server (NTRS)
Maluf, David A.; McDermott, William J.; Smith, Ernest E.; Bell, David G.; Gurram, Mohana
2007-01-01
Data access in the enterprise generally requires combining data from different sources and different formats. It is thus advantageous to focus on the intersection of the knowledge across sources and domains; keeping irrelevant knowledge around only serves to make the integration more unwieldy and more complicated than necessary. A context search over multiple domains is proposed in this paper, using context-sensitive queries to support disciplined manipulation of domain knowledge resources. The objective of a context search is to provide the capability for interrogating many domain knowledge resources which are largely semantically disjoint. The search formally supports the tasks of selecting, combining, extending, specializing, and modifying components from a diverse set of domains. This paper demonstrates a new paradigm in the composition of information for enterprise applications. In particular, it discusses an approach to achieving data integration across multiple sources in a manner that does not require heavy investment in database and middleware maintenance. This lean approach to integration leads to cost-effectiveness and scalability of data integration with an underlying schemaless object-relational database management system. This highly scalable, information-on-demand framework, called NX-Search, is an implementation of an information system built on NETMARK, a flexible, high-throughput open database integration framework for managing, storing, and searching unstructured or semi-structured arbitrary XML and HTML used widely at the National Aeronautics and Space Administration (NASA) and in industry.
Study of gamma detection capabilities of the REWARD mobile spectroscopic system
NASA Astrophysics Data System (ADS)
Balbuena, J. P.; Baptista, M.; Barros, S.; Dambacher, M.; Disch, C.; Fiederle, M.; Kuehn, S.; Parzefall, U.
2017-07-01
REWARD is a novel mobile spectroscopic radiation detector system for Homeland Security applications. The system integrates gamma and neutron detection equipped with wireless communication. A comprehensive simulation study on its gamma detection capabilities in different radioactive scenarios is presented in this work. The gamma detection unit consists of a precise energy resolution system based on two stacked (Cd,Zn)Te sensors working in coincidence sum mode. The volume of each of these CZT sensors is 1 cm³. The investigated energy windows used to determine the detection capabilities of the detector correspond to the gamma emissions from 137Cs and 60Co radioactive sources (662 keV and 1173/1333 keV, respectively). Monte Carlo and Technology Computer-Aided Design (TCAD) simulations are combined to determine its sensing capabilities for different radiation sources and estimate the limits of detection of the sensing unit as a function of source activity for several shielding materials.
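For context on how such limits of detection are commonly expressed: with B background counts in the energy window, the Currie detection limit is L_D ≈ 2.71 + 4.65√B counts, and the minimum detectable activity follows from the photopeak efficiency, branching ratio, and counting time. A minimal sketch; the numbers below are assumptions for illustration, not REWARD's simulated values:

    import numpy as np

    def minimum_detectable_activity(bkg_counts, efficiency, branching, t_s):
        """Currie minimum detectable activity (Bq) for one gamma line,
        using L_D = 2.71 + 4.65*sqrt(B) counts at 95% confidence."""
        l_d = 2.71 + 4.65 * np.sqrt(bkg_counts)
        return l_d / (efficiency * branching * t_s)

    # Hypothetical values for the 662 keV 137Cs window of a small CZT stack.
    mda = minimum_detectable_activity(bkg_counts=200.0, efficiency=1e-4,
                                      branching=0.85, t_s=60.0)
    print(f"{mda:.0f} Bq")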
Zanoni, Michele; Piccinini, Filippo; Arienti, Chiara; Zamagni, Alice; Santi, Spartaco; Polico, Rolando; Bevilacqua, Alessandro; Tesei, Anna
2016-01-01
The potential of a spheroid tumor model composed of cells in different proliferative and metabolic states for the development of new anticancer strategies has been amply demonstrated. However, there is little or no information in the literature on the problems of reproducibility of data originating from experiments using 3D models. Our analyses, carried out using a novel open source software capable of performing an automatic image analysis of 3D tumor colonies, showed that a number of morphology parameters affect the response of large spheroids to treatment. In particular, we found that both spheroid volume and shape may be a source of variability. We also compared some commercially available viability assays specifically designed for 3D models. In conclusion, our data indicate the need for a pre-selection of tumor spheroids of homogeneous volume and shape to reduce data variability to a minimum before use in a cytotoxicity test. In addition, we identified and validated a cytotoxicity test capable of providing meaningful data on the damage induced in large tumor spheroids of up to 650 μm in diameter by different kinds of treatments. PMID:26752500
Power source evaluation capabilities at Sandia National Laboratories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doughty, D.H.; Butler, P.C.
1996-04-01
Sandia National Laboratories maintains one of the most comprehensive power source characterization facilities in the U.S. National Laboratory system. This paper describes the capabilities for evaluation of fuel cell technologies. The facility has a rechargeable battery test laboratory and a test area for performing nondestructive and functional computer-controlled testing of cells and batteries.
Exploring Market and Competitive Intelligence Research as a Source for Enhancing Innovation Capacity
ERIC Educational Resources Information Center
Bajaj, Deepak
2015-01-01
The purpose of this study was to assess the role of Competitive and Market Intelligence (CI/MI) Research as a potential source for improving the innovation capability of Small and Medium Enterprises (SMEs), leading to successful new product/service/process/capability development (Cooper & Edgett, 2002). This report highlights the…
Understanding USGS user needs and Earth observing data use for decision making
NASA Astrophysics Data System (ADS)
Wu, Z.
2016-12-01
The US Geological Survey (USGS) initiated the Requirements, Capabilities and Analysis for Earth Observations (RCA-EO) project in the Land Remote Sensing (LRS) program, collaborating with the National Oceanic and Atmospheric Administration (NOAA) to jointly develop the supporting information infrastructure, the Earth Observation Requirements Evaluation System (EORES). RCA-EO enables us to collect information on current data products and projects across the USGS and evaluate the impacts of Earth observation data from all sources, including spaceborne, airborne, and ground-based platforms. EORES allows users to query, filter, and analyze the usage and impacts of Earth observation data at different organizational levels within the bureau. We engaged over 500 subject matter experts and evaluated more than 1000 different Earth observing data sources and products. RCA-EO provides a comprehensive way to evaluate the impacts of Earth observing data on USGS mission areas and programs through the survey of 345 key USGS products and services. We paid special attention to user feedback about Earth observing data to inform decision making on improving user satisfaction. We believe the approach and philosophy of RCA-EO can be applied in a much broader scope to derive comprehensive knowledge of the impacts and usage of Earth observing systems and to inform data product development and remote sensing technology innovation.
NASA Astrophysics Data System (ADS)
King, John L.; Corwin, Dennis L.
Information technologies are already delivering important new capabilities for scientists working on non-point source (NPS) pollution in the vadose zone, and more are expected. This paper focuses on the special contributions of modeling and network communications for enhancing the effectiveness of scientists in policy debates regarding NPS pollution mitigation and abatement. The discussion examines a fundamental shift from a strict regulatory strategy of pollution control, characterized by a bureaucratic/technical alliance through the 1970s and early 1980s, to a more recently evolving paradigm of pluralistic environmental management. The role of science and scientists in this shift is explored, with special attention to the challenges facing scientists working on NPS pollution in the vadose zone. These scientists labor under a special handicap in the evolving model because their scientific tools are often incapable of linking NPS pollution with the individuals responsible for causing it. Information technology can facilitate the effectiveness of these scientists in policy debates, but not under the usual assumptions in which scientific truth prevails. Instead, its key role is in helping scientists shape the evolving discussion of trade-offs and in bringing citizens and policymakers closer to the routine work of scientists.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-26
... assess disaster logistics planning and response capabilities and identify areas of relative strength and...; Logistics Capability Assessment Tool (LCAT) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice...: Collection of Information Title: Logistics Capability Assessment Tool (LCAT). Type of Information Collection...
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
The Preliminary Issues Database (PIDB) was assembled very early in the study as one of the fundamental tools to be used throughout the study. Data was acquired from a variety of sources and compiled in such a way that the data could be easily sorted in accordance with a number of different analytical objectives. The system was computerized to significantly expedite sorting and make it more usable. The information contained in the PIDB is summarized and the reader is provided with the capability to manually find items of interest.
The application of remote sensing techniques: Technical and methodological issues
NASA Technical Reports Server (NTRS)
Polcyn, F. C.; Wagner, T. W.
1974-01-01
Capabilities and limitations of modern imaging electromagnetic sensor systems are outlined, and the products of such systems are compared with those of the traditional aerial photographic system. Focus is given to the interface between the rapidly developing remote sensing technology and the information needs of operational agencies, and communication gaps are shown to retard early adoption of the technology by these agencies. An assessment is made of the current status of imaging remote sensors and their potential for the future. Public sources of remote sensor data and several cost comparisons are included.
NASA Technical Reports Server (NTRS)
Lietzke, K. R.
1974-01-01
The application of remotely-sensed information to the mineral, fossil fuel, and geothermal energy extraction industry is investigated. Public and private cost savings are documented in geologic mapping activities. Benefits and capabilities accruing to the ERS system are assessed. It is shown that remote sensing aids in resource extraction, as well as the monitoring of several dynamic phenomena, including disturbed lands, reclamation, erosion, glaciation, and volcanic and seismic activity.
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-03-17
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing.
Virtual Observatory Science Applications
NASA Technical Reports Server (NTRS)
McGlynn, Tom
2005-01-01
Many Virtual-Observatory-based applications are now available to astronomers for use in their research. These span data discovery, access, visualization and analysis. Tools can quickly gather and organize information from sites around the world to help in planning a response to a gamma-ray burst, help users pick filters to isolate a desired feature, make an average template for z=2 AGN, select sources based upon information in many catalogs, or correlate massive distributed databases. Using VO protocols, the reach of existing software tools and packages can be greatly extended, allowing users to find and access remote information almost as conveniently as local data. The talk highlights just a few of the tools available to scientists, describes how both large and small scale projects can use existing tools, and previews some of the new capabilities that will be available in the next few years.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Manajit; Habte, Aron; Gueymard, Christian
As the world looks for low-carbon sources of energy, solar power stands out as the single most abundant energy resource on Earth. Harnessing this energy is the challenge for this century. Photovoltaics, solar heating and cooling, and concentrating solar power (CSP) are primary forms of energy applications using sunlight. These solar energy systems use different technologies, collect different fractions of the solar resource, and have different siting requirements and production capabilities. Reliable information about the solar resource is required for every solar energy application. This holds true for small installations on a rooftop as well as for large solar power plants; however, solar resource information is of particular interest for large installations, because they require substantial investment, sometimes exceeding 1 billion dollars in construction costs. Before such a project is undertaken, the best possible information about the quality and reliability of the fuel source must be made available. That is, project developers need reliable data about the solar resource available at specific locations, including historic trends with seasonal, daily, hourly, and (preferably) subhourly variability to predict the daily and annual performance of a proposed power plant. Without this data, an accurate financial analysis is not possible. Additionally, with the deployment of large amounts of distributed photovoltaics, there is an urgent need to integrate this source of generation to ensure the reliability and stability of the grid. Forecasting generation from the various sources will allow for larger penetrations of these generation sources because utilities and system operators can then ensure stable grid operations. Developed by the foremost experts in the field who have come together under the umbrella of the International Energy Agency's Solar Heating and Cooling Task 46, this handbook summarizes state-of-the-art information about all the above topics.
TOWARD THE DEVELOPMENT OF A CONSENSUS MATERIALS DATABASE FOR PRESSURE TECHNOLOGY APPLICATIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, Robert W; Ren, Weiju
The ASME construction code books specify materials and fabrication procedures that are acceptable for pressure technology applications. However, with few exceptions, the materials properties provided in the ASME code books include no statistics or other information pertaining to material variability. Such information is central to the prediction and prevention of failure events. Many sources of materials data exist that provide variability information, but such sources do not necessarily represent a consensus of experts with respect to the reported trends. Such a need has been identified by the ASME Standards Technology, LLC, and initial steps have been taken to address it; however, these steps are limited to project-specific applications only, such as the joint DOE-ASME project on materials for Generation IV nuclear reactors. In contrast to light-water reactor technology, the experience base for the Generation IV nuclear reactors is somewhat lacking, and heavy reliance must be placed on model development and predictive capability. The database for model development is being assembled and includes existing code alloys such as alloy 800H and 9Cr-1Mo-V steel. Ownership and use rights are potential barriers that must be addressed.
Source Methodology for Turbofan Noise Prediction (SOURCE3D Technical Documentation)
NASA Technical Reports Server (NTRS)
Meyer, Harold D.
1999-01-01
This report provides the analytical documentation for the SOURCE3D Rotor Wake/Stator Interaction Code. It derives the equations for the rotor scattering coefficients and the stator source vector and scattering coefficients that are needed for use in TFANS (Theoretical Fan Noise Design/Prediction System). SOURCE3D treats the rotor and stator as isolated source elements. TFANS uses this information, along with scattering coefficients for inlet and exit elements, and provides complete noise solutions for turbofan engines. SOURCE3D is composed of a collection of FORTRAN programs that have been obtained by extending the approach of the earlier V072 Rotor Wake/Stator Interaction Code. Similar to V072, it treats the rotor and stator as a collection of blades and vanes having zero thickness and camber contained in an infinite, hardwall annular duct. SOURCE3D adds important features to the V072 capability: a rotor element, swirl flow and vorticity waves, actuator disks for flow turning, and combined rotor/actuator disk and stator/actuator disk elements. These items allow reflections from the rotor, frequency scattering, and mode trapping, thus providing more complete noise predictions than previously possible. The code has been thoroughly verified through comparison with D.B. Hanson's CUP2D two-dimensional code using a narrow annulus test case.
Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform
NASA Astrophysics Data System (ADS)
Shrestha, S. R.; Viswambharan, V.; Doshi, A.
2017-12-01
Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as to create information products from large volumes of data using image server technology, are becoming widely used in earth science and other domains. We will discuss and share the challenges associated with big data in the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (stored on premises or in the cloud), disseminate them dynamically, process and analyze them on the fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing with raster functions can be extended to create persisted data and information products using raster analytics capabilities that exploit distributed computing in an enterprise environment.
Unified Simulation and Analysis Framework for Deep Space Navigation Design
NASA Technical Reports Server (NTRS)
Anzalone, Evan; Chuang, Jason; Olsen, Carrie
2013-01-01
As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements and determine their effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, its analysis capabilities, and its unification back to the overall system requirements and definition.
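A minimal sketch of the agent-based pattern described above; all class and message names are illustrative inventions, not the paper's API. Each spacecraft agent fuses measurement packets exchanged over a simulated network to refine its estimate of a shared quantity:

    import random

    COMMON_EPOCH = 42.0   # shared quantity every agent estimates (illustrative)

    class SpacecraftAgent:
        """Illustrative agent: holds its own estimate and fuses packets
        delivered over the simulated network."""
        def __init__(self, name):
            self.name, self.estimate = name, 0.0

        def measure(self):
            # Local noisy observation, packaged for transmission.
            return (self.name, COMMON_EPOCH + random.gauss(0.0, 0.5))

        def receive(self, packet):
            _, obs = packet
            self.estimate += 0.1 * (obs - self.estimate)  # stand-in for a filter

    agents = [SpacecraftAgent(f"sc{i}") for i in range(3)]
    for _ in range(200):                      # network exchange loop
        for tx in agents:
            packet = tx.measure()
            for rx in agents:                 # every agent fuses every packet
                rx.receive(packet)
    print({a.name: round(a.estimate, 2) for a in agents})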
NASA Astrophysics Data System (ADS)
Webb-Williams, Jane
2017-04-01
Self-efficacy has been shown to influence student engagement, effort, and performance, as well as course selection and future career choice. Extending our knowledge regarding the development of self-efficacy has important implications for educators and for those concerned about the international uptake of science careers. Previous research has identified four sources that may contribute to self-efficacy: mastery experiences, vicarious experiences, verbal persuasion, and physiological/affective states. Very little research has been conducted within the school environment on the formation of these sources, and yet early school experiences have been posited to be a key factor in girls' lack of engagement in post-compulsory science education. This paper investigates children's self-efficacy beliefs in science and reports findings from mixed-methods research conducted with 182 children aged between 10 and 12 years. Classroom data were collected through focus groups, individual interviews, and surveys. Findings revealed that although girls and boys showed similar levels of academic performance in science, many girls underestimated their capability. The four sources of self-efficacy identified by Bandura (1997), plus self-regulation as an additional source, were evident in the children's descriptions, with boys being more influenced by mastery experience and girls by a combination of vicarious experience and physiological/affective states. Girls' appraisal of information appeared to operate through a heuristic process whereby they disregarded salient information, such as teacher feedback, in favour of reliance on social comparison. Contextual factors were identified. Implications for science teachers are discussed.
A Shared Infrastructure for Federated Search Across Distributed Scientific Metadata Catalogs
NASA Astrophysics Data System (ADS)
Reed, S. A.; Truslove, I.; Billingsley, B. W.; Grauch, A.; Harper, D.; Kovarik, J.; Lopez, L.; Liu, M.; Brandt, M.
2013-12-01
The vast amount of science metadata can be overwhelming and highly complex. Comprehensive analysis and sharing of metadata is difficult since institutions often publish to their own repositories. There are many disjoint standards used for publishing scientific data, making it difficult to discover and share information from different sources. Services that publish metadata catalogs often have different protocols, formats, and semantics. The research community is limited by the exclusivity of separate metadata catalogs, and thus it is desirable to have federated search interfaces capable of unified search queries across multiple sources. Aggregation of metadata catalogs also enables users to critique metadata more rigorously. With these motivations in mind, the National Snow and Ice Data Center (NSIDC) and the Advanced Cooperative Arctic Data and Information Service (ACADIS) implemented two search interfaces for the community. Both the NSIDC Search and the ACADIS Arctic Data Explorer (ADE) use a common infrastructure, which keeps maintenance costs low. The search clients are designed to make OpenSearch requests against Solr, an Open Source search platform. Solr applies indexes to specific fields of the metadata, which in this instance optimizes queries containing keywords, spatial bounds, and temporal ranges. NSIDC metadata is reused by both search interfaces, but the ADE also brokers additional sources. Users can quickly find relevant metadata with minimal effort, which ultimately lowers costs for research. This presentation will highlight the reuse of data and code between NSIDC and ACADIS, discuss challenges and milestones for each project, and describe the creation and use of Open Source libraries.
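For illustration, a query of the general kind such a search client might issue against a Solr backend is sketched below. The host, core name, and field names are assumptions for the sketch, not NSIDC's actual schema.

```python
import requests

# Hypothetical Solr endpoint; core and field names are placeholders.
SOLR_URL = "http://localhost:8983/solr/metadata/select"

params = {
    "q": "full_text:(sea ice extent)",                   # keyword search
    "fq": [
        "temporal_start:[2010-01-01T00:00:00Z TO *]",    # temporal range filter
        "spatial:[45,-180 TO 90,180]",                   # bounding box (lat,lon)
    ],
    "rows": 25,
    "wt": "json",
}
resp = requests.get(SOLR_URL, params=params)
for doc in resp.json()["response"]["docs"]:
    print(doc.get("title"), doc.get("source"))
```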
Elysee, Gerald; Herrin, Jeph; Horwitz, Leora I
2017-10-01
Stagnation in hospitals' adoption of data integration functionalities, coupled with a reduction in the number of operational health information exchanges, could become a significant impediment to hospitals' adoption of 3 critical capabilities: electronic health information exchange, interoperability, and medication reconciliation, in which electronic systems are used to assist with resolving medication discrepancies and improving patient safety. Against this backdrop, we assessed the relationships between the 3 capabilities. We conducted an observational study applying the partial least squares-structural equation modeling technique to 27 variables obtained from the 2013 American Hospital Association annual survey Information Technology (IT) supplement, which describes health IT capabilities. We included 1330 hospitals. In confirmatory factor analysis, 15 of the 27 variables achieved loading values greater than 0.548 at P < .001, and as such were validated as the building blocks of the 3 capabilities. Subsequent path analysis showed a significant, positive, and cyclic relationship between the capabilities, in that decreases in hospitals' adoption of one would lead to decreases in the adoption of the others. These results show that the capability for high-quality medication reconciliation may be impeded by lagging adoption of interoperability and health information exchange capabilities. Policies focused on improving one or more of these capabilities may have ancillary benefits.
NASA Astrophysics Data System (ADS)
Hutchens, Robert E., III
2001-04-01
Joint force commanders must have the right information at the right time in order to make the best decisions to conduct successful contingency operations in defense of U.S. national security interests. A key enabler to this end is sufficient wideband satellite communications connectivity. DoD's (Department of Defense) organic wideband satellite communications capabilities are inadequate, so commercial services must be used to overcome the shortfall. The problem is to dedicate enough resources in the most efficient manner to meet this growing need, and time is of the essence. This paper capitalizes on the vast work already accomplished concerning how DoD can obtain the commercial wideband satellite communications it needs. DoD is procuring advanced satellite ground terminals capable of using commercial wideband satellites and is contracting to launch more of its own capabilities, but the gap continues to widen. This paper offers a solution of procuring 140 percent of DoD's projected wideband satellite communications needs from commercial sources, to ensure sufficient capacity is available to support contingency operations.
NASA Technical Reports Server (NTRS)
Babiak-Vazquez, Adriana; Ruffaner, Lanie M.; Wear, Mary L.; Crucian, Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary
2016-01-01
In 2010, NASA implemented the Lifetime Surveillance of Astronaut Health, a formal occupational surveillance program for the U.S. astronaut corps. Because of the nature of the space environment, space medicine presents unique challenges and opportunities for epidemiologists. One such example is the use of telemedicine while crewmembers are in flight, where the primary source of information about crew health is verbal communication between physicians and their crewmembers. Due to restricted medical capabilities, the available health information consists primarily of crewmember reports of signs and symptoms, rather than diagnoses. As epidemiologists at NASA Johnson Space Center, we have shifted our paradigm from tracking diagnoses based on traditional terrestrial clinical practice to one in which we also incorporate reported symptomology as potential antecedents of disease. In this presentation we describe how characterization of reported signs and symptoms can be used to establish incidence rates for inflight immunologic events. We describe the interdisciplinary data sources that are used in combination with medical information to analyze the data. We also delineate criteria for symptom classification and inclusion. Finally, we present incidence tables and graphs to illustrate the final outcomes. Using signs and symptoms reported via telemedicine, the epidemiologists provide summary evidence regarding the incidence of potential inflight medical conditions. These results inform NASA physicians and scientists, and support evaluation of the occupational health risks associated with spaceflight.
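As a toy illustration of the incidence-rate calculation underlying such tables, the numbers below are invented for the example (not NASA data): events per person-year of spaceflight exposure.

```python
# Made-up illustrative inputs, not NASA data.
reported_events = 12              # symptom reports meeting inclusion criteria
person_days_in_flight = 4_380     # total crew days on orbit in the study window

person_years = person_days_in_flight / 365.25
incidence_rate = reported_events / person_years
print(f"{incidence_rate:.2f} events per person-year")   # -> 1.00
```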
Ozone Satellite Data Synergy and Combination with Non-satellite Data in the AURORA project
NASA Astrophysics Data System (ADS)
Cortesi, U.; Tirelli, C.; Arola, A.; Dragani, R.; Keppens, A.; Loenen, E.; Masini, A.; Tsiakos, C.; van der A, R.; Verberne, K.
2017-12-01
The geostationary satellite constellation composed of the TEMPO (North America), SENTINEL-4 (Europe), and GEMS (Asia) missions is a major instance of the space component of a fundamentally new paradigm aimed at integrating information on air quality from a wide variety of sources. Space-borne data on tropospheric composition from new-generation satellites have a growing impact in this context because of their unprecedented quantity and quality, while merging them with non-satellite measurements and other types of auxiliary data via state-of-the-art modelling capabilities remains essential to deliver highly accurate information at high temporal and spatial resolution, in both analysis and forecast mode. Proper and effective implementation of this paradigm poses severe challenges to science, technology, and applications that must be addressed in a closely interconnected manner to pave the way to high-quality products and innovative services. Novel ideas and tools built on these three pillars are currently under investigation in the AURORA (Advanced Ultraviolet Radiation and Ozone Retrieval for Applications) Horizon 2020 project of the European Commission. The primary goal of the project is the proof of concept of a synergistic approach to the exploitation of Sentinel-4 and -5 ozone measurements in the UV, visible, and thermal infrared, based on the combination of an innovative data fusion method and assimilation models. This scientific objective shares the same level of priority with the technological effort to realize a prototype data processor capable of managing the full data processing chain, and with the development of two downstream applications for demonstration purposes. The presentation offers a first insight into the mid-term results of the project, which is mostly based on the use of synthetic data from the atmospheric Sentinels. Specific focus is given to the role of satellite data synergy in integrated systems for air quality monitoring, in particular when testing the impact of TEMPO and GEMS ozone data in AURORA. As a further element relevant to the integration of multiple data sources, we describe the AIR-Portal application, which will combine AURORA partial columns of tropospheric ozone with other sources of information for air quality analysis and forecasting in metropolitan areas.
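As a schematic of one elementary fusion step, the snippet below combines two independent ozone retrievals by inverse-variance weighting. This is a generic textbook combination with invented numbers, not AURORA's actual fusion algorithm.

```python
import numpy as np

# Illustrative retrievals: value [DU] and error variance for two instruments.
o3_a, var_a = 285.0, 12.0**2
o3_b, var_b = 297.0, 18.0**2

w_a, w_b = 1.0 / var_a, 1.0 / var_b
o3_fused = (w_a * o3_a + w_b * o3_b) / (w_a + w_b)   # weighted mean
var_fused = 1.0 / (w_a + w_b)                        # reduced variance
print(f"fused: {o3_fused:.1f} DU, sigma = {np.sqrt(var_fused):.1f} DU")
# -> fused: 288.7 DU, sigma = 10.0 DU
```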
Digital Image Processing Overview For Helmet Mounted Displays
NASA Astrophysics Data System (ADS)
Parise, Michael J.
1989-09-01
Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orvis, W.J.
1993-11-03
The Computer Incident Advisory Capability (CIAC) operates two information servers for the DOE community, FELICIA (formerly FELIX) and IRBIS. FELICIA is a computer Bulletin Board System (BBS) that can be accessed by telephone with a modem. IRBIS is an anonymous ftp server that can be accessed on the Internet. Both of these servers contain all of the publicly available CIAC, CERT, NIST, and DDN bulletins, virus descriptions, the VIRUS-L moderated virus bulletin board, copies of public domain and shareware virus-detection/protection software, and copies of useful public domain and shareware utility programs. This guide describes how to connect to these systems and obtain files from them.
NASA Astrophysics Data System (ADS)
Udell, C.; Selker, J. S.
2017-12-01
The increasing availability and functionality of Open-Source software and hardware, along with 3D printing, low-cost electronics, and the proliferation of open-access resources for learning rapid prototyping, are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State University and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.
Military clouds: utilization of cloud computing systems at the battlefield
NASA Astrophysics Data System (ADS)
Süleyman, Sarıkürk; Volkan, Karaca; İbrahim, Kocaman; Ahmet, Şirzai
2012-05-01
Cloud computing is a novel information technology (IT) concept involving facilitated and rapid access to networks, servers, data storage media, applications, and services via the Internet with minimal hardware requirements. Use of information systems and technologies on the battlefield is not new. Information superiority is a force multiplier and is crucial to mission success. Recent advances in information systems and technologies provide new means for decision makers and users to gain information superiority. These developments in information technologies have led to a new term, network-centric capability. Like network-centric capable systems, cloud computing systems are operational today. In the near future, extensive use of military clouds on the battlefield is predicted. Integrating cloud computing logic into network-centric applications will increase the flexibility, cost-effectiveness, efficiency, and accessibility of network-centric capabilities. In this paper, the cloud computing and network-centric capability concepts are defined. Some commercial cloud computing products and applications are mentioned. Network-centric capable applications are covered. Cloud computing supported battlefield applications are analyzed. The effects of cloud computing systems on network-centric capability and on the information domain in future warfare are discussed. Battlefield opportunities and novelties that cloud computing systems might introduce to network-centric capability are researched. The role of military clouds in future warfare is proposed in this paper. It is concluded that military clouds will be indispensable components of the future battlefield. Military clouds have the potential to improve network-centric capabilities, increase situational awareness on the battlefield, and facilitate the attainment of information superiority.
Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali
2017-06-01
Traffic accidents are one of the more important national and international issues, and their consequences are significant at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet and screened against the inclusion criteria; review of the literature continued until data saturation was reached, and a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency staff, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 75%. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, along with the analysis of these data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data, providing statistical analyses in table, chart, and zoning formats, managing ill-structured problems, determining the cost-effectiveness of decisions, and prioritizing their implementation were the most important GIS capabilities that can be efficient in the management of traffic accident information.
NASA Astrophysics Data System (ADS)
Del Pozzo, Walter
2012-08-01
The advanced worldwide network of gravitational-wave (GW) observatories is scheduled to begin operations within the current decade. Thanks to their improved sensitivity, they promise to yield a number of detections and thus to open new observational windows for astronomy and astrophysics. Among the scientific goals to be achieved is the independent measurement of the cosmological parameters, and hence an independent test of the current cosmological paradigm. Because of the importance of this task, a number of studies have evaluated the capabilities of GW telescopes in this respect. However, since GWs do not yield information about the source redshift, different groups have made different assumptions regarding the means through which the redshift can be obtained; these different assumptions imply different methodologies for solving the inference problem. This work presents a formalism based on Bayesian inference, developed to facilitate the inclusion of all assumptions and prior information about a GW source within a single data analysis framework. This approach guarantees the minimization of information loss and makes it possible to naturally include event-specific knowledge (such as the sky position for a gamma-ray burst/GW coincident observation) in the analysis. The workings of the method are applied to a specific example, loosely designed along the lines of the method proposed by Schutz in 1986, in which information from wide-field galaxy surveys serves as prior information for the location of a GW source. I show that combining the results from a few tens of observations from a network of advanced interferometers will constrain the Hubble constant H0 to an accuracy of ~4-5% at 95% confidence.
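A schematic of the kind of hierarchical posterior such a framework yields (illustrative notation, not the paper's exact formalism): the GW likelihood for each event is marginalized over redshift using a galaxy-survey prior.

```latex
% Schematic H_0 posterior from N GW events with galaxy-survey redshift priors.
p(H_0 \mid \{d_i\}) \;\propto\; p(H_0)\,
  \prod_{i=1}^{N} \int \mathrm{d}z \;
  p\!\left(d_i \mid D_L(z, H_0)\right)\, p_{\mathrm{gal}}(z)
```

Here $D_L(z, H_0)$ is the luminosity distance at redshift $z$ for a given $H_0$, and $p_{\mathrm{gal}}(z)$ is the redshift prior built from the survey.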
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; Henry, Michael J.; Burtner, IV, E. R.
The International Atomic Energy Agency (IAEA) is interested in increasing the capabilities of IAEA safeguards inspectors to access information that would improve their situational awareness on the job. A mobile information platform could potentially provide access to information, analytics, and technical and logistical support to inspectors in the field, as well as providing regular updates to analysts at IAEA Headquarters in Vienna or at satellite offices. To demonstrate the potential capability of such a system, Pacific Northwest National Laboratory (PNNL) implemented a number of example capabilities within a PNNL-developed precision information environment (PIE), using a tablet as a mobile information platform. PNNL's safeguards proof-of-concept PIE intends to: demonstrate novel applications of mobile information platforms to international safeguards use cases; demonstrate proof-of-principle capability implementation; and provide "vision" for capabilities that could be implemented. This report documents the lessons learned from this two-year development activity for the Precision Information Environment for International Safeguards (PIE-IS), describing the developed capabilities, technical challenges, and considerations for future development, so that developers working on a similar system for the IAEA or other safeguards agencies might benefit from our work.
NASA Astrophysics Data System (ADS)
Biswas, A.
2016-12-01
An efficient approach for estimating model parameters from the total gradient of gravity and magnetic data, based on Very Fast Simulated Annealing (VFSA), is presented. This is the first application of VFSA to the interpretation of the total gradient of potential field data, with a new formulation for isolated causative sources embedded in the subsurface. The model parameters interpreted here are the amplitude coefficient (k), the exact origin of the causative source (x0), the depth (z0), and the shape factor (q). The outcome of the VFSA optimization demonstrates that it can determine all model parameters very well when the shape factor is fixed. The model parameters estimated by the present method, particularly the shape and depth of the buried structures, were found to be in excellent agreement with the true parameters. The technique also has the capability of avoiding highly noisy data points, which improves the interpretation results. Histogram and cross-plot analyses likewise suggest that the interpretation lies within the estimated uncertainty. Inversion of noise-free and noisy synthetic data for single structures, as well as field data, demonstrates the viability of the methodology. The procedure has been carefully and successfully applied to real field cases (the Leona anomaly, Senegal, for gravity and the Pima copper deposit, USA, for magnetics) in the presence of mineral bodies. The present technique can be extremely useful for the exploration of minerals or ore bodies of dyke-like structure emplaced in the shallow and deeper subsurface. The computation time for the entire procedure is short.
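A compact sketch of a VFSA loop of the general kind described is given below. The forward model is a placeholder for the total-gradient response of an idealized body, and the bounds and tuning constants are illustrative, not the paper's formulation; the k, x0, z0, q notation follows the abstract.

```python
import numpy as np

def forward(x, k, x0, z0, q):
    # Hypothetical total-gradient response of an idealized source.
    return k / ((x - x0) ** 2 + z0 ** 2) ** q

def misfit(params, x, d_obs):
    return np.mean((d_obs - forward(x, *params)) ** 2)

def vfsa(x, d_obs, lo, hi, n_iter=5000, T0=1.0, c=1.0, seed=0):
    rng = np.random.default_rng(seed)
    p = rng.uniform(lo, hi)                 # random start within the bounds
    e = misfit(p, x, d_obs)
    best_p, best_e = p.copy(), e
    for it in range(1, n_iter + 1):
        T = T0 * np.exp(-c * it ** (1.0 / len(p)))          # VFSA cooling schedule
        u = rng.uniform(size=len(p))
        # Ingber-style temperature-dependent (Cauchy-like) move generation
        y = np.sign(u - 0.5) * T * ((1 + 1 / T) ** np.abs(2 * u - 1) - 1)
        trial = np.clip(p + y * (hi - lo), lo, hi)
        e_trial = misfit(trial, x, d_obs)
        if e_trial < e or rng.random() < np.exp(-(e_trial - e) / T):
            p, e = trial, e_trial                           # Metropolis acceptance
            if e < best_e:
                best_p, best_e = p.copy(), e
    return best_p, best_e

# Usage on synthetic data (true params: k=1200, x0=50, z0=12, q=1.0).
x = np.linspace(0.0, 100.0, 201)
d_obs = forward(x, 1200.0, 50.0, 12.0, 1.0)
lo = np.array([100.0, 0.0, 1.0, 1.0])
hi = np.array([5000.0, 100.0, 50.0, 1.0])   # equal q bounds fix the shape factor
print(vfsa(x, d_obs, lo, hi))
```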
James, Joseph; Murukeshan, Vadakke Matham; Woh, Lye Sun
2014-07-01
The structural and molecular heterogeneities of biological tissues demand interrogation of samples with multiple energy sources, providing visualization capabilities at varying spatial resolution and depth scales to obtain complementary diagnostic information. A novel multi-modal imaging approach that uses optical and acoustic energies to perform photoacoustic, ultrasound, and fluorescence imaging at multiple resolution scales, from the tissue surface and at depth, is proposed in this paper. The system comprises two distinct forms of hardware-level integration so as to form an integrated imaging system under a single instrumentation set-up. Experimental studies show that the system is capable of mapping high-resolution fluorescence signatures from the surface, and optical absorption and acoustic heterogeneities along the depth (>2 cm) of the tissue, at multi-scale resolution (<1 µm to <0.5 mm).
Combined mining: discovering informative knowledge in complex data.
Cao, Longbing; Zhang, Huaifeng; Zhao, Yanchang; Luo, Dan; Zhang, Chengqi
2011-06-01
Enterprise data mining applications often involve complex data such as multiple large heterogeneous data sources, user preferences, and business impact. In such situations, a single method or one-step mining is often limited in discovering informative knowledge. It would also be very time- and space-consuming, if not impossible, to join relevant large data sources for mining patterns consisting of multiple aspects of information. It is crucial to develop effective approaches for mining patterns that combine the necessary information from multiple relevant business lines, catering for real business settings and decision-making actions rather than just providing a single line of patterns. Recent years have seen increasing efforts on mining more informative patterns, e.g., integrating frequent pattern mining with classification to generate frequent-pattern-based classifiers. Rather than presenting a specific algorithm, this paper builds on our existing work and proposes combined mining as a general approach to mining for informative patterns that combine components from multiple data sets, multiple features, or multiple methods on demand. We summarize general frameworks, paradigms, and basic processes for multifeature combined mining, multisource combined mining, and multimethod combined mining. Novel types of combined patterns, such as incremental cluster patterns, can result from these frameworks and cannot be directly produced by existing methods. A set of real-world case studies has been conducted to test the frameworks, some of which are briefed in this paper. They identify combined patterns for informing government debt prevention and improving government service objectives, which shows the flexibility and instantiation capability of combined mining in discovering informative knowledge in complex data.
EO-1 analysis applicable to coastal characterization
NASA Astrophysics Data System (ADS)
Burke, Hsiao-hua K.; Misra, Bijoy; Hsu, Su May; Griffin, Michael K.; Upham, Carolyn; Farrar, Kris
2003-09-01
The EO-1 satellite is part of NASA's New Millennium Program (NMP). It carries three imaging sensors: the multi-spectral Advanced Land Imager (ALI), Hyperion, and the Atmospheric Corrector. Hyperion is a high-resolution hyperspectral imager capable of resolving 220 spectral bands (from 0.4 to 2.5 micron) at 30 m resolution. The instrument images a 7.5 km by 100 km land area per scene. Hyperion has been the only space-borne HSI data source since the launch of EO-1 in late 2000. The discussion begins with the unique capabilities of hyperspectral sensing for coastal characterization: (1) most ocean feature algorithms are semi-empirical retrievals, and HSI has all the spectral bands to provide legacy with previous sensors and to explore new information; (2) coastal features are more complex than those of the deep ocean, and their coupled effects are best resolved with HSI; and (3) with contiguous spectral coverage, atmospheric compensation can be done with more accuracy and confidence, especially since atmospheric aerosol effects are most pronounced in the visible region where coastal features lie. EO-1 data over Chesapeake Bay from 19 February 2002 are analyzed. In this presentation, it is first illustrated that hyperspectral data inherently provide more information for feature extraction than multispectral data, even though Hyperion has a lower SNR than ALI. Chlorophyll retrievals are also shown; the results compare favorably with data from other sources. The analysis illustrates the potential value of Hyperion (and HSI in general) data for coastal characterization. Future measurement requirements (airborne and space-borne) are also discussed.
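Semi-empirical ocean-color retrievals of the kind mentioned above typically take a band-ratio polynomial form. The sketch below shows that general form; the coefficients are invented placeholders, not those used in the EO-1 analysis.

```python
import numpy as np

# Hypothetical polynomial coefficients for illustration only.
a = [0.3, -2.9, 1.7, -0.6, -1.1]

def chlorophyll(Rrs_blue, Rrs_green):
    """Chlorophyll-a [mg m^-3] from a blue/green remote-sensing
    reflectance ratio via a 4th-order polynomial in log10 space."""
    x = np.log10(Rrs_blue / Rrs_green)
    log_chl = a[0] + a[1]*x + a[2]*x**2 + a[3]*x**3 + a[4]*x**4
    return 10.0 ** log_chl

print(chlorophyll(0.004, 0.006))   # lower blue/green ratio -> higher chlorophyll
```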
Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2014-01-01
The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user community. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains their own instance of Giovanni, exposing the variables of most interest to their user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.
NASA Technical Reports Server (NTRS)
Fletcher, Lauren E.; Aldridge, Ann M.; Wheelwright, Charles; Maida, James
1997-01-01
Task illumination has a major impact on human performance: What a person can perceive in his environment significantly affects his ability to perform tasks, especially in space's harsh environment. Training for lighting conditions in space has long depended on physical models and simulations to emulate the effect of lighting, but such tests are expensive and time-consuming. To evaluate lighting conditions not easily simulated on Earth, personnel at NASA Johnson Space Center's (JSC) Graphics Research and Analysis Facility (GRAF) have been developing computerized simulations of various illumination conditions using the ray-tracing program, Radiance, developed by Greg Ward at Lawrence Berkeley Laboratory. Because these computer simulations are only as accurate as the data used, accurate information about the reflectance properties of materials and light distributions is needed. JSC's Lighting Environment Test Facility (LETF) personnel gathered material reflectance properties for a large number of paints, metals, and cloths used in the Space Shuttle and Space Station programs, and processed these data into reflectance parameters needed for the computer simulations. They also gathered lamp distribution data for most of the light sources used, and validated the ability to accurately simulate lighting levels by comparing predictions with measurements for several ground-based tests. The result of this study is a database of material reflectance properties for a wide variety of materials, and lighting information for most of the standard light sources used in the Shuttle/Station programs. The combination of the Radiance program and GRAF's graphics capability forms a validated computerized lighting simulation capability for NASA.
Multidepth imaging by chromatic dispersion confocal microscopy
NASA Astrophysics Data System (ADS)
Olsovsky, Cory A.; Shelton, Ryan L.; Saldua, Meagan A.; Carrasco-Zevallos, Oscar; Applegate, Brian E.; Maitland, Kristen C.
2012-03-01
Confocal microscopy has shown potential as an imaging technique to detect precancer. Imaging cellular features throughout the depth of epithelial tissue may provide useful information for diagnosis. However, current in vivo axial scanning techniques for confocal microscopy are cumbersome, time-consuming, and restrictive when attempting to reconstruct volumetric images acquired in breathing patients. Chromatic dispersion confocal microscopy (CDCM) exploits severe longitudinal chromatic aberration in the system to axially disperse light from a broadband source and, ultimately, spectrally encode high-resolution images along the depth of the object. Hyperchromat lenses are designed to have severe and linear longitudinal chromatic aberration, but have not yet been used in confocal microscopy. We use a hyperchromat lens in a stage-scanning confocal microscope to demonstrate the capability to simultaneously capture information at multiple depths without mechanical scanning. A photonic crystal fiber pumped with an 830 nm wavelength Ti:Sapphire laser was used as a supercontinuum source, and a spectrometer was used as the detector. The chromatic aberration and magnification in the system give a focal shift of 140 μm after the objective lens and an axial resolution of 5.2-7.6 μm over the wavelength range from 585 nm to 830 nm. A 400 × 400 × 140 μm³ volume of pig cheek epithelium was imaged in a single X-Y scan. Nuclei can be seen at several depths within the epithelium. The capability of this technique to achieve simultaneous high-resolution confocal imaging at multiple depths may reduce imaging time and motion artifacts and enable volumetric reconstruction of in vivo confocal images of the epithelium.
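To make the spectral encoding concrete, the snippet below maps detected wavelength to imaged depth for the axial range quoted above (140 μm of focal shift across 585-830 nm). It assumes the chromatic focal shift is linear in wavelength, which real hyperchromat designs only approximate.

```python
LAMBDA_MIN, LAMBDA_MAX = 585.0, 830.0   # nm, spectral band from the abstract
DEPTH_RANGE = 140.0                     # um of focal shift over that band

def depth_um(wavelength_nm):
    """Depth encoded by a given wavelength, assuming a linear focal shift."""
    frac = (wavelength_nm - LAMBDA_MIN) / (LAMBDA_MAX - LAMBDA_MIN)
    return frac * DEPTH_RANGE

print(depth_um(707.5))   # mid-band wavelength -> ~70 um
```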
BBN technical memorandum W1291 infrasound model feasibility study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, T., BBN Systems and Technologies
1998-05-01
The purpose of this study is to determine the need for, and level of effort required to add, existing atmospheric databases and infrasound propagation models to the DOE's Hydroacoustic Coverage Assessment Model (HydroCAM) [1,2]. The rationale for the study is that the performance of the infrasound monitoring network will be an important factor for both the International Monitoring System (IMS) and US national monitoring capability. Many of the technical issues affecting the design and performance of the infrasound network are directly related to the variability of the atmosphere and the corresponding uncertainties in infrasound propagation. It is clear that the study of these issues will be enhanced by the availability of software tools for easy manipulation and interfacing of various atmospheric databases and infrasound propagation models. In addition, since there are many similarities between propagation in the oceans and in the atmosphere, it is anticipated that much of the software infrastructure developed for hydroacoustic database manipulation and propagation modeling in HydroCAM will be directly extendible to an infrasound capability. The study approach was to talk to the acknowledged domain experts in the infrasound monitoring area to determine: 1. The major technical issues affecting infrasound monitoring network performance. 2. The need for an atmospheric database/infrasound propagation modeling capability similar to HydroCAM. 3. The state of existing infrasound propagation codes and atmospheric databases. 4. A recommended approach for developing the required capabilities. A list of the people who contributed information to this study is provided in Table 1. We also relied on our knowledge of oceanographic and meteorological data sources to determine the availability of atmospheric databases and the feasibility of incorporating this information into the existing HydroCAM geographic database software. This report presents a summary of the need for an integrated infrasound modeling capability in Section 2.0. Section 3.0 provides a recommended approach for developing this capability in two stages: a basic capability and an extended capability. This section includes a discussion of the available static and dynamic databases, and the various modeling tools which are available or could be developed under such a task. The conclusions and recommendations of the study are provided in Section 4.0.
Materials management information systems.
1996-01-01
The hospital materials management function--ensuring that goods and services get from a source to an end user--encompasses many areas of the hospital and can significantly affect hospital costs. Performing this function in a manner that will keep costs down and ensure adequate cash flow requires effective management of a large amount of information from a variety of sources. To effectively coordinate such information, most hospitals have implemented some form of materials management information system (MMIS). These systems can be used to automate or facilitate functions such as purchasing, accounting, inventory management, and patient supply charges. In this study, we evaluated seven MMISs from seven vendors, focusing on the functional capabilities of each system and the quality of the service and support provided by the vendor. This Evaluation is intended to (1) assist hospitals purchasing an MMIS by educating materials managers about the capabilities, benefits, and limitations of MMISs and (2) educate clinical engineers and information system managers about the scope of materials management within a healthcare facility. Because software products cannot be evaluated in the same manner as most devices typically included in Health Devices Evaluations, our standard Evaluation protocol was not applicable for this technology. Instead, we based our ratings on our observations (e.g., during site visits), interviews we conducted with current users of each system, and information provided by the vendor (e.g., in response to a request for information [RFI]). We divided the Evaluation into the following sections: Section 1. Responsibilities and Information Requirements of Materials Management: Provides an overview of typical materials management functions and describes the capabilities, benefits, and limitations of MMISs. Also includes the supplementary article, "Inventory Cost and Reimbursement Issues" and the glossary, "Materials Management Terminology." Section 2. The MMIS Selection Process: Outlines steps to follow and describes factors to consider when selecting an MMIS. Also includes our Materials Management Process Evaluation and Needs Assessment Worksheet (which is also available online through ECRInet(TM)) and a list of suggested interview questions to be used when gathering user experience information for systems under consideration. Section 3A. MMIS Vendor Profiles: Presents information for the evaluated systems in a standardized, easy-to-compare format. Profiles include an Executive Summary describing our findings, a discussion of user comments, a listing of MMIS specifications, and information on the vendor's business background. Section 3B. Discussion of Vendor Profile Conclusions and Ratings: Presents our ratings and summarizes our rationale for all evaluated systems. Also includes a blank Vendor Profile Template to be used when gathering information on other vendors and systems. We found that, in general, all of the evaluated systems are able to meet most of the functional needs of a materials management department. However, we did uncover significant differences in the quality of service and support provided by each vendor, and our ratings reflect these differences: we rated two of the systems Acceptable--Preferred and four of the systems Acceptable. We have not yet rated the seventh system because our user experience information may not reflect the vendor's new ownership and management. 
When this vendor provides the references we requested, we will interview users and supply a rating. We caution readers against basing purchasing decisions solely on our ratings. Each hospital must consider the unique needs of its users and its overall strategic plans--a process that can be aided by using our Process Evaluation and Needs Assessment Worksheet. Our conclusions can then be used to narrow down the number of vendors under consideration...
A reliable sewage quality abnormal event monitoring system.
Li, Tianling; Winnel, Melissa; Lin, Hao; Panther, Jared; Liu, Chang; O'Halloran, Roger; Wang, Kewen; An, Taicheng; Wong, Po Keung; Zhang, Shanqing; Zhao, Huijun
2017-09-15
With the closing of the water loop through purified recycled water, wastewater becomes part of the source water, requiring a reliable wastewater quality monitoring system (WQMS) to manage the wastewater source and mitigate potential health risks. However, the development of reliable WQMSs is severely constrained by contamination and biofouling of sensors due to the hostile analytical environment of wastewaters, especially raw sewage, which challenges the limits of existing sensing technologies. In this work, we report a technological solution to enable the development of a WQMS for real-time abnormal event detection with high reliability and practicality. A vectored high-flow hydrodynamic self-cleaning approach and a dual-sensor self-diagnostic concept are adopted to effectively counter the critical sensor-failure issues caused by contamination and biofouling and to ensure the integrity of sensing data. The performance of the WQMS has been evaluated over a 3-year trial period at different sewage catchment sites across three Australian states. The trials demonstrated that the developed WQMS is capable of continuously operating in raw sewage for prolonged periods of up to 24 months without maintenance or failure, signifying high reliability and practicality. The demonstrated capability of the WQMS to reliably acquire real-time wastewater quality information is a leap forward in the development of effective wastewater source management systems. The reported self-cleaning and self-diagnostic concepts should be applicable to other online water quality monitoring systems, opening a new way to counter the common reliability and stability issues caused by sensor contamination and biofouling. Copyright © 2017 Elsevier Ltd. All rights reserved.
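A toy illustration of the dual-sensor self-diagnostic idea: two co-located sensors measure the same parameter, and a sustained divergence flags probable fouling or contamination. The thresholds below are invented for illustration, not taken from the paper.

```python
DIVERGENCE_LIMIT = 0.15    # relative disagreement tolerated between sensors
WINDOW = 5                 # consecutive divergent samples before alarming

def fouling_alarm(readings_a, readings_b):
    """Return True if paired sensor readings disagree persistently."""
    streak = 0
    for a, b in zip(readings_a, readings_b):
        rel_diff = abs(a - b) / max(abs(a), abs(b), 1e-9)
        streak = streak + 1 if rel_diff > DIVERGENCE_LIMIT else 0
        if streak >= WINDOW:
            return True     # persistent disagreement -> suspect fouling
    return False

# Sensor B drifts low mid-series, triggering the alarm.
print(fouling_alarm([7.1] * 10,
                    [7.0, 7.1, 6.9, 5.2, 5.0, 4.9, 4.8, 4.7, 7.0, 7.1]))
```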
Mwambete, Kennedy D; Mtaturu, Zephania
2006-09-01
In Tanzania, it is considered taboo for teachers and parents to talk with children about sexual matters, including sexually transmitted diseases (STDs), in schools and at home because of cultural and religious barriers. Political pressure also keeps sexual education, and thus education on STDs, out of classrooms. Generally, there is disagreement over STD education regarding what to teach, by whom, and to what extent. The aim was to assess knowledge of STDs, and attitudes towards sexual behavior and STDs, among secondary school students. This was a cross-sectional study using a semi-structured questionnaire. A sample size of 635 students was determined by simple random sampling. The majority of the students (98%) said they had heard about STDs; however, their knowledge of the symptoms associated with STDs was poor. Similarly, 147 (23%) students did not know of any means of STD transmission other than sexual intercourse. The proportion of students capable of identifying all tracer STDs was comparable between the ordinary (10.5%) and advanced (10.6%) level students (p < 0.001). Thirty-two students (8%) were completely unable to identify even a single tracer STD. About 96% of respondents said they were capable of preventing themselves from contracting STDs; however, 38% of them admitted that they were at risk of contracting STDs. The majority (99%) described more than one source of information on STDs; television and radio were the most commonly mentioned sources, whilst none of them cited parents as a source of information (p < 0.001). Regarding vulnerability to STDs, 503 (79%) students said female students were more vulnerable to STDs than males. The level of knowledge about STDs (ability to identify tracer STDs, to describe symptoms associated with STDs, and their mode of transmission) is poor relative to the students' levels of education. Female students are more vulnerable to STDs than their male counterparts. Mass media remain the most effective means of educating students about STDs.
High brightness electrodeless Z-Pinch EUV source for mask inspection tools
NASA Astrophysics Data System (ADS)
Horne, Stephen F.; Partlow, Matthew J.; Gustafson, Deborah S.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.
2012-03-01
Energetiq Technology has been shipping the EQ-10 Electrodeless Z-Pinch™ light source since 1995. The source is currently being used for metrology, mask inspection, and resist development. Energetiq's higher-brightness source has been selected as the source for pre-production actinic mask inspection tools. This improved source enables mask inspection tool suppliers to build prototype tools capable of defect detection and review down to 16 nm design rules. In this presentation we describe new source technology being developed at Energetiq to address the critical source brightness issue. The new technology will be shown to be capable of delivering brightness levels sufficient to meet the HVM requirements of AIMS and ABI, and potentially API, tools. The basis of the source technology is to use the stable pinch of the electrodeless light source to achieve a brightness of up to 100 W/mm²-sr. We will explain the source design concepts, discuss the expected performance, and present the modeling results for the new design.
KA-SB: from data integration to large scale reasoning
Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F
2009-01-01
Background The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high performance) reasoner (DBOWL). This information can then be further analyzed (by means of querying and reasoning). Results In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool uses a graphical query interface to build user queries easily, which shows a graphical representation of the ontology and allows users to build queries by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main-memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema, and integrates UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402
Equivalent source modeling of the main field using MAGSAT data
NASA Technical Reports Server (NTRS)
1980-01-01
The software was considerably enhanced to accommodate a more comprehensive examination of data available for field modeling using the equivalent source method by (1) implementing a dynamic core allocation capability in the software system for the automatic dimensioning of the normal matrix; (2) implementing a time-dependent model for the dipoles; (3) incorporating the capability to input specialized data formats in a fashion similar to models in spherical harmonics; and (4) implementing the optional ability to simultaneously estimate observatory anomaly biases where annual-means data are utilized. The time dependence capability was demonstrated by estimating a component model of 21 deg resolution using the 14-day MAGSAT data set of Goddard's MGST (12/80). The equivalent source model reproduced both the constant and the secular variation found in MGST (12/80).
Lee, Jaehoon; Hulse, Nathan C; Wood, Grant M; Oniki, Thomas A; Huff, Stanley M
2016-01-01
In this study we developed a Fast Healthcare Interoperability Resources (FHIR) profile to support exchanging full pedigree-based family health history (FHH) information across multiple systems and applications used by clinicians, patients, and researchers. We used previously developed clinical element models (CEMs) that are capable of representing FHH information, and derived essential data elements including attributes, constraints, and value sets. We analyzed gaps between the FHH CEM elements and existing FHIR resources. Based on the analysis, we developed a profile that consists of 1) FHIR resources for essential FHH data elements, 2) extensions for additional elements that were not covered by the resources, and 3) a structured definition to integrate patient and family member information in a FHIR message. We implemented the profile using an open-source FHIR framework and validated it using patient-entered FHH data that was captured through a locally developed FHH tool.
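For orientation, the snippet below builds the kind of FHIR resource such a profile constrains, a FamilyMemberHistory instance, as a plain Python dict. The extension URL and the specific codes are illustrative placeholders, not the profile's actual definitions.

```python
# Sketch of a FHIR FamilyMemberHistory resource; illustrative values only.
family_member_history = {
    "resourceType": "FamilyMemberHistory",
    "status": "completed",
    "patient": {"reference": "Patient/example"},
    "relationship": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/v3-RoleCode",
            "code": "MTH",
            "display": "mother",
        }]
    },
    "condition": [{
        "code": {
            "coding": [{
                "system": "http://snomed.info/sct",
                "code": "73211009",
                "display": "Diabetes mellitus",
            }]
        },
        "onsetAge": {"value": 52, "unit": "yr"},
    }],
    # Hypothetical extension of the sort the paper adds for elements not
    # covered by base resources (the URL is a made-up placeholder):
    "extension": [{
        "url": "http://example.org/fhir/StructureDefinition/pedigree-link",
        "valueString": "maternal-grandmother-of:Patient/example",
    }],
}
```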
Network Security via Biometric Recognition of Patterns of Gene Expression
NASA Technical Reports Server (NTRS)
Shaw, Harry C.
2016-01-01
Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT (Information Technology) organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time assays of gene expression products.
Smartphones and tablets: Reshaping radiation oncologists’ lives
Gomez-Iturriaga, Alfonso; Bilbao, Pedro; Casquero, Francisco; Cacicedo, Jon; Crook, Juanita
2012-01-01
Background Smartphones and tablets are handheld devices that are always connected to an information source and capable of providing instant updates; they allow doctors to access the most up-to-date information and provide decision support at the point of care. Aim The practice of radiation oncology has always been a discipline that relies on advanced technology, and smartphones provide substantial processing power, incorporating innovative user interfaces and applications. Materials and methods The most popular smartphone and tablet app stores were searched for "radiation oncology" and "oncology" related apps. A web search was also performed for smartphones, tablets, oncology, radiology, and radiation oncology. Results Smartphones and tablets allow rapid access to information in the form of podcasts, apps, protocols, reference texts, recent research, and more. Conclusion With the rapidly changing advances in radiation oncology, the trend toward accessing resources via smartphones and tablets will only increase; the future will show whether this technology improves clinical care. PMID:24669308
Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform
NASA Astrophysics Data System (ADS)
Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George
2016-08-01
In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed considering the requirements coming from relevant stakeholders. The pilot focuses on the integration of data coming from remote and social sensing in a Big Data platform. The information on land changes coming from the Copernicus Sentinel-1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed, parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed monitoring of social media and news agencies, as well as suitable mechanisms to detect and geo-annotate the related events.
Jia, Jia; Chen, Jhensi; Yao, Jun; Chu, Daping
2017-01-01
A high quality 3D display requires a high amount of optical information throughput, which needs an appropriate mechanism to distribute information in space uniformly and efficiently. This study proposes a front-viewing system which is capable of managing the required amount of information efficiently from a high bandwidth source and projecting 3D images with a decent size and a large viewing angle at video rate in full colour. It employs variable gratings to support a high bandwidth distribution. This concept is scalable and the system can be made compact in size. A horizontal parallax only (HPO) proof-of-concept system is demonstrated by projecting holographic images from a digital micro mirror device (DMD) through rotational tiled gratings before they are realised on a vertical diffuser for front-viewing. PMID:28304371
Handling of huge multispectral image data volumes from a spectral hole burning device (SHBD)
NASA Astrophysics Data System (ADS)
Graff, Werner; Rosselet, Armel C.; Wild, Urs P.; Gschwind, Rudolf; Keller, Christoph U.
1995-06-01
We use chlorin-doped polymer films at low temperatures as the primary imaging detector. Based on the principles of persistent spectral hole burning, this system is capable of storing spatial and spectral information simultaneously in one exposure with extremely high resolution. The sun, as an extended light source, has been imaged onto the film. The information recorded amounts to tens of GBytes. This data volume is read out by scanning the frequency of a tunable dye laser and reading the images with a digital CCD camera. For acquisition, archival, processing, and visualization, we use MUSIC (MUlti processor System with Intelligent Communication), a single-instruction multiple-data parallel processor system equipped with the necessary I/O facilities. The huge amount of data requires the development of sophisticated algorithms to efficiently calibrate the data and to extract useful and new information for solar physics.
Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.
The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.
Retrieval of biophysical parameters with AVIRIS and ISM: The Landes Forest, south west France
NASA Technical Reports Server (NTRS)
Zagolski, F.; Gastellu-Etchegorry, J. P.; Mougin, E.; Giordano, G.; Marty, G.; Letoan, T.; Beaudoin, A.
1992-01-01
The first steps of an experiment investigating the capability of airborne spectrometer data for the retrieval of biophysical parameters of vegetation, especially water conditions, are presented. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and ISM data were acquired in the framework of the 1991 NASA/JPL and CNES campaigns on the Landes, southwest France, a large and flat forest area with mainly maritime pines. In-situ measurements were completed at that time, i.e., reflectance spectra, atmospheric profiles, and sampling for further laboratory analyses of element concentrations (lignin, water, cellulose, nitrogen, ...). All information was integrated in an already existing database (age, LAI, DBH, understory cover, ...). A methodology was designed for (1) obtaining geometrically and atmospherically corrected reflectance data, (2) registering all available information, and (3) analyzing this multi-source information. Our objective is to conduct comparative studies with reflectance simulation models, and to improve these models, especially in the MIR.
NASA Astrophysics Data System (ADS)
Lee, Sam; Lucas, Nathan P.; Ellis, R. Darin; Pandya, Abhilash
2012-06-01
This paper presents a seamlessly controlled human multi-robot system comprised of semiautonomous ground and aerial robots for source localization tasks. The system combines augmented reality interface capabilities with a human supervisor's ability to control multiple robots. The role of this human multi-robot interface is to allow an operator to control groups of heterogeneous robots in real time in a collaborative manner. It uses advanced path planning algorithms to ensure that obstacles are avoided and that the operators are free for higher-level tasks. Each robot knows the environment and obstacles and can automatically generate a collision-free path to any user-selected target. Sensor information from each individual robot is displayed directly on the robot in the video view. In addition, a sensor-data-fused AR view is displayed, which helps the users pinpoint source information or supports the operator with the goals of the mission. The paper presents a preliminary human factors evaluation of this system in which several interface conditions are tested for source detection tasks. Results show that the novel augmented reality multi-robot control modes (Point-and-Go and Path Planning) reduced mission completion times compared to traditional joystick control for target detection missions. Usability tests and operator workload analysis are also investigated.
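The abstract does not specify the planner beyond its ability to generate collision-free paths to a user-selected target; the following minimal sketch shows the idea with breadth-first search on an occupancy grid, where the grid representation and 4-connectivity are assumptions made for illustration.

    from collections import deque

    def plan_path(grid, start, goal):
        """Breadth-first search for a collision-free path on an occupancy grid.

        grid[r][c] == 1 marks an obstacle. Returns a list of cells from
        start to goal, or None if the goal is unreachable.
        """
        rows, cols = len(grid), len(grid[0])
        parent = {start: None}
        queue = deque([start])
        while queue:
            cell = queue.popleft()
            if cell == goal:
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = parent[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols \
                        and grid[nr][nc] == 0 and (nr, nc) not in parent:
                    parent[(nr, nc)] = (r, c)
                    queue.append((nr, nc))
        return None

    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0],
            [0, 1, 1, 0]]
    print(plan_path(grid, (0, 0), (3, 3)))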
NASA Astrophysics Data System (ADS)
Cheung, Derek
2015-02-01
For students to be successful in school chemistry, a strong sense of self-efficacy is essential. Chemistry self-efficacy can be defined as students' beliefs about the extent to which they are capable of performing specific chemistry tasks. According to Bandura (Psychol. Rev. 84:191-215, 1977), students acquire information about their level of self-efficacy from four sources: performance accomplishments, vicarious experiences, verbal persuasion, and physiological states. No published studies have investigated how instructional strategies in chemistry lessons can provide students with positive experiences with these four sources of self-efficacy information and how the instructional strategies promote students' chemistry self-efficacy. In this study, questionnaire items were constructed to measure student perceptions about instructional strategies, termed efficacy-enhancing teaching, which can provide positive experiences with the four sources of self-efficacy information. Structural equation modeling was then applied to test a hypothesized mediation model, positing that efficacy-enhancing teaching positively affects students' chemistry self-efficacy through their use of deep learning strategies such as metacognitive control strategies. A total of 590 chemistry students at nine secondary schools in Hong Kong participated in the survey. The mediation model provided a good fit to the student data. Efficacy-enhancing teaching had a direct effect on students' chemistry self-efficacy. Efficacy-enhancing teaching also directly affected students' use of deep learning strategies, which in turn affected students' chemistry self-efficacy. The implications of these findings for developing secondary school students' chemistry self-efficacy are discussed.
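The mediation logic being tested (teaching practices affect self-efficacy both directly and through deep learning strategies) can be sketched with two ordinary regressions on simulated data; the variable names and effect sizes below are illustrative, and the study's actual structural equation model is richer than this.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 590  # same sample size as the survey
    teaching = rng.normal(size=n)                      # efficacy-enhancing teaching (X)
    deep = 0.5 * teaching + rng.normal(size=n)         # deep learning strategies (M)
    efficacy = 0.4 * deep + 0.3 * teaching + rng.normal(size=n)  # self-efficacy (Y)

    def ols(y, predictors):
        """Least-squares coefficients with an intercept column prepended."""
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    a = ols(deep, [teaching])[1]                       # X -> M path
    b, c_prime = ols(efficacy, [deep, teaching])[1:]   # M -> Y and direct X -> Y paths
    print(f"direct effect:   {c_prime:.3f}")
    print(f"indirect effect: {a * b:.3f}")             # mediated via deep learning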
Selecting a provider: what factors influence patients' decision making?
Abraham, Jean; Sick, Brian; Anderson, Joseph; Berg, Andrea; Dehmer, Chad; Tufano, Amanda
2011-01-01
Each year consumers make a variety of decisions relating to their healthcare. Some experts argue that stronger consumer engagement in decisions about where to obtain medical care is an important mechanism for improving efficiency in healthcare delivery and financing. Consumers' ability and motivation to become more active decision makers are affected by several factors, including financial incentives and access to information. This study investigates the set of factors that consumers consider when selecting a provider, including attributes of the provider and the care experience and the reputation of the provider. Additionally, the study evaluates consumers awareness and use of formal sources of provider selection information. Our results from analyzing data from a survey of 467 patients at four clinics in Minnesota suggest that the factors considered of greatest importance include reputation of the physician and reputation of the healthcare organization. Contractual and logistical factors also play a role, with respondents highlighting the importance of seeing a provider affiliated with their health plan and appointment availability. Few respondents indicated that advertisements or formal sources of quality information affected their decision making. The key implication for provider organizations is to carefully manage referral sources to ensure that they consistently meet the needs of referrers. Excellent service to existing patients and to the network of referring physicians yields patient and referrer satisfaction that is critical to attracting new patients. Finally, organizations more generally may want to explore the capabilities of new media and social networking sites for building reputation.
Multi-source and ontology-based retrieval engine for maize mutant phenotypes
Green, Jason M.; Harnsomburana, Jaturon; Schaeffer, Mary L.; Lawrence, Carolyn J.; Shyu, Chi-Ren
2011-01-01
Model Organism Databases, including the various plant genome databases, collect and enable access to massive amounts of heterogeneous information, including sequence data, gene product information, images of mutant phenotypes, etc., as well as textual descriptions of many of these entities. While a variety of basic browsing and search capabilities are available to allow researchers to query and peruse the names and attributes of phenotypic data, next-generation search mechanisms that allow querying and ranking of text descriptions are much less common. In addition, the plant community needs an innovative way to leverage the existing links in these databases to search groups of text descriptions simultaneously. Furthermore, though much time and effort have been afforded to the development of plant-related ontologies, the knowledge embedded in these ontologies remains largely unused in available plant search mechanisms. Addressing these issues, we have developed a unique search engine for mutant phenotypes from MaizeGDB. This advanced search mechanism integrates various text description sources in MaizeGDB to aid a user in retrieving desired mutant phenotype information. Currently, descriptions of mutant phenotypes, loci and gene products are utilized collectively for each search, though expansion of the search mechanism to include other sources is straightforward. The retrieval engine, to our knowledge, is the first engine to exploit the content and structure of available domain ontologies, currently the Plant and Gene Ontologies, to expand and enrich retrieval results in major plant genomic databases. Database URL: http://www.PhenomicsWorld.org/QBTA.php PMID:21558151
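Ontology-driven query expansion of the kind described can be sketched as a walk down an is-a hierarchy; the toy ontology below is a made-up fragment for illustration, not actual Plant Ontology content.

    def expand_query(terms, ontology):
        """Expand query terms with their ontology descendants.

        `ontology` maps a term to its direct children (an is-a hierarchy);
        the entries below are toy examples, not real ontology terms.
        """
        expanded = set()
        stack = list(terms)
        while stack:
            term = stack.pop()
            if term not in expanded:
                expanded.add(term)
                stack.extend(ontology.get(term, []))
        return expanded

    ontology = {
        "leaf phenotype": ["leaf color phenotype", "leaf shape phenotype"],
        "leaf color phenotype": ["pale green leaf", "yellow striped leaf"],
    }
    print(expand_query(["leaf phenotype"], ontology))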
NASA Astrophysics Data System (ADS)
de Boer, Maaike H. T.; Bouma, Henri; Kruithof, Maarten C.; ter Haar, Frank B.; Fischer, Noëlle M.; Hagendoorn, Laurens K.; Joosten, Bart; Raaijmakers, Stephan
2017-10-01
The information available on-line and off-line, from open as well as from private sources, is growing at an exponential rate and places an increasing demand on the limited resources of Law Enforcement Agencies (LEAs). The absence of appropriate tools and techniques to collect, process, and analyze the volumes of complex and heterogeneous data has created a severe information overload. If a solution is not found, the impact on law enforcement will be dramatic, e.g. because important evidence is missed or the investigation time is too long. Furthermore, there is an uneven level of capabilities to deal with the large volumes of complex and heterogeneous data that come from multiple open and private sources at national level across the EU, which hinders cooperation and information sharing. Consequently, there is a pertinent need to develop tools, systems and processes which expedite online investigations. In this paper, we describe a suite of analysis tools to identify and localize generic concepts, instances of objects and logos in images, which constitutes a significant portion of everyday law enforcement data. We describe how incremental learning based on only a few examples and large-scale indexing are addressed in both concept detection and instance search. Our search technology allows querying of the database by visual examples and by keywords. Our tools are packaged in a Docker container to guarantee easy deployment on a system and our tools exploit possibilities provided by open source toolboxes, contributing to the technical autonomy of LEAs.
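Querying an index by visual example typically reduces to nearest-neighbor search over image embeddings; the following sketch shows that step with cosine similarity over random vectors, where the embedding dimensionality and the brute-force index are assumptions standing in for whatever descriptors and large-scale index the tools actually use.

    import numpy as np

    rng = np.random.default_rng(2)
    db = rng.normal(size=(10_000, 512)).astype(np.float32)  # indexed image embeddings
    db /= np.linalg.norm(db, axis=1, keepdims=True)          # L2-normalize once

    def search(query_vec, top_k=5):
        """Return indices of the top_k most similar database images.

        With normalized vectors the dot product equals cosine similarity,
        so one matrix-vector product scores the whole index.
        """
        q = query_vec / np.linalg.norm(query_vec)
        scores = db @ q
        best = np.argpartition(-scores, top_k)[:top_k]
        return best[np.argsort(-scores[best])]

    query = rng.normal(size=512).astype(np.float32)  # embedding of a query image
    print(search(query))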
Multiband Study of Radio Sources of the Rcr Catalogue with Virtual Observatory Tools
NASA Astrophysics Data System (ADS)
Zhelenkova, O. P.; Soboleva, N. S.; Majorova, E. K.; Temirova, A. V.
We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the "Cold" survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable optical candidate in optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS, and 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, VizieR, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.
H i Absorption in the Steep-Spectrum Superluminal Quasar 3C 216.
Pihlström; Vermeulen; Taylor; Conway
1999-11-01
The search for H i absorption in strong compact steep-spectrum sources is a natural way to probe the neutral gas contents in young radio sources. In turn, this may provide information about the evolution of powerful radio sources. The recently improved capabilities of the Westerbork Synthesis Radio Telescope have made it possible to detect a 0.31% (19 mJy) deep neutral atomic hydrogen absorption line associated with the steep-spectrum superluminal quasar 3C 216. The redshift (z=0.67) of the source shifts the frequency of the 21 cm line down to the ultra-high-frequency (UHF) band (850 MHz). The exact location of the H i-absorbing gas remains to be determined by spectral line VLBI observations at 850 MHz. We cannot exclude that the gas might be extended on galactic scales, but we think it is more likely to be located in the central kiloparsec. Constraints from the lack of X-ray absorption probably rule out obscuration of the core region, and we argue that the most plausible site for the H i absorption is in the jet-cloud interaction observed in this source.
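Two numbers in the abstract can be checked directly: the observed frequency of the redshifted 21 cm line and the continuum flux density implied by a 19 mJy line that is 0.31% deep. A short computation using only values stated above:

    NU_HI_REST_MHZ = 1420.405751  # rest frequency of the 21 cm H I line

    z = 0.67
    nu_obs = NU_HI_REST_MHZ / (1.0 + z)
    print(f"observed H I frequency: {nu_obs:.1f} MHz")  # ~850 MHz, in the UHF band

    line_depth_mjy = 19.0
    fractional_depth = 0.0031
    continuum_mjy = line_depth_mjy / fractional_depth
    print(f"implied continuum flux density: {continuum_mjy / 1000:.1f} Jy")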
Transmission x-ray microscopy at Diamond-Manchester I13 Imaging Branchline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vila-Comamala, Joan, E-mail: joan.vila.comamala@gmail.com; Wagner, Ulrich; Bodey, Andrew J.
2016-01-28
Full-field Transmission X-ray Microscopy (TXM) has been shown to be a powerful method for obtaining quantitative internal structural and chemical information from materials at the nanoscale. The installation of a Full-field TXM station will extend the current microtomographic capabilities of the Diamond-Manchester I13 Imaging Branchline at Diamond Light Source (UK) into the sub-100 nm spatial resolution range using photon energies from 8 to 14 keV. The dedicated Full-field TXM station will be built in-house with contributions from Diamond Light Source support divisions and via collaboration with the X-ray Optics Group of the Paul Scherrer Institut (Switzerland), which will develop state-of-the-art diffractive X-ray optical elements. Preliminary results of the I13 Full-field TXM station are shown. The Full-field TXM will become an important Diamond Light Source direct imaging asset for material science, energy science and biology at the nanoscale.
Memory enhancement by a semantically unrelated emotional arousal source induced after learning.
Nielson, Kristy A; Yee, Douglas; Erickson, Kirk I
2005-07-01
It has been well established that moderate physiological or emotional arousal modulates memory. However, there is some controversy about whether the source of arousal must be semantically related to the information to be remembered. To test this idea, 35 healthy young adult participants learned a list of common nouns and afterward viewed a semantically unrelated, neutral or emotionally arousing videotape. The tape was shown after learning to prevent arousal effects on encoding or attention, so that it instead influenced memory consolidation. Heart rate increase was significantly greater in the arousal group, and the non-arousal group reported significantly less negative affect after the video. The arousal group remembered significantly more words than the non-arousal group at both 30 min and 24 h delays, despite comparable group memory performance prior to the arousal manipulation. These results demonstrate that emotional arousal, even from an unrelated source, is capable of modulating memory consolidation. Potential reasons for contradictory findings in some previous studies, such as the timing of "delayed" memory tests, are discussed.
Protocol for buffer space negotiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nessett, D.
There are at least two ways to manage the buffer memory of a communications node. One technique views the buffer as a single resource that is to be reserved and released as a unit for a particular communication transaction. A more common approach treats the node's buffer space as a collection of resources (e.g., bytes, words, packet slots) capable of being allocated among multiple concurrent conversations. To achieve buffer space multiplexing, some sort of negotiation for buffer space must take place between source and sink nodes before a transaction can commence. Results are presented which indicate that, for an application involving a CSMA broadcast network, buffer space multiplexing offers better performance than buffer reservation. To achieve this improvement, a simple protocol is presented that features flow-control information traveling both from source to sink as well as from sink to source. It is argued that this bidirectionality allows the sink to allocate buffer space among its active communication paths more effectively. 13 figures.
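The report describes the protocol only in outline; the following minimal sketch, under the assumption of a credit-based scheme, shows the essential negotiation: the sink grants buffer credits from a shared pool, sources transmit only against credit, and credit flows back as buffers drain.

    class Sink:
        """Sink node multiplexing one buffer pool across conversations."""
        def __init__(self, pool_bytes):
            self.free = pool_bytes

        def grant_credit(self, requested):
            """Sink -> source: allocate up to `requested` bytes of buffer."""
            granted = min(requested, self.free)
            self.free -= granted
            return granted

        def consume(self, nbytes):
            """Buffered data drained by the application; credit is returned."""
            self.free += nbytes

    class Source:
        """Source node that only transmits against previously granted credit."""
        def __init__(self, sink):
            self.sink = sink
            self.credit = 0

        def send(self, nbytes):
            if self.credit < nbytes:  # negotiate before sending
                self.credit += self.sink.grant_credit(nbytes - self.credit)
            sent = min(nbytes, self.credit)
            self.credit -= sent
            return sent               # bytes actually transmitted

    sink = Sink(pool_bytes=4096)
    a, b = Source(sink), Source(sink)   # two multiplexed conversations
    print(a.send(3000), b.send(3000))   # the second sender is throttled
    sink.consume(3000)                  # drained buffers restore credit
    print(b.send(1904))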
NASA Astrophysics Data System (ADS)
Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.
2018-01-01
The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in this regard. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the obtained results show positive outcomes in hyperspectral analysis and visualization compared to previous literature, we suggest using the PlanetServer approach for such investigations.
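Mineral characterization in hyperspectral cubes is commonly done with band-depth parameters; the generic numpy sketch below computes one such parameter, with the band positions, the random cube, and the function itself being illustrative placeholders rather than PlanetServer's actual API.

    import numpy as np

    def band_depth(cube, wavelengths, left, center, right):
        """Relative absorption-band depth for each pixel of a spectral cube.

        `cube` has shape (rows, cols, bands); the continuum is interpolated
        between the `left` and `right` shoulder wavelengths (in micrometers).
        The band positions here are generic placeholders, not CRISM products.
        """
        idx = lambda w: int(np.argmin(np.abs(wavelengths - w)))
        l, c, r = idx(left), idx(center), idx(right)
        frac = (wavelengths[c] - wavelengths[l]) / (wavelengths[r] - wavelengths[l])
        continuum = (1 - frac) * cube[..., l] + frac * cube[..., r]
        return 1.0 - cube[..., c] / continuum

    wavelengths = np.linspace(1.0, 2.6, 240)  # microns
    cube = np.random.default_rng(3).uniform(0.1, 0.3, size=(50, 50, 240))
    d2300 = band_depth(cube, wavelengths, 2.14, 2.30, 2.36)  # Fe/Mg-OH feature (e.g. chlorite)
    print(d2300.shape, float(d2300.mean()))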
Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lockhart, Madeline Louise; McMath, Garrett Earl
2017-10-26
Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP6 1.2 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.
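The nominal-content estimate the abstract criticizes, and the simulation check it proposes, both reduce to simple arithmetic, sketched below; every number is a hypothetical placeholder, not a value from the report.

    measured_yield_n_per_s = 1.2e6        # hypothetical assay result for one source
    specific_yield_n_per_s_per_g = 6.0e4  # placeholder specific yield of pure 239Pu in Be

    # Nominal content: divide the measured yield by the pure-239Pu specific yield.
    nominal_pu_g = measured_yield_n_per_s / specific_yield_n_per_s_per_g
    print(f"nominal Pu content: {nominal_pu_g:.1f} g")

    # The MCNP-style check runs the declared source model and compares yields.
    simulated_yield = 1.05e6              # hypothetical MCNP6 (alpha,n) tally result
    ratio = measured_yield_n_per_s / simulated_yield
    print(f"measured/simulated yield ratio: {ratio:.2f} (close to 1 supports the declared content)")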
Smith, Kevin A; Athey, Brian D; Chahal, Amar P S; Sahai, Priti
2008-11-06
Velos eResearch is a commercially developed, regulatory-compliant, web-based clinical research information system from Velos Inc. Aithent Inc. is a software development services company. The University of Michigan (UM) has public/private partnerships with Velos and Aithent to collaborate on the development of additional capabilities, modules, and new products to better support the needs of clinical and translational research communities. These partnerships provide UM with a mechanism for obtaining high-quality, functionally comprehensive capabilities more quickly and at lower cost, while the corporate partners get a quality advisory and development partner; this benefits all parties. The UM chose to partner with Velos in part because of its commitment to interoperability. Velos is an active participant in the NCI caBIG project and is committed to caBIG compatibility. Velos already provides interoperability with other Velos sites in the CTSA context. One example of the partnership is the co-development of integrated specimen management capabilities. UM spent more than a year defining business requirements and technical specifications for, and is funding development of, this capability. UM also facilitates an autonomous user community (20+ institutions, 7 CTSA awardees); the broad goals of the group are to share experiences and expertise, identify collaborative opportunities, and support one another, as well as to provide a source of future needs identification to Velos. Advantages and risks related to delivering informatics capabilities to an AHC research community through a public/private partnership will be presented. The UM, Velos and Aithent will discuss frameworks, agreements and other factors that have contributed to a successful partnership.
NASA Technical Reports Server (NTRS)
Usher, P. D.
1971-01-01
The almucantar radio telescope development and characteristics are presented. The radio telescope consists of a paraboloidal reflector free to rotate in azimuth but limited in altitude between two fixed angles from the zenith. The fixed angles are chosen so that sources lying between two small circles parallel to the horizon (almucantars) are accessible at any one instant. Basic geometrical considerations in the almucantar design are presented. The capabilities of the almucantar telescope for source counting and for monitoring, which are essential to a resolution of the cosmological problem, are described.
Strategies for satellite-based monitoring of CO2 from distributed area and point sources
NASA Astrophysics Data System (ADS)
Schwandner, Florian M.; Miller, Charles E.; Duren, Riley M.; Natraj, Vijay; Eldering, Annmarie; Gunson, Michael R.; Crisp, David
2014-05-01
Atmospheric CO2 budgets are controlled by the strengths, as well as the spatial and temporal variabilities of CO2 sources and sinks. Natural CO2 sources and sinks are dominated by the vast areas of the oceans and the terrestrial biosphere. In contrast, anthropogenic and geogenic CO2 sources are dominated by distributed area and point sources, which may constitute as much as 70% of anthropogenic (e.g., Duren & Miller, 2012), and over 80% of geogenic emissions (Burton et al., 2013). Comprehensive assessments of CO2 budgets necessitate robust and highly accurate satellite remote sensing strategies that address the competing and often conflicting requirements for sampling over disparate space and time scales. Spatial variability: The spatial distribution of anthropogenic sources is dominated by patterns of production, storage, transport and use. In contrast, geogenic variability is almost entirely controlled by endogenic geological processes, except where surface gas permeability is modulated by soil moisture. Satellite remote sensing solutions will thus have to vary greatly in spatial coverage and resolution to address distributed area sources and point sources alike. Temporal variability: While biogenic sources are dominated by diurnal and seasonal patterns, anthropogenic sources fluctuate over a greater variety of time scales from diurnal, weekly and seasonal cycles, driven by both economic and climatic factors. Geogenic sources typically vary in time scales of days to months (geogenic sources sensu stricto are not fossil fuels but volcanoes, hydrothermal and metamorphic sources). Current ground-based monitoring networks for anthropogenic and geogenic sources record data on minute- to weekly temporal scales. Satellite remote sensing solutions would have to capture temporal variability through revisit frequency or point-and-stare strategies. Space-based remote sensing offers the potential of global coverage by a single sensor. However, no single combination of orbit and sensor provides the full range of temporal sampling needed to characterize distributed area and point source emissions. For instance, point source emission patterns will vary with source strength, wind speed and direction. Because wind speed, direction and other environmental factors change rapidly, short term variabilities should be sampled. For detailed target selection and pointing verification, important lessons have already been learned and strategies devised during JAXA's GOSAT mission (Schwandner et al, 2013). The fact that competing spatial and temporal requirements drive satellite remote sensing sampling strategies dictates a systematic, multi-factor consideration of potential solutions. Factors to consider include vista, revisit frequency, integration times, spatial resolution, and spatial coverage. No single satellite-based remote sensing solution can address this problem for all scales. It is therefore of paramount importance for the international community to develop and maintain a constellation of atmospheric CO2 monitoring satellites that complement each other in their temporal and spatial observation capabilities: Polar sun-synchronous orbits (fixed local solar time, no diurnal information) with agile pointing allow global sampling of known distributed area and point sources like megacities, power plants and volcanoes with daily to weekly temporal revisits and moderate to high spatial resolution. 
Extensive targeting of distributed area and point sources comes at the expense of reduced mapping or spatial coverage, and the important contextual information that comes with large-scale contiguous spatial sampling. Polar sun-synchronous orbits with push-broom swath-mapping but limited pointing agility may allow mapping of individual source plumes and their spatial variability, but will depend on fortuitous environmental conditions during the observing period. These solutions typically have longer times between revisits, limiting their ability to resolve temporal variations. Geostationary and non-sun-synchronous low-Earth-orbits (precessing local solar time, diurnal information possible) with agile pointing have the potential to provide comprehensive mapping of distributed area sources such as megacities with longer stare times and multiple revisits per day, at the expense of global access and spatial coverage. An ad hoc CO2 remote sensing constellation is emerging. NASA's OCO-2 satellite (launch July 2014) joins JAXA's GOSAT satellite in orbit. These will be followed by GOSAT-2 and NASA's OCO-3 on the International Space Station as early as 2017. Additional polar orbiting satellites (e.g., CarbonSat, under consideration at ESA) and geostationary platforms may also become available. However, the individual assets have been designed with independent science goals and requirements, and limited consideration of coordinated observing strategies. Every effort must be made to maximize the science return from this constellation. We discuss the opportunities to exploit the complementary spatial and temporal coverage provided by these assets as well as the crucial gaps in the capabilities of this constellation. References Burton, M.R., Sawyer, G.M., and Granieri, D. (2013). Deep carbon emissions from volcanoes. Rev. Mineral. Geochem. 75: 323-354. Duren, R.M., Miller, C.E. (2012). Measuring the carbon emissions of megacities. Nature Climate Change 2, 560-562. Schwandner, F.M., Oda, T., Duren, R., Carn, S.A., Maksyutov, S., Crisp, D., Miller, C.E. (2013). Scientific Opportunities from Target-Mode Capabilities of GOSAT-2. NASA Jet Propulsion Laboratory, California Institute of Technology, Pasadena CA, White Paper, 6p., March 2013.
Integrating digital information for coastal and marine sciences
Marincioni, Fausto; Lightsom, Frances L.; Riall, Rebecca L.; Linck, Guthrie A.; Aldrich, Thomas C.; Caruso, Michael J.
2004-01-01
A pilot distributed geolibrary, the Marine Realms Information Bank (MRIB), was developed by the U.S. Geological Survey Coastal and Marine Geology Program and the Woods Hole Oceanographic Institution, to classify, integrate, and facilitate access to scientific information about oceans, coasts, and lakes. The MRIB is composed of a categorization scheme, a metadata database, and a specialized software backend, capable of drawing together information from remote sources without modifying their original format or content. Twelve facets are used to classify information: location, geologic time, feature type, biota, discipline, research method, hot topics, project, agency, author, content type, and file type. The MRIB approach allows easy and flexible organization of large or growing document collections for which centralized repositories would be impractical. Geographic searching based on the gazetteer and map interface is the centerpiece of the MRIB distributed geolibrary. The MRIB is one of a very few digital libraries that employ georeferencing -- a fundamentally different way to structure information from the traditional author/title/subject/keyword approach employed by most digital libraries. Lessons learned in developing the MRIB will be useful as other digital libraries confront the challenges of georeferencing.
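Faceted retrieval of the kind MRIB offers can be sketched as set intersection over per-facet postings lists; the records and facet values below are invented for illustration.

    from collections import defaultdict

    records = [
        {"id": 1, "location": "Gulf of Maine", "discipline": "geology", "feature type": "estuary"},
        {"id": 2, "location": "Gulf of Maine", "discipline": "biology", "feature type": "salt marsh"},
        {"id": 3, "location": "Lake Michigan", "discipline": "geology", "feature type": "shoreline"},
    ]

    # Build one postings list per (facet, value) pair.
    index = defaultdict(set)
    for rec in records:
        for facet, value in rec.items():
            if facet != "id":
                index[(facet, value)].add(rec["id"])

    def faceted_search(**facets):
        """Intersect postings lists, one per requested facet value."""
        sets = [index[(f.replace("_", " "), v)] for f, v in facets.items()]
        return set.intersection(*sets) if sets else set()

    print(faceted_search(location="Gulf of Maine", discipline="geology"))  # {1}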
Training Delivery Methods as Source of Dynamic Capabilities: The Case of Sports' Organisations
ERIC Educational Resources Information Center
Arraya, Marco António Mexia; Porfírio, Jose António
2017-01-01
Purpose: Training as an important source of dynamic capabilities (DC) is important to the performance of sports' organisations (SO) both to athletes and to non-athletic staff. There are a variety of training delivery methods (TDMs). The purpose of this study is to determine from a set of six TDMs which one is considered to be the most suitable to…
Introducing Earthdata 3.0: An All-New Way of Creating and Publishing Content
NASA Astrophysics Data System (ADS)
Bagwell, R.; Wong, M. M.; Siarto, J.; Reese, M.; Berrick, S. W.
2015-12-01
Since the launch of the National Aeronautics and Space Administration (NASA) Earthdata website (https://earthdata.nasa.gov) in late 2011, there has been an emphasis on improving the user experience and providing more enriched content to the user, ultimately aiming to bring the "pixels to the people", that is, to ensure that a user clicks as few times as possible to get to the data, tools, or information they seek. NASA Earthdata was founded as a single source of information, consolidating more than 20 different websites. With an increased focus on access to Earth science data, the emphasis is now on transforming Earthdata from a static website into a dynamic, data-driven site full of enriched content. This poster will present the process of utilizing a custom-built Content Management System (CMS) called "Conduit" to manage and publish content into the new Earthdata website, with examples of the various components of the CMS, as well as featured areas from the new website design. NASA Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources: satellites, aircraft, field measurements, and various other programs. It is comprised of twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), a data discovery and service access client (Reverb and Earthdata Search), a dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), the Global Imagery Browse Services, the Earthdata Code Collaborative and a host of other discipline-specific data discovery, data access, data subsetting and visualization tools. In the near future, Earthdata will have a number of components that will drive access to the data, such as the Earthdata Search Client and the Common Metadata Repository (CMR). The focus of content curation will be to leverage these components to provide an enriched content environment and a better overall user experience, with an emphasis on Earthdata being "powered by EOSDIS" components and services.
a Cultural Landscape Information System Developed with Open Source Tools
NASA Astrophysics Data System (ADS)
Chudyk, C.; Müller, H.; Uhler, M.; Würriehausen, F.
2013-07-01
Since 2010, the state of Rhineland-Palatinate in Germany has developed a cultural landscape information system as a process to secure and further enrich aggregate data about its cultural assets. In an open dialogue between governing authorities and citizens, the intention of the project is an active cooperation of public and private actors. A cultural landscape information system called KuLIS was designed as a web platform, combining semantic wiki software with a geographic information system. Based on data sets from public administrations, the information about cultural assets can be extended and enhanced by interested participants. The developed infrastructure facilitates local information accumulation through a crowdsourcing approach. This capability offers new possibilities for e-governance and open data developments. The collaborative approach allows governing authorities to manage and supervise official data, while public participation enables affordable information acquisition. Gathered cultural heritage information can provide incentives for touristic valorisation of communities or concepts for strengthening regional identification. It can also influence political decisions in defining significant cultural regions worth of protecting from industrial influences. The presented cultural landscape information allows citizens to influence the statewide development of cultural landscapes in a democratic way.
NASA Astrophysics Data System (ADS)
Yu, Lingfeng; Liu, Gangjun; Rubinstein, Marc; Saidi, Arya; Guo, Shuguang; Wong, Brian J. F.; Chen, Zhongping
2009-02-01
Optical coherence tomography (OCT) is an evolving noninvasive imaging modality that has been used to image the human larynx during surgical endoscopy. The design of a probe based on a long GRIN lens, capable of capturing images of the human larynx by use of swept-source OCT during a typical office-based laryngoscopy examination, is presented. In vivo OCT imaging of the human larynx is demonstrated at 40 frames per second. Dynamic vibration of the vocal folds is recorded to provide not only high-resolution cross-sectional tissue structure but also vibration parameters, such as the vibration frequency and magnitude of the vocal cord, which provide important information for clinical diagnosis and treatment, as well as for fundamental research on the voice. Office-based OCT is a promising imaging modality for studying the larynx.
Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J
2003-01-01
eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environments development tool, to provide real-time visualizations of the network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through animated 3D pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom and Flatland's capabilities.
Combined rule extraction and feature elimination in supervised classification.
Liu, Sheng; Patel, Ronak Y; Daga, Pankaj R; Liu, Haining; Fu, Gang; Doerksen, Robert J; Chen, Yixin; Wilkins, Dawn E
2012-09-01
There are a vast number of biology related research problems involving a combination of multiple sources of data to achieve a better understanding of the underlying problems. It is important to select and interpret the most important information from these sources. Thus it will be beneficial to have a good algorithm to simultaneously extract rules and select features for better interpretation of the predictive model. We propose an efficient algorithm, Combined Rule Extraction and Feature Elimination (CRF), based on 1-norm regularized random forests. CRF simultaneously extracts a small number of rules generated by random forests and selects important features. We applied CRF to several drug activity prediction and microarray data sets. CRF is capable of producing performance comparable with state-of-the-art prediction algorithms using a small number of decision rules. Some of the decision rules are biologically significant.
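A simplified version of the same idea (harvest candidate rules from a random forest, then let a 1-norm penalty zero most of them out) can be sketched with scikit-learn; this reduced sketch uses single-split rules rather than full decision paths, so it illustrates the mechanism, not the published CRF algorithm.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=400, n_features=30,
                               n_informative=5, random_state=0)

    # Harvest simple one-split rules (feature, threshold) from a shallow forest.
    forest = RandomForestClassifier(n_estimators=20, max_depth=3, random_state=0).fit(X, y)
    rules = []
    for tree in forest.estimators_:
        t = tree.tree_
        for node in range(t.node_count):
            if t.children_left[node] != -1:  # internal node => a split rule
                rules.append((t.feature[node], t.threshold[node]))

    # Rule-activation matrix: column j is 1 where rule j fires.
    R = np.column_stack([(X[:, f] <= thr).astype(float) for f, thr in rules])

    # 1-norm regularization keeps only a small subset of the rules.
    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(R, y)
    kept = np.flatnonzero(lasso.coef_[0])
    for j in kept[:5]:
        f, thr = rules[j]
        print(f"rule: x[{f}] <= {thr:.2f}   weight {lasso.coef_[0, j]:+.2f}")
    print(f"{len(kept)} of {len(rules)} rules kept")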
Biometrics Enabling Capability Increment 1 (BEC Inc 1)
2016-03-01
2016 Major Automated Information System Annual Report: Biometrics Enabling Capability Increment 1 (BEC Inc 1), Defense Acquisition Management... Date Assigned: July 15, 2015. Program Name: Biometrics Enabling Capability Increment 1 (BEC Inc 1). DoD...therefore, no Original Estimate has been established. Program Description: The Biometrics Enabling Capability (BEC
Interested consumers' awareness of harmful chemicals in everyday products.
Hartmann, Sabrina; Klaschka, Ursula
2017-01-01
Everyday products can contain a multitude of harmful substances that go unnoticed by most consumers, because established risk communication channels reach only part of society. The question is whether at least interested and informed consumers are able to use risk communication tools and assess harmful chemicals in products. An online survey investigated the awareness of 1030 consumers of harmful substances in everyday items. Participating consumers' education level, knowledge of chemistry, and motivation were above society's average. Although a large number of responses showed that survey participants were familiar with several aspects of the issue, the results revealed that knowledge of chemistry helped, but was not enough. Many participants assumed that products with an eco-label, natural personal care products, products without hazard pictograms or products produced in the European Union would not contain harmful substances. Most participants indicated that they use hazard pictograms, information on the packaging, reports in the media, and environmental and consumer organizations as information sources, while information from authorities and manufacturers was not named frequently and did not receive high confidence. Smartphone applications were not indicated by many participants as information sources. The information sources most trusted were environmental and consumer organizations, hazard pictograms, and lists of ingredients on the containers. The declared confidence in certain risk communication instruments did not always correspond to the use frequencies indicated. Nearly all participants considered legislators responsible for the reduction of harmful substances in consumer products. Misconceptions about harmful substances in products can be dangerous for personal health and the environment. The survey indicates that motivation, educational level, and chemical expertise do not automatically provide an appropriate understanding of harmful substances in products. If well-informed consumers are not sufficiently capable of using risk information elements, as revealed in this study, then this will be even more the case for the general public. Consumer awareness should be stimulated by an improved information strategy about chemical risks in consumer products, with extensive participation of the target groups and more efforts by authorities and manufacturers to build trust and to provide easily understandable information.
Development of mobile preventive notification system (PreNotiS)
NASA Astrophysics Data System (ADS)
Kumar, Abhinav; Akopian, David; Chen, Philip
2009-02-01
The tasks achievable by mobile handsets continuously exceed our imagination. Statistics show that mobile phone sales are soaring, rising exponentially year after year, with predictions that they would reach a billion units in 2009, a large share of these being smartphones. Mobile service providers, mobile application developers and researchers have worked closely over the past decade to bring about revolutionary hardware and software advancements in handsets, such as embedded digital cameras, large memory capacity, accelerometers, touch-sensitive screens, GPS, and Wi-Fi capabilities, as well as in the network infrastructure to support these features. Recently we presented PreNotiS, a multi-platform system for massive data collection from distributed sources such as cell phone users. This technology was intended to significantly simplify the response to events and to help special agencies, for example, gather crucial information in time and respond as quickly as possible to prevent or contain potential emergency situations, acting as a massive, centralized evidence collection mechanism that effectively exploits advances in mobile application development platforms and the existing network infrastructure to present an easy-to-use, fast and effective tool to mobile phone users. We successfully demonstrated the functionality of the client-server application suite for posting user information to the server. This paper presents a new version of the PreNotiS system, with a revised client application and all-new server capabilities. PreNotiS still puts forth the idea of a fast, efficient client-server application suite for mobile phones which, through a highly simplified user interface, collects security/calamity information in a structured format from first responders and relays that structured information to a central server, where the data are sorted into a database in a predefined manner. This information, which includes selections, images and text, is instantly available to authorities and action forces through a secure web portal, helping them to make decisions in a timely and prompt manner. All cell phones have self-localizing capability according to the FCC E911 mandate, so the communicated information can be further tagged automatically with location and time information at the server, making all this information available through the secure web portal.
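The client-to-server report submission can be sketched as a structured JSON document posted over HTTP; the endpoint path, field names, and the choice of HTTP are all assumptions made for illustration, not the actual PreNotiS wire format.

    import json
    import urllib.request

    def submit_report(server_url, category, text, photo_id=None, lat=None, lon=None):
        """POST one structured incident report as JSON.

        The endpoint path and field names are hypothetical; the location
        fields mirror the automatic E911-style tagging described above.
        """
        report = {
            "category": category,  # from the simplified selection UI
            "text": text,
            "photo_id": photo_id,  # reference to a separately uploaded image
            "lat": lat, "lon": lon,  # handset self-localization
        }
        req = urllib.request.Request(
            server_url + "/reports",  # hypothetical endpoint
            data=json.dumps(report).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status

    # Example call (against a hypothetical server):
    # submit_report("http://prenotis.example.org", "hazard",
    #               "Smoke near building entrance", lat=29.42, lon=-98.49)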
Service line analytics in the new era.
Spence, Jay; Seargeant, Dan
2015-08-01
To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for the collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: updated service line definitions; the ability to analyze and trend service line net patient revenues by payment source; access to accurate service line cost information across multiple dimensions, with drill-through capabilities; the ability to redesign key reports based on changing requirements; and clear assignment of accountability.
Bardo, Dianna M E; Brown, Paul
2008-08-01
Cardiac MDCT is here to stay, and it is about more than just imaging coronary arteries. Understanding the differences among CT scanners and the benefits of each will help you optimize the capabilities of the scanner, but doing so requires a basic understanding of MDCT imaging physics. This review provides the key information needed to understand the differences among the types of MDCT scanners, from 64 to 320 detectors, flat panels, single- and dual-source configurations, and prospective ("step and shoot") and retrospective gating, and how each factor influences radiation dose, spatial and temporal resolution, and image noise.
Operating manual for the Bulk Shielding Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-04-01
The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR.
Operating manual for the Bulk Shielding Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-03-01
The BSR is a pool-type reactor. It has the capabilities of continuous operation at a power level of 2 MW or at any desired lower power level. This manual presents descriptive and operational information. The reactor and its auxiliary facilities are described from physical and operational viewpoints. Detailed operating procedures are included which are applicable from source-level startup to full-power operation. Also included are procedures relative to the safety of personnel and equipment in the areas of experiments, radiation and contamination control, emergency actions, and general safety. This manual supersedes all previous operating manuals for the BSR.
The Compton Observatory Science Workshop
NASA Technical Reports Server (NTRS)
Shrader, Chris R. (Editor); Gehrels, Neil (Editor); Dennis, Brian (Editor)
1992-01-01
The Compton Observatory Science Workshop was held in Annapolis, Maryland on September 23-25, 1991. The primary purpose of the workshop was to provide a forum for the exchange of ideas and information among scientists with interests in various areas of high energy astrophysics, with emphasis on the scientific capabilities of the Compton Observatory. Early scientific results, as well as reports on in-flight instrument performance and calibrations are presented. Guest investigator data products, analysis techniques, and associated software were discussed. Scientific topics covered included active galaxies, cosmic gamma ray bursts, solar physics, pulsars, novae, supernovae, galactic binary sources, and diffuse galactic and extragalactic emission.
NIR light propagation in a digital head model for traumatic brain injury (TBI)
Francis, Robert; Khan, Bilal; Alexandrakis, George; Florence, James; MacFarlane, Duncan
2015-01-01
Near infrared spectroscopy (NIRS) is capable of detecting and monitoring acute changes in cerebral blood volume and oxygenation associated with traumatic brain injury (TBI). Wavelength selection, source-detector separation, optode density, and detector sensitivity are key design parameters that determine the imaging depth, chromophore separability, and, ultimately, clinical usefulness of a NIRS instrument. We present simulation results of NIR light propagation in a digital head model as it relates to the ability to detect intracranial hematomas and monitor the peri-hematomal tissue viability. These results inform NIRS instrument design specific to TBI diagnosis and monitoring. PMID:26417498
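Once attenuation changes are measured at two wavelengths, chromophore changes follow from the modified Beer-Lambert law as a 2x2 linear system; in the sketch below, the extinction coefficients, pathlength factors, and measurements are illustrative placeholders, not values from the study.

    import numpy as np

    # Modified Beer-Lambert law at two wavelengths:
    #   dOD(lambda) = (eps_HbO2 * dC_HbO2 + eps_Hb * dC_Hb) * d * DPF(lambda)
    # Two wavelengths give a 2x2 linear system for the two concentration changes.

    eps = np.array([[1.5, 3.8],   # illustrative [eps_HbO2, eps_Hb] at ~760 nm
                    [2.8, 1.8]])  # and at ~850 nm (1/(mM*cm), placeholder values)
    d = 3.0                       # source-detector separation, cm
    dpf = np.array([6.0, 5.5])    # illustrative differential pathlength factors

    d_od = np.array([0.010, 0.014])    # measured attenuation changes (OD)
    A = eps * (d * dpf)[:, None]       # forward model matrix
    d_conc = np.linalg.solve(A, d_od)  # [dC_HbO2, dC_Hb] in mM
    print(f"dHbO2 = {d_conc[0] * 1000:+.3f} uM, dHb = {d_conc[1] * 1000:+.3f} uM")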
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
Dynamic-scanning-electron-microscope study of friction and wear
NASA Technical Reports Server (NTRS)
Brainard, W. A.; Buckley, D. H.
1974-01-01
A friction and wear apparatus was built into a real time scanning electron microscope (SEM). The apparatus and SEM comprise a system which provides the capability of performing dynamic friction and wear experiments in situ. When the system is used in conjunction with dispersive X-ray analysis, a wide range of information on the wearing process can be obtained. The type of wear and variation with speed, load, and time can be investigated. The source, size, and distribution of wear particles can be determined and metallic transferal observed. Some typical results obtained with aluminum, copper, and iron specimens are given.
1992-02-01
Designation with the CL-227 Sea Sentinel ... SESSION V - LONGER TERM SYSTEMS: Avionic System Improvement Proposal for the TORNADO ... the F/A-18's fire control capability to deliver some types of smart munitions. Yet we also noted that while we lacked the target designators and control ...
A Theoretical and Experimental Analysis of the Outside World Perception Process
NASA Technical Reports Server (NTRS)
Wewerinke, P. H.
1978-01-01
The outside scene is often an important source of information for manual control tasks. Important examples of these are car driving and aircraft control. This paper deals with modelling this visual scene perception process on the basis of linear perspective geometry and the relative motion cues. Model predictions utilizing psychophysical threshold data from base-line experiments and literature of a variety of visual approach tasks are compared with experimental data. Both the performance and workload results illustrate that the model provides a meaningful description of the outside world perception process, with a useful predictive capability.
Fowler, Joseph F
2016-01-01
Cobalt has been a recognized allergen capable of causing contact dermatitis for decades. Why, therefore, has it been named 2016 "Allergen of the Year"? Simply put, new information has come to light in the last few years regarding potential sources of exposure to this metallic substance. In addition to reviewing some background on our previous understanding of cobalt exposures, this article will highlight the recently recognized need to consider leather as a major site of cobalt and the visual cues suggesting the presence of cobalt in jewelry. In addition, a chemical spot test for cobalt now allows us to better identify its presence in suspect materials.
A global travelers' electronic health record template standard for personal health records.
Li, Yu-Chuan; Detmer, Don E; Shabbir, Syed-Abdul; Nguyen, Phung Anh; Jian, Wen-Shan; Mihalas, George I; Shortliffe, Edward H; Tang, Paul; Haux, Reinhold; Kimura, Michio
2012-01-01
Tourism as well as international business travel creates health risks for individuals and populations both in host societies and home countries. One strategy to reduce health-related risks to travelers is to provide travelers and relevant caregivers timely, ongoing access to their own health information. Many websites offer health advice for travelers. For example, the WHO and US Department of State offer up-to-date health information about countries relevant to travel. However, little has been done to assure travelers that their medical information is available at the right place and time when the need might arise. Applications of Information and Communication Technology (ICT) utilizing mobile phones for health management are promising tools both for the delivery of healthcare services and the promotion of personal health. This paper describes the project developed by international informaticians under the umbrella of the International Medical Informatics Association. A template capable of becoming an international standard is proposed. This application is available free to anyone who is interested. Furthermore, its source code is made open.
Social media for patients: benefits and drawbacks.
De Martino, Ivan; D'Apolito, Rocco; McLawhorn, Alexander S; Fehring, Keith A; Sculco, Peter K; Gasparini, Giorgio
2017-03-01
Social media is increasingly utilized by patients to educate themselves on a disease process and to find the hospitals, physicians, and physician networks most capable of treating their condition. However, little is known about the quality of the content on the multiple online platforms patients use to communicate with other patients, or about these platforms' potential benefits and drawbacks. Patients are no longer passive consumers of health information but are playing an active role in the delivery of health services through an online environment. The control and regulation of the sources of information are very difficult, and the overall quality of the information was poor. Bad or misleading information can be detrimental to patients and can influence their confidence in physicians and their mutual relationship. Orthopedic surgeons and hospital networks must be aware of these online patient portals, as they provide important feedback on patient opinion and experience that can have a major impact on future patient volume, patient opinion, and perceived quality of care.
Development of EPA Protocol Information Enquiry Service System Based on Embedded ARM Linux
NASA Astrophysics Data System (ADS)
Peng, Daogang; Zhang, Hao; Weng, Jiannian; Li, Hui; Xia, Fei
Industrial Ethernet is a new technology for industrial network communications developed in recent years. In the field of industrial automation in China, EPA is the first standard accepted and published by ISO, and it has been included as Type 14 in the fourth edition of the IEC 61158 Fieldbus standard. According to the EPA standard, field devices such as industrial field controllers, actuators, and other instruments are all able to communicate over standard Ethernet. In this paper, the Atmel AT91RM9200 embedded development board and open source embedded Linux are used to develop an EPA protocol information inquiry service system based on embedded ARM Linux. The system provides an EPA server program for EPA data acquisition; the EPA information inquiry service is available to programs on local or remote hosts through a socket interface. An EPA client can access data and information of other EPA devices on the EPA network once it establishes a connection with the monitoring port of the server.
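A minimal client for such a socket-based inquiry service is sketched below in Python; the host address, the port number, and the plain-text request are illustrative assumptions, since the abstract does not publish the EPA server's wire format.

```python
import socket

# Assumed endpoint: substitute the address of the ARM board running the
# EPA server and the actual monitoring port.
SERVER_HOST = "192.168.1.10"
SERVER_PORT = 5000

def query_epa_server(request: bytes, timeout: float = 5.0) -> bytes:
    """Send one inquiry to the EPA information service and return the reply."""
    with socket.create_connection((SERVER_HOST, SERVER_PORT), timeout=timeout) as sock:
        sock.sendall(request)
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

if __name__ == "__main__":
    # Hypothetical plain-text query; a real client would follow the EPA wire format.
    print(query_epa_server(b"LIST_DEVICES\n"))
```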
3D-Lab: a collaborative web-based platform for molecular modeling.
Grebner, Christoph; Norrby, Magnus; Enström, Jonatan; Nilsson, Ingemar; Hogner, Anders; Henriksson, Jonas; Westin, Johan; Faramarzi, Farzad; Werner, Philip; Boström, Jonas
2016-09-01
The use of 3D information has shown impact in numerous applications in drug design. However, it is often under-utilized and traditionally limited to specialists. We want to change that, and present an approach making 3D information and molecular modeling accessible and easy to use 'for the people'. A user-friendly and collaborative web-based platform (3D-Lab) for 3D modeling, including a blazingly fast virtual screening capability, was developed. 3D-Lab provides an interface to automated molecular modeling tasks such as conformer generation, ligand alignment, molecular docking, and simple quantum chemistry protocols. 3D-Lab is designed to be modular, and to facilitate sharing of 3D information to promote interactions between drug designers. Recent enhancements to our open-source virtual reality tool Molecular Rift are described. The integrated drug-design platform allows drug designers to instantaneously access 3D information and readily apply advanced and automated 3D molecular modeling tasks, with the aim of improving decision-making in drug design projects.
EcoCyc: a comprehensive database resource for Escherichia coli
Keseler, Ingrid M.; Collado-Vides, Julio; Gama-Castro, Socorro; Ingraham, John; Paley, Suzanne; Paulsen, Ian T.; Peralta-Gil, Martín; Karp, Peter D.
2005-01-01
The EcoCyc database (http://EcoCyc.org/) is a comprehensive source of information on the biology of the prototypical model organism Escherichia coli K12. The mission for EcoCyc is to contain both computable descriptions of, and detailed comments describing, all genes, proteins, pathways and molecular interactions in E. coli. Through ongoing manual curation, extensive information such as summary comments, regulatory information, literature citations and evidence types has been extracted from 8862 publications and added to Version 8.5 of the EcoCyc database. The EcoCyc database can be accessed through a World Wide Web interface, while the downloadable Pathway Tools software and data files enable computational exploration of the data and provide enhanced querying capabilities that web interfaces cannot support. For example, EcoCyc contains carefully curated information that can be used as training sets for bioinformatics prediction of entities such as promoters, operons, genetic networks, transcription factor binding sites, metabolic pathways, functionally related genes, protein complexes and protein–ligand interactions. PMID:15608210
Quinn, Emma; Hsiao, Kai; Truman, George; Rose, Nectarios; Broome, Richard
2018-05-02
Geographic information systems (GIS) have emerged in the past few decades as a technology capable of assisting in the control of infectious disease outbreaks. A Legionnaires' disease cluster investigation in May 2016 in Sydney, New South Wales (NSW), Australia, demonstrated the importance of using GIS to identify at-risk water sources in real-time for field investigation to help control any immediate environmental health risk, as well as the need for more staff trained in the use of this technology. Sydney Local Health District Public Health Unit (PHU) subsequently ran an exercise (based on this investigation) with 11 staff members from 4 PHUs across Sydney to further test staff capability to use GIS across NSW. At least 80% of exercise participants reported that the scenario progression was realistic, assigned tasks were clear, and sufficient data were provided to complete tasks. The exercise highlighted the multitude of geocoding applications and need for inter-operability of systems, as well as the need for trained staff with specific expertise in spatial analysis to help assist in outbreak control activity across NSW. Evaluation data demonstrated the need for a common GIS, regular education and training, and guidelines to support the collaborative use of GIS for infectious disease epidemiology in NSW.
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Mueller, N. R.; Killough, B.; Oliver, S. A.
2016-12-01
In 2014 Geoscience Australia launched Water Observations from Space (WOfS) providing a continental-scale water product that shows how often surface water has been observed across Australia by the Landsat satellites since 1987. WOfS is a 23-step band-based decision tree that classifies pixels as water or non-water with 97% overall accuracy. The enabling infrastructure for WOfS is the Australian Geoscience Data Cube (AGDC), a high performance computing system organising Australian earth observation data into a systematic, consistently corrected analysis engine. The Committee on Earth Observation Satellites (CEOS) has adopted the AGDC methodology to create a series of international Data Cubes to provide the same capability to areas that would otherwise not be able to undertake time series analysis of the environment at these scales. The CEOS Systems Engineering Office (SEO) recently completed testing of WOfS using Data Cubes based on the AGDC version 2 over Kenya and Colombia. The results show how Data Cubes can provide water management information at large scales, and provide information in remote locations where other sources of water information are unavailable. The results also show an improvement in water detection capability over the Landsat CFmask. This water management product provides critical insight into the behavior of surface water over time and in particular, the extent of flooding.
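As a rough illustration of band-based water classification of the kind WOfS performs, the sketch below applies two hand-picked spectral tests to reflectance arrays. The published WOfS classifier is a 23-node decision tree learned from training data, so both the tests and the thresholds here are illustrative assumptions only.

```python
import numpy as np

def classify_water(green: np.ndarray, nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Toy band-based water/non-water classifier (not the WOfS tree)."""
    # McFeeters NDWI: water is bright in green and dark in near-infrared.
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)
    # Hypothetical thresholds standing in for the learned decision nodes.
    return (ndwi > 0.0) & (swir < 0.05)

# Example on random reflectances in [0, 1].
rng = np.random.default_rng(0)
green, nir, swir = (rng.random((4, 4)) for _ in range(3))
print(classify_water(green, nir, swir))
```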
B-HIT - A Tool for Harvesting and Indexing Biodiversity Data.
Kelbert, Patricia; Droege, Gabriele; Barker, Katharine; Braak, Kyle; Cawsey, E Margaret; Coddington, Jonathan; Robertson, Tim; Whitacre, Jamie; Güntsch, Anton
2015-01-01
With the rapidly growing number of data publishers, the process of harvesting and indexing information to offer advanced search and discovery becomes a critical bottleneck in globally distributed primary biodiversity data infrastructures. The Global Biodiversity Information Facility (GBIF) implemented a Harvesting and Indexing Toolkit (HIT), which largely automates data harvesting activities for hundreds of collection and observational data providers. The team of the Botanic Garden and Botanical Museum Berlin-Dahlem has extended this well-established system with a range of additional functions, including improved processing of multiple taxon identifications, the ability to represent associations between specimen and observation units, new data quality control and new reporting capabilities. The open source software B-HIT can be freely installed and used for setting up thematic networks serving the demands of particular user groups. PMID:26544980
Extending the Li&Ma method to include PSF information
NASA Astrophysics Data System (ADS)
Nievas-Rosillo, M.; Contreras, J. L.
2016-02-01
The so-called Li&Ma formula is still the most frequently used method for estimating the significance of observations carried out by Imaging Atmospheric Cherenkov Telescopes. In this work, a straightforward extension of the method for point sources is proposed that profits from the good imaging capabilities of current instruments. It is based on a likelihood ratio under the assumption of a well-known PSF and a smooth background. Its performance is tested with Monte Carlo simulations based on real observations, and its sensitivity is compared to standard methods which do not incorporate PSF information. The gain in significance that can be attributed to the inclusion of the PSF is around 10% and can be boosted if a background model is assumed or a finer binning is used.
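For reference, the standard Li & Ma significance (their Eq. 17), which this work extends, can be computed directly from the on- and off-source counts; the sketch below implements that textbook formula, not the paper's PSF-aware likelihood ratio.

```python
import numpy as np

def li_ma_significance(n_on: float, n_off: float, alpha: float) -> float:
    """Li & Ma (1983), Eq. 17.

    n_on  -- counts in the on-source region
    n_off -- counts in the off-source (background) region
    alpha -- ratio of on-source to off-source exposure
    """
    total = n_on + n_off
    term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / total)
    term_off = n_off * np.log((1.0 + alpha) * n_off / total)
    return float(np.sqrt(2.0 * (term_on + term_off)))

# 120 on-source counts against an expected background of alpha * 900 = 90
# counts yields roughly a 2.9 sigma excess.
print(li_ma_significance(n_on=120, n_off=900, alpha=0.1))
```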
Scalable loading of a two-dimensional trapped-ion array
Bruzewicz, Colin D.; McConnell, Robert; Chiaverini, John; Sage, Jeremy M.
2016-01-01
Two-dimensional arrays of trapped-ion qubits are attractive platforms for scalable quantum information processing. Sufficiently rapid reloading capable of sustaining a large array, however, remains a significant challenge. Here, with the use of a continuous flux of pre-cooled neutral atoms from a remotely located source, we achieve fast loading of a single ion per site while maintaining long trap lifetimes and without disturbing the coherence of an ion quantum bit in an adjacent site. This demonstration satisfies all major criteria necessary for loading and reloading extensive two-dimensional arrays, as will be required for large-scale quantum information processing. Moreover, the already high loading rate can be increased by loading ions in parallel, with only a concomitant increase in photo-ionization laser power and no need for additional atomic flux. PMID:27677357
Senger, Stefan
2017-04-21
Patents are an important source of information for effective decision making in drug discovery. Encouragingly, freely accessible patent-chemistry databases are now in the public domain. However, at present there is still a wide gap between low-coverage, high-quality manually curated data sources and high-coverage data sources that use text mining and automated extraction of chemical structures. To secure much-needed funding for further research and an improved infrastructure, hard evidence is required to demonstrate the significance of patent-derived information in drug discovery. Surprisingly little such evidence has been reported so far. To address this, the present study attempts to quantify the relevance of patents for formulating and substantiating hypotheses for compound-target interactions. A manually curated set of 130 compound-target interaction pairs, annotated with what are considered to be the earliest patent and publication, has been produced. The analysis of this set revealed that, in stark contrast to what has been reported for novel chemical structures, only about 10% of the compound-target interaction pairs could be found in publications in the scientific literature within one year of being reported in patents. The average delay across all interaction pairs is close to 4 years. In an attempt to benchmark current capabilities, it was also examined how much of the benefit of using patent-derived information can be retained when a bioannotated version of SureChEMBL is used as a secondary source for the patent literature. Encouragingly, this approach found the patents in the annotated set for 72% of the compound-target interaction pairs. Similarly, the effect of using the bioactivity database ChEMBL as a secondary source for the scientific literature was studied. Here, the publications from the annotated set were found for only 46% of the compound-target interaction pairs. Patent-derived information is a significant enabler for formulating compound-target interaction hypotheses, even in cases where the respective interaction is later reported in the scientific literature. The findings of this study clearly highlight the significance of future investments in the development and provision of databases and tools that allow scientists to search patent information in a comprehensive, reliable, and efficient manner.
Farmer data sourcing. The case study of the spatial soil information maps in South Tyrol.
NASA Astrophysics Data System (ADS)
Della Chiesa, Stefano; Niedrist, Georg; Thalheimer, Martin; Hafner, Hansjörg; La Cecilia, Daniele
2017-04-01
The northern Italian region of South Tyrol is Europe's largest apple-growing area, accounting for ca. 15% of European and 2% of worldwide production; its vineyards represent ca. 1% of Italian production. In order to deliver high-quality food, most farmers in South Tyrol follow sustainable farming practices. One key practice is sustainable soil management, in which farmers regularly (every 5 years) collect soil samples and send them for analysis to improve cultivation management, yield, and ultimately profitability. However, such data generally remain inaccessible. In this regard, private interests and the public administration in South Tyrol have established a long tradition of collaboration with the local farming industry. This has enabled the collection of a large spatial and temporal database of soil analyses covering all cultivated areas. Thanks to this best practice, information on soil properties is centralized and geocoded. The large dataset consists mainly of soil information on texture, humus content, pH, and the availability of microelements such as K, Mg, B, Mn, Cu, and Zn. These data were finally spatialized by means of geostatistical methods, and several high-resolution digital maps were created. In this contribution, we present this best practice of farmer-sourced soil information in South Tyrol, show the capability of a large spatio-temporal geocoded soil dataset to produce detailed digital soil property maps and to assess long-term changes in soil properties, and finally discuss implications and potential applications.
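As a simplified illustration of turning geocoded point samples into a continuous soil-property map, the sketch below interpolates synthetic humus-content measurements onto a regular grid. SciPy's griddata stands in for the geostatistical methods (e.g., kriging) actually used for the South Tyrol maps, and all coordinates and values are made up.

```python
import numpy as np
from scipy.interpolate import griddata

# Synthetic geocoded soil samples: x, y in metres; z = humus content (%).
rng = np.random.default_rng(42)
xy = rng.uniform(0, 10_000, size=(200, 2))
z = 2.0 + 0.0002 * xy[:, 0] + rng.normal(0.0, 0.3, size=200)

# Regular 100 m grid over the study area.
gx, gy = np.mgrid[0:10_000:100, 0:10_000:100]

# Simple linear interpolation; kriging would additionally model the
# spatial autocorrelation structure of the samples.
humus_map = griddata(xy, z, (gx, gy), method="linear")
print(np.nanmean(humus_map))
```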
Using WNTR to Model Water Distribution System Resilience ...
The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate the resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (e.g., pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
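A minimal WNTR session, assuming the package is installed and an EPANET input file such as the Net3 example network shipped with WNTR is available locally, might look like this:

```python
import wntr

# Build a network model from an EPANET .inp file (path assumed).
wn = wntr.network.WaterNetworkModel("Net3.inp")

# Run a baseline hydraulic simulation with the WNTR solver.
sim = wntr.sim.WNTRSimulator(wn)
results = sim.run_sim()

# Junction pressures over time, returned as a pandas DataFrame.
pressure = results.node["pressure"]
print(pressure.head())
```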
NASA Astrophysics Data System (ADS)
Wright, L.; Coddington, O.; Pilewskie, P.
2016-12-01
Hyperspectral instruments are a growing class of Earth observing sensors designed to improve remote sensing capabilities beyond discrete multi-band sensors by providing tens to hundreds of continuous spectral channels. Improved spectral resolution, range and radiometric accuracy allow the collection of large amounts of spectral data, facilitating thorough characterization of both atmospheric and surface properties. These new instruments require novel approaches for processing imagery and separating surface and atmospheric signals. One approach is numerical source separation, which allows the determination of the underlying physical causes of observed signals. Improved source separation will enable hyperspectral imagery to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. We developed an Informed Non-negative Matrix Factorization (INMF) method for separating atmospheric and surface sources. INMF offers marked benefits over other commonly employed techniques including non-negativity, which avoids physically impossible results; and adaptability, which tailors the method to hyperspectral source separation. The INMF algorithm is adapted to separate contributions from physically distinct sources using constraints on spectral and spatial variability, and library spectra to improve the initial guess. We also explore methods to produce an initial guess of the spatial separation patterns. Using this INMF algorithm we decompose hyperspectral imagery from the NASA Hyperspectral Imager for the Coastal Ocean (HICO) with a focus on separating surface and atmospheric signal contributions. HICO's coastal ocean focus provides a dataset with a wide range of atmospheric conditions, including high and low aerosol optical thickness and cloud cover, with only minor contributions from the ocean surfaces in order to isolate the contributions of the multiple atmospheric sources.
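To give a flavour of the factorization step, the sketch below runs plain non-negative matrix factorization on a synthetic pixels-by-channels matrix using scikit-learn. The paper's INMF additionally imposes constraints on spectral and spatial variability and initializes with library spectra; none of that is reproduced here.

```python
import numpy as np
from sklearn.decomposition import NMF

# Synthetic hyperspectral cube: 50 x 40 pixels, 80 spectral channels.
rng = np.random.default_rng(1)
cube = rng.random((50, 40, 80))
X = cube.reshape(-1, 80)               # pixels x channels, non-negative

# Plain NMF as a stand-in for Informed NMF (INMF).
model = NMF(n_components=4, init="nndsvd", max_iter=500, random_state=0)
abundances = model.fit_transform(X)    # pixels x sources (spatial patterns)
endmembers = model.components_         # sources x channels (source spectra)
print(abundances.shape, endmembers.shape)
```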
Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions
NASA Astrophysics Data System (ADS)
Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.
2017-12-01
Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data on the other hand can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.
Open source tools for fluorescent imaging.
Hamilton, Nicholas A
2012-01-01
As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods.
Information Quality Evaluation of C2 Systems at Architecture Level
2014-06-01
Capability evaluation of C2 systems at the architecture level is necessary and important for improving system capability at the architecture design stage. This paper proposes a method for information quality evaluation of C2 systems at the architecture level, based on architecture models of C2 systems, which can help to identify key factors impacting information quality and improve system capability at the stage of architecture design. First, the information quality model is …
Decentralized modal identification using sparse blind source separation
NASA Astrophysics Data System (ADS)
Sadhu, A.; Hazra, B.; Narasimhan, S.; Pandey, M. D.
2011-12-01
Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level, instead. The information at the individual sensor level can then be concatenated to obtain the global structure characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structure mode information using measurements obtained using a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development, while the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation invoking transformations of measurements to the time-frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure.
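A toy two-channel version of the pipeline (sparsifying transform followed by PCA on the coefficients) is sketched below. PyWavelets' stationary wavelet transform stands in for the stationary wavelet packet transform of the paper, and the synthetic modal responses and mixing matrix are illustrative; the PCA directions approximate the mixing columns only up to scale and sign, and only when the sources are sufficiently sparse.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

# Two synthetic decaying modal responses, mixed into two sensor channels.
t = np.arange(1024) / 256.0
s1 = np.exp(-0.5 * t) * np.sin(2 * np.pi * 12 * t)
s2 = np.exp(-0.3 * t) * np.sin(2 * np.pi * 31 * t)
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown mixing matrix
X = A @ np.vstack([s1, s2])                     # sensor records (2 x 1024)

# Stationary wavelet transform (stand-in for SWPT) gives a sparse
# representation; coefficients of all levels are concatenated per channel.
coeffs = [np.hstack([c for pair in pywt.swt(x, "db4", level=3) for c in pair])
          for x in X]

# PCA on the sparse coefficients estimates the mixing directions.
pca = PCA(n_components=2).fit(np.vstack(coeffs).T)
print(pca.components_.T)    # columns ~ mixing matrix, up to scale and sign
```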
Development of a Microphone Phased Array Capability for the Langley 14- by 22-Foot Subsonic Tunnel
NASA Technical Reports Server (NTRS)
Humphreys, William M.; Brooks, Thomas F.; Bahr, Christopher J.; Spalt, Taylor B.; Bartram, Scott M.; Culliton, William G.; Becker, Lawrence E.
2014-01-01
A new aeroacoustic measurement capability has been developed for use in open-jet testing in the NASA Langley 14- by 22-Foot Subsonic Tunnel (14x22 tunnel). A suite of instruments has been developed to characterize noise source strengths, locations, and directivity for both semi-span and full-span test articles in the facility. The primary instrument of the suite is a fully traversable microphone phased array for identification of noise source locations and strengths on models. The array can be mounted in the ceiling or on either side of the facility test section to accommodate various test article configurations. Complementing the phased array is an ensemble of streamwise traversing microphones that can be placed around the test section at defined locations to conduct noise source directivity studies along both flyover and sideline axes. A customized data acquisition system has been developed for the instrumentation suite that allows for command and control of all aspects of the array and microphone hardware, and is coupled with a comprehensive data reduction system to generate information in near real time. This information includes such items as time histories and spectral data for individual microphones and groups of microphones, contour presentations of noise source locations and strengths, and hemispherical directivity data. The data acquisition system integrates with the 14x22 tunnel data system to allow real time capture of facility parameters during acquisition of microphone data. The design of the phased array system has been vetted via a theoretical performance analysis based on conventional monopole beamforming and DAMAS deconvolution. The performance analysis provides the ability to compute figures of merit for the array as well as characterize factors such as beamwidths, sidelobe levels, and source discrimination for the types of noise sources anticipated in the 14x22 tunnel. The full paper will summarize in detail the design of the instrumentation suite, the construction of the hardware system, and the results of the performance analysis. Although the instrumentation suite is designed to characterize noise for a variety of test articles in the 14x22 tunnel, this paper will concentrate on description of the instruments for two specific test campaigns in the facility, namely a full-span NASA Hybrid Wing Body (HWB) model entry and a semi-span Gulfstream aircraft model entry, tested in the facility in the winter of 2012 and spring of 2013, respectively.
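For context, the conventional monopole (delay-and-sum) beamforming that anchors the performance analysis can be sketched as follows; the array geometry, frequency, and source position are arbitrary illustrative values, and the DAMAS deconvolution step is not shown.

```python
import numpy as np

def conventional_beamform(csm, mic_xy, grid_xy, freq, c=343.0):
    """Frequency-domain delay-and-sum beamforming with monopole steering.

    csm     -- (M, M) cross-spectral matrix of the microphone signals
    mic_xy  -- (M, 2) microphone positions in metres
    grid_xy -- (G, 2) candidate source positions on the scan plane
    Returns a (G,) map of beamformer output power.
    """
    k = 2.0 * np.pi * freq / c
    out = np.empty(len(grid_xy))
    for i, g in enumerate(grid_xy):
        r = np.linalg.norm(mic_xy - g, axis=1)   # mic-to-point distances
        e = np.exp(-1j * k * r) / r              # monopole steering vector
        e /= np.linalg.norm(e)
        out[i] = np.real(e.conj() @ csm @ e)
    return out

# Example: 8-microphone line array, one synthetic 2 kHz source at x = 0.2 m.
mics = np.column_stack([np.linspace(-0.5, 0.5, 8), np.zeros(8)])
r = np.linalg.norm(mics - np.array([0.2, 1.0]), axis=1)
p = np.exp(-1j * 2.0 * np.pi * 2000.0 / 343.0 * r) / r
csm = np.outer(p, p.conj())
grid = np.column_stack([np.linspace(-0.5, 0.5, 41), np.full(41, 1.0)])
bmap = conventional_beamform(csm, mics, grid, 2000.0)
print(grid[np.argmax(bmap), 0])   # peaks near the true source at x = 0.2
```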
Toward information management in corporations (4)
NASA Astrophysics Data System (ADS)
Yamamoto, Takeo
The roles of personal computers (PCs) and workstations (WSs) in developing corporate information systems are discussed. The history and state of the art of PCs and WSs are reviewed. Checkpoints for introducing PCs and WSs are: Japanese word-processing capabilities, multimedia capabilities, and network capabilities.
The UK Soil Observatory (UKSO) and mySoil app: crowdsourcing and disseminating soil information.
NASA Astrophysics Data System (ADS)
Robinson, David; Bell, Patrick; Emmett, Bridget; Panagos, Panos; Lawley, Russell; Shelley, Wayne
2017-04-01
Digital technologies, in the form of web-based data portals and mobile apps, offer a new way both to provide information to the public and to engage the public in contributing to data collection through crowdsourcing. We are part of the Landpotential.org consortium, a global partnership committed to developing and supporting the adoption of freely available technology and tools for sustainable land use management and monitoring, and to connecting people across the globe. The mySoil app was launched in 2012 and is an example of a free mobile application downloadable from iTunes and Google Play. It serves as a gateway tool to raise interest in, and awareness of, soils. It currently has over 50,000 dedicated users and has crowdsourced more than 4000 data records. Recent developments have expanded the coverage of mySoil from the United Kingdom to Europe, introduced a new user interface and provided language capability, while the UKSO displays the crowdsourced records from across the globe. We are now trying to identify which industry, education and citizen sectors are using these platforms and how they can be improved. Please help us by providing feedback or taking the survey on the UKSO website: www.UKSO.org. The UKSO is a collaboration between major UK soil-data holders to provide maps, spatial data and real-time temporal data from observing platforms such as the UK soil moisture network. Both UKSO and mySoil have crowdsourcing capability and are slowly building global citizen science maps of soil properties such as pH and texture. Whilst these data can't replace professional monitoring data, the information they provide both stimulates public interest and can act as 'soft data' to help support the interpretation of monitoring data or guide future monitoring, identifying areas that don't correspond with current analysis. In addition, soft data can be used to map soils with machine learning approaches, such as SoilGrids.
A Study of Simulator Capabilities in an Operational Training Program.
ERIC Educational Resources Information Center
Meyer, Donald E.; and others
The experiment was conducted to determine the effects of simulator training to criterion proficiency upon time required in the aircraft. Data were also collected on proficiency levels attained, self-confidence levels, individual estimates of capability, and sources from which that capability was derived. Subjects for the experiment--48 airline…
NASA Technical Reports Server (NTRS)
Schaffner, Philip R.; Harrah, Steven; Neece, Robert T.
2012-01-01
The air transportation system of the future will need to support much greater traffic densities than are currently possible, while preserving or improving upon current levels of safety. Concepts are under development to support a Next Generation Air Transportation System (NextGen) that, by some estimates, will need to support up to three times current capacity by the year 2025. Weather and other atmospheric phenomena, such as wake vortices and volcanic ash, constitute major constraints on airspace system capacity and can present hazards to aircraft if encountered. To support safe operations in the NextGen environment, advanced systems for collection and dissemination of aviation weather and environmental information will be required. The envisioned NextGen Network Enabled Weather (NNEW) infrastructure will be a critical component of the aviation weather support services, providing access to a common weather picture for all system users. By taking advantage of Network Enabled Operations (NEO) capabilities, a virtual 4-D Weather Data Cube with aviation weather information from many sources will be developed. One new source of weather observations may be airborne forward-looking sensors, such as the X-band weather radar. Future sensor systems that are the subject of current research include advanced multi-frequency and polarimetric radar, a variety of Lidar technologies, and infrared imaging spectrometers.
NASA Astrophysics Data System (ADS)
Nowak-Lovato, K.
2014-12-01
Seepage from enhanced oil recovery, carbon storage, and natural gas sites can emit trace gases such as carbon dioxide, methane, and hydrogen sulfide. Trace gas emissions at these locations demonstrate unique light stable isotope signatures that provide information to enable source identification of the material. Light stable isotope detection through surface monitoring offers the ability to distinguish between trace gases emitted from sources such as biological systems (fertilizers and wastes), mineral systems (coal seams), or liquid organic systems (oil and gas reservoirs). To make light stable isotope measurements, we employ the ultra-sensitive technique of frequency modulation spectroscopy (FMS). FMS is an absorption technique with sensitivity enhancements of approximately 100-1000x over standard absorption spectroscopy, with the advantage of providing stable isotope signature information. We have developed an integrated in situ (point source) system that measures carbon dioxide, methane and hydrogen sulfide with isotopic resolution and enhanced sensitivity. The in situ instrument involves the continuous collection of air and records the stable isotope ratio for the gas being detected. We have included in-line flask collection points to obtain gas samples for validation of isotopic concentrations using our in-house isotope ratio mass spectrometry (IRMS). We present calibration curves for each species addressed above to demonstrate the sensitivity and accuracy of the system. We also show field deployment data demonstrating the capabilities of the system in making live dynamic measurements from an active source.
Solar energy to meet the nation's energy needs
NASA Technical Reports Server (NTRS)
Rom, F. E.; Thomas, R. L.
1973-01-01
Discussion of the possibilities afforded by solar energy as one of the alternative energy sources capable of taking the place of dwindling oil and gas reserves. Solar energy, being a nondepleting, clean source of energy, is shown to be capable of providing energy in all the forms in which it is used today. Steps taken toward providing innovative solutions that are economically competitive with other systems are briefly reviewed.
An 'X-banded' Tidbinbilla interferometer
NASA Technical Reports Server (NTRS)
Batty, Michael J.; Gardyne, R. G.; Gay, G. J.; Jauncy, David L.; Gulkis, S.; Kirk, A.
1986-01-01
The recent upgrading of the Tidbinbilla two-element interferometer to simultaneous S-band (2.3 GHz) and X-band (8.4 GHz) operation has provided a powerful new astronomical facility for weak radio source measurement in the Southern Hemisphere. The new X-band system has a minimum fringe spacing of 38 arcsec, and about the same positional measurement capability (approximately 2 arcsec) and sensitivity (1 s rms noise of 10 mJy) as the previous S-band system. However, the far lower confusion limit will allow detection and accurate positional measurements for sources as weak as a few millijanskys. This capability will be invaluable for observations of radio stars, X-ray sources and other weak, compact radio sources.
Baumann, Eva; Czerwinski, Fabian
2017-01-01
Background: Online health information-seeking behavior (OHISB) is currently a widespread and common behavior that has been described as an important prerequisite of empowerment and health literacy. Although demographic factors such as socioeconomic status (SES), age, and gender have been identified as important determinants of OHISB, research is limited regarding the gender-specific motivational determinants of OHISB and differences between women and men in the use of online resources for health information purposes.
Objective: The aim of this study was to identify gender-specific determinants and patterns of OHISB by analyzing data from a representative German sample of adults (N=1728), with special attention to access and frequency of use as well as topics and sources of OHISB.
Methods: We employed a 2-step analysis: after exploring differences between users and nonusers of online health information using logistic regression models, we highlighted gender-specific determinants of the frequency of OHISB by applying zero-truncated negative binomial models.
Results: Age (odds ratio, OR for females=0.97, 95% CI 0.96-0.99) and degree of satisfaction with one's general practitioner (GP) (OR for males=0.73, 95% CI 0.57-0.92) were gender-specific determinants of access to OHISB. Regarding the frequency of OHISB, daily Internet use (incidence rate ratio, IRR=1.67, 95% CI 1.19-2.33) and a strong interest in health topics (IRR=1.45, 95% CI 1.19-1.77) were revealed to be more important predictors than SES (IRR for high SES=1.25, 95% CI 0.91-1.73).
Conclusions: Users indicate that the Internet seems to be capable of providing a valuable source of informational support and patient empowerment. Increasing the potential value of the Internet as a source for health literacy and patient empowerment requires need-oriented and gender-specific health communication efforts, media, and information strategies. PMID:28377367
Off-Gas Adsorption Model Capabilities and Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, Kevin L.; Welty, Amy K.; Law, Jack
2016-03-01
Off-gas treatment is required to reduce emissions from aqueous fuel reprocessing. Evaluating the products of innovative gas adsorption research requires increased computational simulation capability to more effectively transition from fundamental research to operational design. Early modeling efforts produced the Off-Gas SeParation and REcoverY (OSPREY) model that, while efficient in terms of computation time, was of limited value for complex systems. However, the computational and programming lessons learned in development of the initial model were used to develop Discontinuous Galerkin OSPREY (DGOSPREY), a more effective model. Initial comparisons between OSPREY and DGOSPREY show that, while OSPREY does reasonably well to capture the initial breakthrough time, it displays far too much numerical dispersion to accurately capture the real shape of the breakthrough curves. DGOSPREY is a much better tool as it utilizes a more stable set of numerical methods. In addition, DGOSPREY has shown the capability to capture complex, multispecies adsorption behavior, while OSPREY currently only works for a single adsorbing species. This capability makes DGOSPREY ultimately a more practical tool for real-world simulations involving many different gas species. While DGOSPREY has initially performed very well, there is still need for improvement. The current state of DGOSPREY does not include any micro-scale adsorption kinetics and therefore assumes instantaneous adsorption. This is a major source of error in predicting water vapor breakthrough, because the kinetics of that adsorption mechanism is particularly slow. However, this deficiency can be remedied by building kinetic kernels into DGOSPREY. Another source of error in DGOSPREY stems from gaps in single-species isotherm data, such as those for Kr and Xe. Since isotherm data for each gas are currently available at only a single temperature, the model is unable to predict adsorption at temperatures outside the set of data currently available. Thus, in order to improve the predictive capabilities of the model, there is a need for more single-species adsorption isotherms at different temperatures, in addition to extending the model to include adsorption kinetics. This report provides background information about the modeling process and a path forward for further model improvement in terms of accuracy and user interface.
Plasma Propulsion Testing Capabilities at Arnold Engineering Development Center
NASA Technical Reports Server (NTRS)
Polzin, Kurt A.; Dawbarn, Albert; Moeller, Trevor
2007-01-01
This paper describes the results of a series of experiments aimed at quantifying the plasma propulsion testing capabilities of a 12-ft diameter vacuum facility (12V) at USAF Arnold Engineering Development Center (AEDC). Vacuum is maintained in the 12V facility by cryogenic panels lining the interior of the chamber. The pumping capability of these panels was shown to be great enough to support plasma thrusters operating at input electrical power >20 kW. In addition, a series of plasma diagnostics inside the chamber allowed for measurement of plasma parameters at different spatial locations, providing information regarding the chamber's effect on the global plasma thruster flowfield. The plasma source used in this experiment was a Hall thruster manufactured by Busek Co. The thruster was operated at up to 20 kW steady-state power in both a lower-current and a higher-current mode. The vacuum level in the chamber never rose above 9 x 10(exp -6) torr during the course of testing. Langmuir probes, ion flux probes, and Faraday cups were used to quantify the plasma parameters in the chamber. We present the results of these measurements and estimates of pumping speed based on the background pressure level and thruster propellant mass flow rate.
Imaging of blood cells based on snapshot Hyper-Spectral Imaging systems
NASA Astrophysics Data System (ADS)
Robison, Christopher J.; Kolanko, Christopher; Bourlai, Thirimachos; Dawson, Jeremy M.
2015-05-01
Snapshot hyper-spectral imaging systems are capable of capturing several spectral bands simultaneously, offering coregistered images of a target. With appropriate optics, these systems are potentially able to image blood cells in vivo as they flow through a vessel, eliminating the need for a blood draw and sample staining. Our group has evaluated the capability of a commercial snapshot hyper-spectral imaging system, the Arrow system from Rebellion Photonics, in differentiating between white and red blood cells on unstained blood smear slides. We evaluated the imaging capabilities of this hyperspectral camera attached to a microscope at varying objective powers and illumination intensities. Hyperspectral data consisting of 25 bands of 443x313 pixels with ~3 nm spacing were captured over the range of 419 to 494 nm. Open-source hyper-spectral data cube analysis tools, used primarily in Geographic Information Systems (GIS) applications, indicate that white blood cell features are most prominent in the 428-442 nm band for blood samples viewed under 20x and 50x magnification over a varying range of illumination intensities. These images could potentially be used in subsequent automated white blood cell segmentation and counting algorithms for performing in vivo white blood cell counting.
Goodall, Ken; Ward, Paul; Newman, Lareen
2010-01-01
Governments and businesses are increasingly using the internet and mobile telephones to disseminate information about services and products. However, not all population groups have the resources and capabilities to support equality of access to and use of these technologies. While Australia's ageing population receives attention in a wide variety of literatures, the ageing migrant population has received very little attention in relation to understanding their place in the 'digital divide'. It is not known how this group gathers information used in everyday living, or what role the internet or mobile phones play within this. At a time when the population is ageing and there is an increasing use of the internet to deliver services and information, there is little research on the effects of ethnicity, migration, socio-economic status, education or gender of older people on the use of information and communication technology (ICT). Addressing this should be a priority in Australia, which has an old and ageing population that includes many post-war migrants from non-English speaking European countries.

To analyse the views of older migrants living in South Australia with respect to their current information sources, their use of ICT and any barriers and enablers to future use of ICT for accessing health information, a qualitative study was conducted employing eight focus groups involving 43 older Italian and Greek migrants living in the community in metropolitan or regional settings in South Australia. Interviews were held and audio-recorded and the English-language components transcribed. Transcriptions were analysed manually using a grounded theory approach.

Older migrants do not use ICT to a great extent to access information in their everyday lives, with many expressing no interest in learning how to do so. However, they access the information they need to function in society with a desired quality of life from multiple sources by various means. Sources include electronic and print media from Australia and their home countries, family and acquaintances, and government departments or service providers. Many expressed a preference for receiving information as printed material or directly from another person.

Governments or primary healthcare organisations planning to make health information solely available via ICT should be aware that doing so may lead to an increase in 'information exclusion' and the formation of functional knowledge deficits for older migrants. At the moment at least, our participants do not perceive any functional knowledge deficits, as they engage multiple sources to access the information they need for everyday life. We recommend that governments and healthcare organisations evaluate the appropriateness of using ICT to directly provide information to older migrants and consider non-digital means or the engagement of 'information brokers' when communicating with groups identified as low or non-users of ICT.
Context-rich semantic framework for effective data-to-decisions in coalition networks
NASA Astrophysics Data System (ADS)
Grueneberg, Keith; de Mel, Geeth; Braines, Dave; Wang, Xiping; Calo, Seraphin; Pham, Tien
2013-05-01
In a coalition context, data fusion involves combining soft (e.g., field reports, intelligence reports) and hard (e.g., acoustic, imagery) sensory data such that the resulting output is better than it would have been had the data been taken individually. However, due to the lack of explicit semantics attached to such data, it is difficult to automatically disseminate the right contextual data and put it in the hands of decision makers. In order to understand the data, explicit meaning needs to be added by categorizing and/or classifying the data in relation to each other and to base reference sources. In this paper, we present a semantic framework that provides automated mechanisms to expose real-time raw data effectively by presenting the information appropriate to a given situation so that an informed decision can be made effectively. The system utilizes the controlled natural language capabilities provided by the ITA (International Technology Alliance) Controlled English (CE) toolkit to provide a human-friendly semantic representation of messages, so that the messages can be directly processed in human/machine hybrid environments. The Real-time Semantic Enrichment (RTSE) service adds relevant contextual information to raw data streams from domain knowledge bases using declarative rules. The rules define how the added semantics and context information are derived and stored in a semantic knowledge base. The software framework exposes contextual information from a variety of hard and soft data sources in a fast, reliable manner so that an informed decision can be made using semantic queries in intelligent software systems.
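A schematic of such rule-based enrichment of raw messages is sketched below in ordinary Python, purely for illustration; the actual RTSE service expresses its declarative rules in ITA Controlled English and stores the derived facts in a semantic knowledge base, so the rule table and message fields here are assumptions.

```python
# Each rule pairs a condition on the raw message with a context fact to add.
RULES = [
    (lambda m: m.get("sensor") == "acoustic" and m.get("level_db", 0) > 90,
     "loud acoustic event - candidate for fusion with field reports"),
    (lambda m: m.get("source") == "field_report",
     "soft (human-generated) information"),
]

def enrich(message: dict) -> dict:
    """Attach derived context facts to one raw data-stream message."""
    message["context"] = [fact for cond, fact in RULES if cond(message)]
    return message

print(enrich({"sensor": "acoustic", "level_db": 95, "lat": 51.5, "lon": -0.1}))
```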
NASA Astrophysics Data System (ADS)
Bedrina, T.; Parodi, A.; Quarati, A.; Clematis, A.
2012-06-01
It is widely recognised that effective exploitation of Information and Communication Technologies (ICT) is an enabling factor for major advancements in Hydro-Meteorological Research (HMR). Recently, much attention has been devoted to the use of ICT in HMR activities, e.g. to facilitate data exchange and integration and to improve computational capabilities, and consequently model resolution and quality. ICT has demonstrated that it is possible to extend monitoring networks by integrating sensors and other sources of data managed by volunteer communities. These networks are constituted by peers that span a wide portion of the territory in many countries. The peers are "location aware" in the sense that they provide information strictly related to their geospatial location. The coverage of these networks is, in general, not uniform, and the locations of peers may follow a random distribution. The ICT tools used to set up such networks are lightweight and user friendly, permitting peers to join without specialised ICT knowledge. In this perspective, it is of increasing interest for HMR activities to exploit Personal Weather Station (PWS) networks, capable of providing almost real-time, location-aware weather data. Moreover, several big players of the web arena now provide world-wide backbones suitable for presenting location-aware information on detailed maps, obtained by mashing up data from different sources; this is the case, for example, with Google Earth and Google Maps. This paper presents the design of a mashup application aimed at aggregating, refining and visualizing near real-time hydro-meteorological datasets. In particular, we focused on the integration of instantaneous precipitation depths registered both by widespread semi-professional weather stations and by official ones. This sort of information is highly important and useful in decision support systems and Civil Protection applications. As a significant case study, we analysed the rainfall data observed during the severe flash-flood event of 4 November 2011 over the Liguria region, Italy. The joint use of the official observation network with PWS networks and meteorological radar made evident a finger-like convective structure.
Luminescent light source for laser pumping and laser system containing same
Hamil, Roy A.; Ashley, Carol S.; Brinker, C. Jeffrey; Reed, Scott; Walko, Robert J.
1994-01-01
The invention relates to a pumping lamp for use with lasers comprising a porous substrate loaded with a component capable of emitting light upon interaction of the component with exciting radiation and a source of exciting radiation. Preferably, the pumping lamp comprises a source of exciting radiation, such as an electron beam, and an aerogel or xerogel substrate loaded with a component capable of interacting with the exciting radiation, e.g., a phosphor, to produce light, e.g., visible light, of a suitable band width and of a sufficient intensity to generate a laser beam from a laser material.
ABM Drag_Pass Report Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
dragREPORT software was developed in parallel with abmREPORT, which is described in the preceding article; both programs were built on the capabilities created during that process. This tool generates a drag_pass report that summarizes vital information from the MRO aerobraking drag_pass build process, both to facilitate sequence reviews and to provide a high-level summary of the sequence for mission management. The script extracts information from the ENV, SSF, FRF, SCMFmax, and OPTG files, presenting it in a single, easy-to-check report providing the majority of parameters needed for cross-check and verification as part of the sequence review process. Prior to dragREPORT, all the needed information was spread across a number of different files, each in a different format. This software is a Perl script that extracts vital summary information and build-process details from a number of source files into a single, concise report format used to aid the MPST sequence review process and to provide a high-level summary of the sequence for mission management reference. This software could be adapted for future aerobraking missions to provide similar reports, review, and summary information.
US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access
NASA Astrophysics Data System (ADS)
Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.
2012-04-01
The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO 19139 and the OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets, along with a wide variety of other information and non-information resources, are registered in the catalog system. Metadata for registration is produced by various workflows, including harvest from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include an ESRI ArcMap extension that allows catalog search and data acquisition from map services, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
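Querying such a CSW 2.0.2 catalog is straightforward with OWSLib, as sketched below; the endpoint URL is a placeholder, not the actual USGIN/NGDS catalog address.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

# Placeholder endpoint; substitute the real catalog's CSW URL.
csw = CatalogueServiceWeb("http://catalog.example.org/geoportal/csw")

# Full-text search for geothermal records via a CSW GetRecords request.
query = PropertyIsLike("csw:AnyText", "%geothermal%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, rec.title)
```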
Combined corona discharge and UV photoionization source for ion mobility spectrometry.
Bahrami, Hamed; Tabrizchi, Mahmoud
2012-08-15
An ion mobility spectrometer is described which is equipped with two non-radioactive ion sources, namely an atmospheric pressure photoionization source and a corona discharge ionization source. The two sources can not only run individually but are also capable of operating simultaneously. For photoionization, a UV lamp was mounted parallel to the axis of the ion mobility cell; the corona discharge electrode was mounted perpendicular to the UV radiation. The total ion current from the photoionization source was characterized as a function of lamp current, sample flow rate, and drift field. Simultaneous operation of the two ionization sources was investigated by recording ion mobility spectra of selected samples. The design allows one to observe peaks from either the corona discharge or photoionization individually or simultaneously, making it possible to accurately compare peaks in the ion mobility spectra from each individual source. Finally, the instrument's capability for discriminating two peaks appearing at approximately identical drift times using each individual ionization source is demonstrated.
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Daigle, Matthew; Goebel, Kai; Spirkovska, Lilly; Sankararaman, Shankar; Ossenfort, John; Kulkarni, Chetan; McDermott, William; Poll, Scott
2016-01-01
As new operational paradigms and additional aircraft are being introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have an automated framework to provide an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of safety will evolve into the future given the anticipated flight plans, weather forecast, predicted health of assets in the airspace, and so on. Towards this end, as part of our earlier work, we formulated the Real-Time Safety Monitoring (RTSM) framework for monitoring and predicting the state of safety and predicting unsafe events. In our previous work, the RTSM framework was demonstrated in simulation on three different constructed scenarios. In this paper, we further develop the framework and demonstrate it on real flight data from multiple data sources. Specifically, the flight data are obtained through the Shadow Mode Assessment using Realistic Technologies for the National Airspace System (SMART-NAS) Testbed, which serves as a central point of collection, integration, and access of information from these different data sources. By testing and evaluating using real-world scenarios, we may accelerate the acceptance of the RTSM framework towards deployment. In this paper we demonstrate the framework's capability not only to estimate the state of safety in the NAS, but also to predict the time and location of unsafe events such as a loss of separation between two aircraft or an aircraft encountering convective weather. The experimental results highlight the capability of the approach and the kind of information that can be provided to operators to improve their situational awareness in the context of safety.
Laireiter, Anton Rupert
2017-01-01
Background In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Objective Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. Methods In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. Results The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. Conclusions With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. PMID:28729233
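To make the idea of deriving priority scores from temporal and synchronous symptom associations concrete, here is a small illustrative sketch. The scoring rule (summed lagged correlations) is our own toy construction on invented data, not the DynAMo algorithm itself.

import numpy as np
import pandas as pd

# Toy daily self-report series for three symptoms (arbitrary scale).
rng = np.random.default_rng(0)
n = 60
mood = rng.normal(5, 1, n)
sleep = np.roll(mood, 1) + rng.normal(0, 0.5, n)   # sleep tracks yesterday's mood
worry = rng.normal(4, 1, n)
data = pd.DataFrame({"mood": mood, "sleep": sleep, "worry": worry})

def priority_scores(df, max_lag=1):
    """Score each symptom by how strongly it relates to the others at lags
    0..max_lag; higher scores suggest earlier treatment targets."""
    scores = {}
    for src in df.columns:
        s = 0.0
        for dst in df.columns:
            if dst == src:
                continue
            for lag in range(max_lag + 1):
                s += abs(df[src].shift(lag).corr(df[dst]))
        scores[src] = s
    return pd.Series(scores).sort_values(ascending=False)

print(priority_scores(data))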
NASA Astrophysics Data System (ADS)
Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.
2009-04-01
The development of new technologies aimed at enhancing web applications with dynamic data access was also the starting point for the development of geospatial web applications. By means of these technologies, web applications embed the capability of presenting geographical representations of geo-information. The introduction of the state-of-the-art technologies known as Web Services enables web applications to interoperate, i.e., to process requests from each other via a network. Throughout the oceanographic community in particular, modern geographical information systems based on geospatial Web Services are now being developed or will be developed in the near future, with capabilities for managing the information itself entirely through web-based geographical interfaces. The exploitation of the HNODC database through a web-based application enhanced with Web Services built on open source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), a national public oceanographic data provider and at the same time a member of the international network of oceanographic data centers (IOC/IODE), owns a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing over 300,000 station data records covering physical, chemical, and biological oceanographic information. A modern web application that lets end users worldwide explore and navigate HNODC data through an interface capable of presenting geographical representations of the geo-information is today a fact. The application is built from state-of-the-art software components and tools such as: • Geospatial and non-spatial Web Services mechanisms. • Geospatial open source tools for the creation of dynamic geographical representations. • Communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis. At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web Services systems. Roughly, the architecture can be denoted as follows: • At the back end, the open source PostgreSQL DBMS stands as the data storage mechanism, with more than one database schema owing to the separation of geospatial and non-geospatial data. • UMN MapServer and GeoServer are the mechanisms for representing geospatial data via Web Map Service (WMS), querying and navigating geospatial and metadata information via Web Feature Service (WFS), and, in the near future, transacting and processing new or existing geospatial data via Web Processing Service (WPS). • Mapbender, a geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating, and querying OGC-compliant web map and feature services (WMS and transactional WFS). • Apache and Tomcat stand as the Web Service middle layers. • Apache Axis, with its embedded implementation of the SOAP protocol ("Simple Object Access Protocol"), acts as the non-spatial Web Services mechanism.
These modules of the platform are still under development, but their implementation will be completed in the near future. • A new web user interface for the end user, based on an enhanced and customized version of a Mapbender GUI, a powerful Web Services client. For HNODC, the interoperability of Web Services is the big advantage of the developed platform, since it will be capable of acting in the future as both provider and consumer of Web Services: • either as a data product provider for external SOA platforms, • or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced. A great paradigm of data management integration and dissemination through such technologies is the European Union's research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating large and diverse data sets and to enhance the currently existing infrastructures with Web Services. Furthermore, when the Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to support any kind of GIS functionality for consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
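The WMS/WFS access pattern described above can be exercised from any OWS-capable client. A minimal sketch with the Python OWSLib package follows; the service URL, layer name, and bounding box are placeholders, since the actual HNODC endpoints are not given in the text.

# Fetch a map image from a WMS endpoint and list WFS feature types (OWSLib).
# URL and layer names are hypothetical placeholders.
from owslib.wms import WebMapService
from owslib.wfs import WebFeatureService

wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))  # advertised layers

img = wms.getmap(layers=["hnodc:stations"],        # hypothetical layer
                 styles=[""],
                 srs="EPSG:4326",
                 bbox=(19.0, 34.0, 30.0, 42.0),    # Aegean/Ionian region
                 size=(600, 400),
                 format="image/png")
with open("stations.png", "wb") as f:
    f.write(img.read())

wfs = WebFeatureService("https://example.org/geoserver/wfs", version="1.1.0")
print(list(wfs.contents))  # queryable feature types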
NASA Astrophysics Data System (ADS)
De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel
2015-04-01
Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g., OrfeoToolbox, OpenCV, and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014, Bishkek, Kyrgyz Republic. [2] UNISDR, "Living with Risk", Geneva, Switzerland, 2004. [3] P. Bisch, E. Carvalho, H. Degree, P. Fajfar, M. Fardis, P. Franchin, M. Kreslin, A. Pecker, "Eurocode 8: Seismic Design of Buildings", Lisbon, 2011. (SENSUM: www.sensum-project.eu, grant number: 312972) (RASOR: www.rasor-project.eu, grant number: 606888)
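As a flavor of the low-level building blocks such a workflow chains together, the sketch below computes a simple built-up proxy (a normalized band difference) with rasterio and NumPy. The band positions and the 0.1 threshold are illustrative assumptions, not the SENSUM algorithms themselves.

import numpy as np
import rasterio

# Normalized-difference built-up-style index from two bands of a GeoTIFF.
# Band indices and the threshold are illustrative, not SENSUM's values.
with rasterio.open("scene.tif") as src:
    swir = src.read(5).astype("float64")   # assumed SWIR band position
    nir = src.read(4).astype("float64")    # assumed NIR band position

ndbi = (swir - nir) / np.maximum(swir + nir, 1e-9)  # avoid divide-by-zero
built_up_mask = ndbi > 0.1
print("built-up fraction:", built_up_mask.mean())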
Snow, Mathew S; Snyder, Darin C; Clark, Sue B; Kelley, Morgan; Delmore, James E
2015-03-03
Radiometric and mass spectrometric analyses of Cs contamination in the environment can reveal the location of Cs emission sources, release mechanisms, modes of transport, prediction of future contamination migration, and attribution of contamination to specific generator(s) and/or process(es). The Subsurface Disposal Area (SDA) at Idaho National Laboratory (INL) represents a complicated case study for demonstrating the current capabilities and limitations of environmental Cs analyses. (137)Cs distribution patterns, (135)Cs/(137)Cs isotope ratios, known Cs chemistry at this site, and historical records enable narrowing the list of possible emission sources and release events to a single source and event, with the SDA identified as the emission source and flood transport of material from within Pit 9 and Trench 48 as the primary release event. These data combined allow refining the possible number of waste generators from dozens to a single generator, with INL on-site research and reactor programs identified as the most likely waste generator. A discussion of the ultimate limitations of the information that (135)Cs/(137)Cs ratios alone can provide is presented, including (1) uncertainties in the exact date of the fission event and (2) the possibility of mixing between different Cs source terms (including nuclear weapons fallout and a source of interest).
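Because (137)Cs (half-life about 30.1 years) decays far faster than (135)Cs (half-life about 2.3 million years), a measured (135)Cs/(137)Cs ratio grows with time since fission, which is what lets the ratio constrain a source date. A back-of-envelope sketch of that decay correction follows; the example ratio and elapsed time are invented for illustration.

import math

T12_CS137_Y = 30.08          # 137Cs half-life, years
T12_CS135_Y = 2.3e6          # 135Cs half-life, years

def ratio_at_fission(measured_ratio, years_since_fission):
    """Decay-correct a measured 135Cs/137Cs atom ratio back to fission time."""
    lam137 = math.log(2) / T12_CS137_Y
    lam135 = math.log(2) / T12_CS135_Y
    # The 137Cs denominator decays away, so the measured ratio inflates by
    # exp((lam137 - lam135) * t) relative to the ratio at fission time.
    return measured_ratio * math.exp(-(lam137 - lam135) * years_since_fission)

# Invented example: a ratio of 2.0 measured 50 years after the fission event.
print(ratio_at_fission(2.0, 50.0))   # about 0.63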
Understanding Learning and Learning Design in MOOCs: A Measurement-Based Interpretation
ERIC Educational Resources Information Center
Milligan, Sandra; Griffin, Patrick
2016-01-01
The paper describes empirical investigations of how participants in a MOOC learn, and the implications for MOOC design. A learner capability to generate higher order learning in MOOCs--called crowd-sourced learning (C-SL) capability--was defined from learning science literature. The capability comprised a complex yet interrelated array of…
Software Capability Evaluation (SCE) Version 2.0 Implementation Guide
1994-02-01
Front-matter excerpt: Figure 3-1, SCE Usage Decision Making Criteria; Figure 3-2, Estimated SCE Labor For One Source Selection. Surviving text fragments indicate that SCE results are incorporated into the source selection sponsoring organization's technical/management team for incorporation into acquisition decisions, which weigh expertise, past performance, and organizational capacity, and that the Capability Maturity Model (CMM) supplies the basic concepts on which SCE is based.
2006-07-13
[Briefing excerpt] Technology Services; Integrated Systems & Solutions: 30,000 employees; 5 principal businesses organized into 3 major... Surviving fragments reference data sources, a search engine, a notification policy, DSCC/IST and DMEA leads, industry and organic capabilities, pro-active upgrades, TLCSM EC, DAU distance learning modules, and a working group strategic plan.
New evaporator station for the center for accelerator target science
NASA Astrophysics Data System (ADS)
Greene, John P.; Labib, Mina
2018-05-01
As part of an equipment grant provided by DOE-NP for the Center for Accelerator Target Science (CATS) initiative, the procurement of a new electron-beam, high-vacuum deposition system was identified as a priority to ensure reliable and continued availability of high-purity targets. The apparatus is designed to contain two electron-beam guns: a standard 4-pocket 270° geometry source as well as an electron bombardment source. The acquisition of this new system allows for the replacement of two outdated and aging vacuum evaporators. Also included is an additional thermal boat source, enhancing our capability within this deposition unit. Recommended specifications for this system included an automated high-vacuum pumping station and a deposition chamber with a rotating and heated substrate holder for uniform coating capabilities, incorporating computer-controlled state-of-the-art thin film technologies. Design specifications, enhanced capabilities, and the necessary mechanical modifications for our target work are discussed.
30 CFR 56.4500 - Heat sources.
Code of Federal Regulations, 2011 CFR
2011-07-01
Title 30 Mineral Resources, § 56.4500 Heat sources (2011-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 57.4500 - Heat sources.
Code of Federal Regulations, 2011 CFR
2011-07-01
Title 30 Mineral Resources, § 57.4500 Heat sources (2011-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 57.4500 - Heat sources.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 30 Mineral Resources, § 57.4500 Heat sources (2013-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 57.4500 - Heat sources.
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 30 Mineral Resources, § 57.4500 Heat sources (2012-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 56.4500 - Heat sources.
Code of Federal Regulations, 2012 CFR
2012-07-01
Title 30 Mineral Resources, § 56.4500 Heat sources (2012-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 56.4500 - Heat sources.
Code of Federal Regulations, 2013 CFR
2013-07-01
Title 30 Mineral Resources, § 56.4500 Heat sources (2013-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 57.4500 - Heat sources.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 30 Mineral Resources, § 57.4500 Heat sources (2014-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
30 CFR 56.4500 - Heat sources.
Code of Federal Regulations, 2014 CFR
2014-07-01
Title 30 Mineral Resources, § 56.4500 Heat sources (2014-07-01 edition). MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; METAL AND NONMETAL MINE SAFETY AND... Installation/construction/maintenance. Heat sources capable of producing combustion...
Optical tracking of nanoscale particles in microscale environments
NASA Astrophysics Data System (ADS)
Mathai, P. P.; Liddle, J. A.; Stavis, S. M.
2016-03-01
The trajectories of nanoscale particles through microscale environments record useful information about both the particles and the environments. Optical microscopes provide efficient access to this information through measurements of light in the far field from nanoparticles. Such measurements necessarily involve trade-offs in tracking capabilities. This article presents a measurement framework, based on information theory, that facilitates a more systematic understanding of such trade-offs to rationally design tracking systems for diverse applications. This framework includes the degrees of freedom of optical microscopes, which determine the limitations of tracking measurements in theory. In the laboratory, tracking systems are assemblies of sources and sensors, optics and stages, and nanoparticle emitters. The combined characteristics of such systems determine the limitations of tracking measurements in practice. This article reviews this tracking hardware with a focus on the essential functions of nanoparticles as optical emitters and microenvironmental probes. Within these theoretical and practical limitations, experimentalists have implemented a variety of tracking systems with different capabilities. This article reviews a selection of apparatuses and techniques for tracking multiple and single particles by tuning illumination and detection, and by using feedback and confinement to improve the measurements. Prior information is also useful in many tracking systems and measurements, which apply across a broad spectrum of science and technology. In the context of the framework and review of apparatuses and techniques, this article reviews a selection of applications, with particle diffusion serving as a prelude to tracking measurements in biological, fluid, and material systems, fabrication and assembly processes, and engineered devices. In so doing, this review identifies trends and gaps in particle tracking that might influence future research.
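One widely used rule of thumb for the trade-offs discussed above is the localization-precision estimate from the single-particle tracking literature (Thompson, Larson, and Webb, 2002), in which precision improves with photon count and degrades with background and pixelation. The sketch below evaluates that estimate; presenting it as representative of the framework's inputs is our choice, not the article's.

import math

def localization_precision(s_nm, a_nm, b_photons, n_photons):
    """Thompson-Larson-Webb (2002) estimate of one-axis localization
    precision (nm) for a Gaussian PSF: s = PSF std dev, a = pixel size,
    b = background noise (photons/pixel), N = collected signal photons."""
    var = (s_nm**2 + a_nm**2 / 12.0) / n_photons \
          + 8.0 * math.pi * s_nm**4 * b_photons**2 / (a_nm**2 * n_photons**2)
    return math.sqrt(var)

# Example: 150 nm PSF width, 100 nm pixels, 2 background photons, 10^4 photons.
print(localization_precision(150.0, 100.0, 2.0, 1e4))  # about 1.5 nm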
NASA Astrophysics Data System (ADS)
Yu, Lingfeng; Liu, Gangjun; Rubinstein, Marc; Saidi, Arya; Wong, Brian J. F.; Chen, Zhongping
2009-11-01
Optical coherence tomography (OCT) is an evolving noninvasive imaging modality that has been used to image the human larynx during surgical endoscopy. The design of a long gradient index (GRIN) lens-based probe capable of capturing images of the human larynx by use of swept-source OCT during a typical office-based laryngoscopy examination is presented. In vivo OCT imaging of the human larynx is demonstrated with a rate of 40 frames per second. Dynamic vibration of the vocal folds is recorded to provide not only high-resolution cross-sectional tissue structures but also vibration parameters, such as the vibration frequency and magnitude of the vocal cords, which provides important information for clinical diagnosis and treatment, as well as fundamental research of the voice itself. Office-based OCT is a promising imaging modality to study the larynx for physicians in otolaryngology.
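Extracting a vibration frequency from a recorded fold-position trace is, at its core, a spectral-estimation step. The sketch below illustrates that generic step on synthetic data; the sampling rate and vibration frequency are invented and do not describe the OCT system's actual acquisition parameters.

import numpy as np

# Estimate the dominant vibration frequency of a displacement trace via FFT.
# The 1 kHz sample rate and 120 Hz synthetic vibration are illustrative only.
fs = 1000.0                               # samples per second
t = np.arange(0, 1.0, 1.0 / fs)
trace = 0.1 * np.sin(2 * np.pi * 120 * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
print("dominant frequency: %.1f Hz" % freqs[spectrum.argmax()])  # ~120 Hz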
Stereoscopic augmented reality with pseudo-realistic global illumination effects
NASA Astrophysics Data System (ADS)
de Sorbier, Francois; Saito, Hideo
2014-03-01
Recently, augmented reality has become very popular and has appeared in our daily life in gaming, guidance systems, and mobile phone applications. However, inserting objects so that their appearance seems natural is still an issue, especially in an unknown environment. This paper presents a framework that demonstrates the capabilities of the Kinect for convincing augmented reality in an unknown environment. Rather than pre-computing a reconstruction of the scene, as most previous methods propose, we propose a dynamic capture of the scene that allows adapting to live changes of the environment. Our approach, based on the update of an environment map, can also detect the position of the light sources. Combining information from the environment map, the light sources, and the camera tracking, we can display virtual objects on stereoscopic devices with global illumination effects such as diffuse and mirror reflections, refractions, and shadows in real time.
Putora, Paul Martin; Oldenburg, Jan
2013-09-19
Occasionally, medical decisions have to be taken in the absence of evidence-based guidelines. Other sources can be drawn upon to fill in the gaps, including experience and intuition. Authorities or experts, with their knowledge and experience, may provide further input--known as "eminence-based medicine". Due to the Internet and digital media, interactions among physicians now take place at a higher rate than ever before. With the rising number of interconnected individuals and their communication capabilities, the medical community is obtaining the properties of a swarm. The way individual physicians act depends on other physicians; medical societies act based on their members. Swarm behavior might facilitate the generation and distribution of knowledge as an unconscious process. As such, "swarm-based medicine" may add a further source of information to the classical approaches of evidence- and eminence-based medicine. How to integrate swarm-based medicine into practice is left to the individual physician, but even this decision will be influenced by the swarm.
Reliability of self-reported weight and height among state bank employees.
Chor, D; Coutinho, E da S; Laurenti, R
1999-02-01
Self-reported weight and height were compared with direct measurements in order to evaluate the agreement between the two sources. Data were obtained from a cross-sectional study on health status of a probabilistic sample of 1,183 employees of a bank in Rio de Janeiro State, Brazil. Direct measurements were made of 322 employees. Differences between the two sources were evaluated using mean differences, limits of agreement, and the intraclass correlation coefficient (ICC). Men and women tended to underestimate their weight, while differences between self-reported and measured height were insignificant. Body mass index (BMI) mean differences were smaller than those observed for weight. The ICC was over 0.98 for weight and 0.95 for BMI, expressing close agreement. Combining a graphical method with the ICC may be useful in pilot studies to detect population groups capable of providing reliable information on weight and height, thus minimizing resources needed for field work.
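The agreement statistics used here (mean difference, limits of agreement, and the ICC) are straightforward to reproduce. The sketch below computes Bland-Altman limits and a one-way random-effects ICC(1,1) on invented paired values; this is one common ICC variant, not necessarily the one used in the study.

import numpy as np

# Paired self-reported vs measured weights (kg); invented example values.
self_rep = np.array([70.2, 81.5, 65.0, 90.1, 77.3, 68.8])
measured = np.array([71.0, 83.0, 65.4, 92.0, 78.1, 69.5])

diff = self_rep - measured
mean_diff = diff.mean()
loa = (mean_diff - 1.96 * diff.std(ddof=1),
       mean_diff + 1.96 * diff.std(ddof=1))          # Bland-Altman limits

# One-way random-effects ICC(1,1) from the paired measurements.
pairs = np.stack([self_rep, measured], axis=1)
n, k = pairs.shape
grand = pairs.mean()
msb = k * ((pairs.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
msw = ((pairs - pairs.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (msb - msw) / (msb + (k - 1) * msw)

print("mean difference: %.2f kg, limits of agreement: %s, ICC: %.3f"
      % (mean_diff, tuple(round(x, 2) for x in loa), icc))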
Automated motion artifact removal for intravital microscopy, without a priori information.
Lee, Sungon; Vinegoni, Claudio; Sebas, Matthew; Weissleder, Ralph
2014-03-28
Intravital fluorescence microscopy, through extended penetration depth and imaging resolution, provides the ability to image at cellular and subcellular resolution in live animals, presenting an opportunity for new insights into in vivo biology. Unfortunately, physiologically induced motion components due to respiration and cardiac activity are major sources of image artifacts and impose severe limitations on the effective imaging resolution that can ultimately be achieved in vivo. Here we present a novel imaging methodology capable of automatically removing motion artifacts during intravital microscopy imaging of organs and orthotopic tumors. The method is universally applicable to different laser scanning modalities including confocal and multiphoton microscopy, and offers artifact-free reconstructions independent of the physiological motion source and imaged organ. The methodology, which is based on raw data acquisition followed by image processing, is demonstrated here for both cardiac and respiratory motion compensation in mouse heart, kidney, liver, pancreas, and dorsal window chamber.
Effective Materials Property Information Management for the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju; Cebon, David; Arnold, Steve
2010-01-01
This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations, and government agencies. In part these are fuelled by the demands for higher efficiency in material testing, product design and development, and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality, and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore, the use of increasingly sophisticated nonlinear, anisotropic, and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately, material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export, and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc.; (iv) tools for searching, reporting, and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access, and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, and identification of gaps in data, as well as functionalities and business models to keep users returning to the source: to generate user demand to fuel database growth and maintenance.
Manifestations of "Capabilities Poverty" with Learners Attending Informal Settlement Schools
ERIC Educational Resources Information Center
Maarman, Rouaan
2009-01-01
In this study I use the notion of "capabilities poverty", as theorised by Sen, to examine the experiences of learners attending informal settlement schools in North-West Province, South Africa. Sen distinguishes between functionings (what people do or their ability to do something) and capabilities (various combinations of what people…
Integrated Information Increases with Fitness in the Evolution of Animats
Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph
2011-01-01
One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639
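Several of the candidate measures discussed are built from Shannon information between internal states and the environment. As a minimal self-contained illustration (a plug-in mutual information estimate, not the paper's specific integrated-information measure), consider:

import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * np.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Sensor states of an animat vs. a binary environment feature; toy data.
env = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1]
sensor = [0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0]  # mostly tracks env
print("I(sensor; env) = %.3f bits" % mutual_information(sensor, env))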
2012-01-01
Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440
Programmable LED-based integrating sphere light source for wide-field fluorescence microscopy.
Rehman, Aziz Ul; Anwer, Ayad G; Goldys, Ewa M
2017-12-01
Wide-field fluorescence microscopy commonly uses a mercury lamp, which has limited spectral capabilities. We designed and built a programmable integrating sphere light (PISL) source which consists of nine LEDs, light-collecting optics, a commercially available integrating sphere, and a baffle. The PISL source is tuneable in the range 365-490 nm with a uniform spatial profile and sufficient power at the objective to carry out spectral imaging. We retrofitted a standard inverted fluorescence microscope, a DM IRB (Leica), with a PISL source by mounting it together with a highly sensitive, low-noise CMOS camera. The capabilities of the setup have been demonstrated by carrying out multispectral autofluorescence imaging of live BV2 cells.
Performance testing of lidar receivers
NASA Technical Reports Server (NTRS)
Shams, M. Y.
1986-01-01
In addition to the considerations about the different types of noise sources, dynamic range, and linearity of a lidar receiver, one requires information about the pulse-shape-retaining capabilities of the receiver. For this purpose, relatively precise information about the height resolution, as well as the recovery time of the receiver due both to large transients and to fast changes in the received signal, is required. As more and more analog receivers using fast analog-to-digital converters and transient recorders will be used in future lidar systems, methods to test these devices are essential. The method proposed for this purpose is shown. Tests were carried out using LCW-10, LT-20, and FTVR-2 as optical parts of the optical pulse generator circuits. A commercial optical receiver, LNOR, and a transient recorder, VK 220-4, were parts of the receiver system.
Modular Chemical Descriptor Language (MCDL): Stereochemical modules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gakh, Andrei A; Burnett, Michael N; Trepalin, Sergei V.
2011-01-01
In our previous papers we introduced the Modular Chemical Descriptor Language (MCDL) for providing a linear representation of chemical information. A subsequent development was the MCDL Java Chemical Structure Editor, which is capable of drawing chemical structures from linear representations and generating MCDL descriptors from structures. In this paper we present MCDL modules and accompanying software that incorporate a unique representation of molecular stereochemistry based on Cahn-Ingold-Prelog and Fischer ideas in constructing stereoisomer descriptors. The paper also contains additional discussion regarding canonical representation of stereochemical isomers, and brief algorithm descriptions of the open source LINDES, Java applet, and Open Babel MCDL processing module software packages. Testing of the upgraded MCDL Java Chemical Structure Editor on compounds taken from several large and diverse chemical databases demonstrated satisfactory performance for storage and processing of stereochemical information in MCDL format.
NASA Technical Reports Server (NTRS)
Babiak-Vazquez, Adriana; Ruffaner, Lanie; Wear, Mary; Crucian Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary
2016-01-01
Space medicine presents unique challenges and opportunities for epidemiologists, such as the use of telemedicine during spaceflight. Medical capabilities aboard the International Space Station (ISS) are limited due to severe restrictions on power, volume, and mass. Consequently, inflight health information is based heavily on crewmember (CM) self-report of signs and symptoms, rather than formal diagnoses. While CMs are in flight, the primary source of crew health information is verbal communication between physicians and crewmembers. In 2010 NASA implemented the Lifetime Surveillance of Astronaut Health, an occupational surveillance program for the U.S. astronaut corps. This has shifted the epidemiological paradigm from one of tracking diagnoses based on traditional terrestrial clinical practice to one that incorporates symptomatology and may yield a more population-based understanding of early disease processes.
Imam, Neena; Barhen, Jacob
2009-01-01
For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
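A representative building-block algorithm for source localization is time-delay estimation between two sensors via generalized cross-correlation. The PHAT-weighted sketch below is a standard textbook formulation, offered as an illustration rather than as one of the specific algorithms ported to the optical-core processor.

import numpy as np

def gcc_phat_delay(sig, ref, fs):
    """Estimate the delay (s) of `sig` relative to `ref` with GCC-PHAT."""
    n = sig.size + ref.size
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    cross = SIG * np.conj(REF)
    cross /= np.maximum(np.abs(cross), 1e-12)       # PHAT whitening
    cc = np.fft.irfft(cross, n=n)
    shift = np.argmax(np.concatenate((cc[-(n // 2):], cc[: n // 2 + 1])))
    return (shift - n // 2) / fs

fs = 48_000
rng = np.random.default_rng(1)
ref = rng.standard_normal(4096)
sig = np.roll(ref, 24)                              # simulate a 24-sample lag
print("estimated delay: %.6f s" % gcc_phat_delay(sig, ref, fs))  # ~0.0005 s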
A design approach for small vision-based autonomous vehicles
NASA Astrophysics Data System (ADS)
Edwards, Barrett B.; Fife, Wade S.; Archibald, James K.; Lee, Dah-Jye; Wilde, Doran K.
2006-10-01
This paper describes the design of a small autonomous vehicle based on the Helios computing platform, a custom FPGA-based board capable of supporting on-board vision. Target applications for the Helios computing platform are those that require lightweight equipment and low power consumption. To demonstrate the capabilities of FPGAs in real-time control of autonomous vehicles, a 16-inch-long R/C monster truck was outfitted with a Helios board. The platform provided by such a small vehicle is ideal for testing and development. The proof-of-concept application for this autonomous vehicle was a timed race through an environment with obstacles. Given the size restrictions of the vehicle and its operating environment, the only feasible on-board sensor is a small CMOS camera. The single video feed is therefore the only source of information from the surrounding environment. The image is segmented and processed by custom logic in the FPGA, which also controls the direction and speed of the vehicle based on visual input.
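The segment-then-steer control loop described above can be caricatured in a few lines. The sketch below thresholds a grayscale frame and steers toward the centroid of bright "track" pixels; it is illustrative host-side logic of our own devising, not the Helios FPGA implementation, and the threshold is an assumption.

import numpy as np

def steer_from_frame(gray):
    """Toy vision loop: threshold a grayscale frame, then steer toward the
    centroid of 'track' pixels. Returns (steering in [-1, 1], throttle)."""
    mask = gray > 200                      # bright track markers; assumed cutoff
    if not mask.any():
        return 0.0, 0.0                    # nothing visible: stop
    cols = np.nonzero(mask)[1]
    centroid_x = cols.mean()
    width = gray.shape[1]
    steering = (centroid_x - width / 2) / (width / 2)
    throttle = 1.0 - 0.5 * abs(steering)   # slow down in turns
    return float(steering), float(throttle)

frame = np.zeros((120, 160), dtype=np.uint8)
frame[:, 100:110] = 255                    # synthetic bright stripe right of center
print(steer_from_frame(frame))             # positive steering: turn right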
The Bright, Artificial Intelligence-Augmented Future of Neuroimaging Reading.
Hainc, Nicolin; Federau, Christian; Stieltjes, Bram; Blatow, Maria; Bink, Andrea; Stippich, Christoph
2017-01-01
Radiologists are among the first physicians to be directly affected by advances in computer technology. Computers are already capable of analyzing medical imaging data, and with decades' worth of digital information available for training, will an artificial intelligence (AI) one day signal the end of the human radiologist? With the ever-increasing workload combined with the looming doctor shortage, radiologists will be pushed far beyond their current estimated 3 s allotted time-of-analysis per image; an AI with super-human capabilities might seem like a logical replacement. We feel, however, that AI will lead to an augmentation rather than a replacement of the radiologist. The AI will be relied upon to handle the tedious, time-consuming tasks of detecting and segmenting outliers while possibly generating new, unanticipated results that can then be used as sources of medical discovery. This will affect not only radiologists but all physicians and also researchers dealing with medical imaging. Therefore, we must embrace future technology and collaborate across disciplines to spearhead the next revolution in medicine.
High-resolution single-shot spectral monitoring of hard x-ray free-electron laser radiation
Makita, M.; Karvinen, P.; Zhu, D.; ...
2015-10-16
We have developed an on-line spectrometer for hard x-ray free-electron laser (XFEL) radiation based on a nanostructured diamond diffraction grating and a bent crystal analyzer. Our method provides high spectral resolution, interferes negligibly with the XFEL beam, and can withstand the intense hard x-ray pulses at high repetition rates of >100 Hz. The spectrometer is capable of providing shot-to-shot spectral information for the normalization of data obtained in scientific experiments and optimization of the accelerator operation parameters. We have demonstrated these capabilities of the setup at the Linac Coherent Light Source, in self-amplified spontaneous emission mode at full energy of >1 mJ with a 120 Hz repetition rate, obtaining a resolving power of E/δE > 3 × 10^4. The device was also used to monitor the effects of pulse duration down to 8 fs by analysis of the spectral spike width.
Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center
NASA Astrophysics Data System (ADS)
Mullinix, R.; Maddox, M. M.; Berrios, D.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Zheng, Y.
2012-12-01
Space weather affects virtually all of NASA's endeavors, from robotic missions to human exploration. Knowledge and prediction of space weather conditions are therefore essential to NASA operations. The diverse nature of currently available space environment measurements and modeling products compels the need for a single access point to such information. The Integrated Space Weather Analysis (iSWA) System provides this single point of access along with the capability to collect and catalog a vast range of sources, including both observational and model data. The NASA Goddard Space Weather Research Center heavily utilizes the iSWA System daily for research, space weather model validation, and forecasting for NASA missions. iSWA provides the capability to view and analyze near-real-time space weather data from anywhere in the world. This presentation will describe the technology behind the iSWA system and describe how to use the system for space weather research, forecasting, training, education, and sharing.
Tiret, Brice; Shestov, Alexander A.; Valette, Julien; Henry, Pierre-Gilles
2017-01-01
Most current brain metabolic models are not capable of taking into account the dynamic isotopomer information available from fine structure multiplets in 13C spectra, due to the difficulty of implementing such models. Here we present a new approach that allows automatic implementation of multi-compartment metabolic models capable of fitting any number of 13C isotopomer curves in the brain. The new automated approach also makes it possible to quickly modify and test new models to best describe the experimental data. We demonstrate the power of the new approach by testing the effect of adding separate pyruvate pools in astrocytes and neurons, and adding a vesicular neuronal glutamate pool. Including both changes reduced the global fit residual by half and pointed to dilution of label prior to entry into the astrocytic TCA cycle as the main source of glutamine dilution. The glutamate-glutamine cycle rate was particularly sensitive to changes in the model. PMID:26553273
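To give a concrete feel for what fitting label-enrichment curves involves, the sketch below fits a deliberately simple one-pool enrichment model to synthetic 13C time-course data with SciPy; the single-pool form and its rate constant are our illustrative stand-ins for the multi-compartment models the paper automates.

import numpy as np
from scipy.optimize import curve_fit

def enrichment(t, k, plateau):
    """Single-pool label enrichment: rises exponentially toward a plateau."""
    return plateau * (1.0 - np.exp(-k * t))

# Synthetic 13C enrichment time course (fraction labeled); invented data.
t = np.linspace(0, 60, 13)                       # minutes
true = enrichment(t, 0.08, 0.45)
data = true + 0.01 * np.random.default_rng(2).standard_normal(t.size)

(k_fit, plateau_fit), _ = curve_fit(enrichment, t, data, p0=[0.05, 0.5])
print("k = %.3f /min, plateau = %.3f" % (k_fit, plateau_fit))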
Infrastructure development for radioactive materials at the NSLS-II
Sprouster, David J.; Weidner, R.; Ghose, S. K.; ...
2017-11-04
The X-ray Powder Diffraction (XPD) Beamline at the National Synchrotron Light Source-II is a multipurpose instrument designed for high-resolution, high-energy X-ray scattering techniques. In this paper, the capabilities, opportunities, and recent developments in the characterization of radioactive materials at XPD are described. The overarching goal of this work is to provide researchers access to advanced synchrotron techniques suited to the structural characterization of materials for advanced nuclear energy systems. XPD is a new beamline providing high photon flux for X-ray Diffraction, Pair Distribution Function analysis, and Small Angle X-ray Scattering. The infrastructure and software described here extend the existing capabilities at XPD to accommodate radioactive materials. Such techniques will contribute crucial information to the characterization and quantification of advanced materials for nuclear energy applications. Finally, we describe the automated radioactive sample collection capabilities and recent X-ray Diffraction and Small Angle X-ray Scattering results from neutron-irradiated reactor pressure vessel steels and oxide dispersion strengthened steels.
Infrasound's capability to detect and characterise volcanic events, from local to regional scale.
NASA Astrophysics Data System (ADS)
Taisne, Benoit; Perttu, Anna
2017-04-01
Local infrasound and seismic networks have been successfully used for identification and quantification of explosions at single volcanoes. However, the February 2014 eruption of Kelud volcano, Indonesia, destroyed most of the local monitoring network. The use of remote seismic and infrasound sensors proved to be essential in the reconstruction of the eruptive sequence. The first recorded explosive event, with a relatively weak seismic and infrasonic signature, was followed by a 2-hour sustained signal detected as far away as 11,000 km by infrasound sensors and up to 2,300 km away by seismometers. The volcanic intensity derived from these observations places the 2014 Kelud eruption between the intensity of the 1980 Mount St. Helens and the 1991 Pinatubo eruptions. The value of remote seismic stations and infrasound arrays in deriving information about the onset, evolution, and intensity of volcanic eruptions is clear from the Kelud example. After this eruption the Singapore Infrasound Array became operational. This array, along with the other regional infrasound arrays that are part of the International Monitoring System, has recorded events from fireballs and regional volcanoes. The detection capability of this network for any specific volcanic event depends not only on the amplitude of the source, but also on propagation effects, the noise level at each station, and the characteristics of regional persistent noise sources (like the microbarom). Combining the spatial and seasonal characteristics of this noise, within the same frequency band as significant eruptive events, with the probability of such events occurring gives us a comprehensive understanding of detection capability for any of the 750 active or potentially active volcanoes in Southeast Asia.
NASA Astrophysics Data System (ADS)
Loose, B.; O'Shea, R.
2016-02-01
We describe the design and deployment of a water quality sonde that utilizes mobile phone networks for near-real-time data telemetry. The REOL, or Realtime Estuary Ocean Logger, has the unique and valuable capability of logging data internally while simultaneously relaying the information to a web server using a cellular modem. The internal circuitry consists of a GSM cellular modem, a microcontroller, and an SD card for data storage; these components are low cost and backed up with circuit diagrams and programming libraries that are published under an open source license. This configuration is versatile and is capable of reading instrument output from a broad spectrum of devices, including serial, TTL, analog voltage (0-5 V), and analog current (typically 4-20 mA). We find the greatest challenges lie in the development of smart software capable of handling the conditions brought on by this harsh environment. We have programmed the sonde to first determine whether it is submerged in water, and to record the temperature of the electronics before deciding whether to telemeter measurements over the cellular network. The Google App Engine™ provides an interactive visualization platform. We have tested the REOL with a variety of water quality sensors. In the configuration described here, we use a thermistor, a depth gauge, and a toroidal conductivity sensor to measure water temperature, water level, and conductivity up to 200 mS/cm. The latter is necessary for studies in hypersaline estuaries, where porewater salinity can exceed 100 g/kg. We present data from two estuaries in West Africa and from a longer-term deployment in Narragansett Bay, Rhode Island.
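The log-locally, telemeter-when-submerged logic described above might look roughly like the following sketch. All readings, thresholds, and helper names (read_conductivity_mS_cm, modem_send, and so on) are hypothetical stand-ins, since the REOL firmware itself is not reproduced in the text.

import time

# Hypothetical hardware stubs; the real sonde reads these from ADC/serial lines.
def read_conductivity_mS_cm():
    return 42.0          # placeholder sensor value

def read_board_temp_C():
    return 28.5          # placeholder electronics temperature

def modem_send(record):
    print("telemetered:", record)   # stand-in for the GSM modem upload

SUBMERGED_COND_THRESHOLD = 0.5      # mS/cm; assumed "in water" cutoff
MAX_BOARD_TEMP_C = 60.0             # assumed safe operating limit

def sample_once(log_file):
    cond = read_conductivity_mS_cm()
    temp = read_board_temp_C()
    record = {"t": time.time(), "cond_mS_cm": cond, "board_C": temp}
    log_file.write(str(record) + "\n")          # always log locally (SD card)
    # Telemeter only when the sonde believes it is submerged and healthy.
    if cond > SUBMERGED_COND_THRESHOLD and temp < MAX_BOARD_TEMP_C:
        modem_send(record)

with open("reol_log.txt", "a") as log:
    sample_once(log)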
The role of social media in the intelligence cycle
NASA Astrophysics Data System (ADS)
Forrester, Bruce; den Hollander, Kees
2016-05-01
Social media (SM) is a relatively new phenomenon. Intelligence agencies have been struggling to understand how to exploit the social pulse that flows from this source. The paper starts with a brief overview of SM, with some examples of how it is being used by adversaries and how we might be able to exploit this usage. Although SM is often treated as another form of open source intelligence (OSINT), we look at some of the differences between SM and traditional OSINT, then outline the possible uses by military intelligence. The next section looks at how SM fits into the different phases of the intelligence cycle: Direction, Collection, Processing, and Dissemination. For the first phase, Direction, a number of questions are identified that can typically be answered by SM. For the second phase, Collection, it is explained how SM, as an asset, translates questions into methods and uses different SM resources (e.g., marketers, cognitive behavioral psychologists) and sources to seek the required information. SM is exploited as a multi-intelligence capability. For the Processing phase, some aspects of handling this capacity are described (e.g., enabling other intelligence sources), along with the techniques used to validate the SM sources.
Friendly Extensible Transfer Tool Beta Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.
2016-04-15
Often, data transfer software is designed to meet specific requirements or apply to specific environments, and adding functionality frequently requires source code integration. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without the need for source code changes) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).
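One plausible reading of "incorporated without need of source code" is a run-time plug-in registry; the sketch below illustrates that pattern under that assumption. The registry tables, decorator, and function names are hypothetical, not the actual FrETT API.

```python
from typing import Callable, Dict

READERS: Dict[str, Callable[[str], bytes]] = {}           # pluggable I/O methods
TRANSPORTS: Dict[str, Callable[[bytes, str], None]] = {}  # pluggable transports

def register(table: Dict, name: str):
    """Register a new capability without editing the framework core."""
    def deco(fn):
        table[name] = fn
        return fn
    return deco

@register(READERS, "file")
def read_file(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()

@register(TRANSPORTS, "print")                            # toy "network" transport
def print_transport(data: bytes, dst: str) -> None:
    print(f"would send {len(data)} bytes to {dst}")

def transfer(reader: str, src: str, transport: str, dst: str) -> None:
    """Framework core: composes whatever plug-ins are registered."""
    TRANSPORTS[transport](READERS[reader](src), dst)
```

Under this reading, supporting a new platform capability is just a matter of registering another reader or transport; the `transfer` core never changes.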
Evaluating Lignocellulosic Biomass, Its Derivatives, and Downstream Products with Raman Spectroscopy
Lupoi, Jason S.; Gjersing, Erica; Davis, Mark F.
2015-01-01
The creation of fuels, chemicals, and materials from plants can aid in replacing products fabricated from non-renewable energy sources. Before using biomass in downstream applications, it must be characterized to assess chemical traits, such as cellulose, lignin, or lignin monomer content, or the sugars released following an acid or enzymatic hydrolysis. The measurement of these traits allows researchers to gauge the recalcitrance of the plants and develop efficient deconstruction strategies to maximize yields. Standard methods for assessing biomass phenotypes often have experimental protocols that limit their use for screening sizeable numbers of plant species. Raman spectroscopy, a non-destructive, non-invasive vibrational spectroscopy technique, is capable of providing qualitative, structural information and quantitative measurements. Applications of Raman spectroscopy have aided in alleviating the constraints of standard methods by coupling spectral data with multivariate analysis to construct models capable of predicting analytes. Hydrolysis and fermentation products, such as glucose and ethanol, can be quantified off-, at-, or on-line. Raman imaging has enabled researchers to develop a visual understanding of reactions, such as different pretreatment strategies, in real-time, while also providing integral chemical information. This review provides an overview of what Raman spectroscopy is, and how it has been applied to the analysis of whole lignocellulosic biomass, its derivatives, and downstream process monitoring. PMID:25941674
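As one concrete instance of the spectra-plus-multivariate-analysis workflow the review describes, the sketch below uses partial least squares (PLS) regression, a common choice for Raman calibration, on synthetic stand-in spectra; the data, component count, and analyte are illustrative assumptions, not drawn from the review.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 80, 600
X = rng.normal(size=(n_samples, n_wavenumbers))             # stand-in spectra
true_coef = np.zeros(n_wavenumbers)
true_coef[100:110] = 1.0                                    # one "analyte band"
y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)   # analyte content

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = PLSRegression(n_components=5).fit(X_tr, y_tr)       # calibration model
print("R^2 on held-out spectra:", round(model.score(X_te, y_te), 3))
```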
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCord, R.A.; Olson, R.J.
1988-01-01
Environmental research and assessment activities at Oak Ridge National Laboratory (ORNL) include the analysis of spatial and temporal patterns of ecosystem response at a landscape scale. Analysis through use of a geographic information system (GIS) involves an interaction between the user and thematic data sets frequently expressed as maps. A portion of GIS analysis has a mathematical or statistical aspect, especially for the analysis of temporal patterns. ARC/INFO is an excellent tool for manipulating GIS data and producing the appropriate map graphics. INFO also has some limited ability to produce statistical tabulations. At ORNL we have extended our capabilities by graphically interfacing ARC/INFO and SAS/GRAPH to provide a combined mapping and statistical graphics environment. With the data management, statistical, and graphics capabilities of SAS added to ARC/INFO, we have expanded the analytical and graphical dimensions of the GIS environment. Pie or bar charts, frequency curves, hydrographs, or scatter plots as produced by SAS can be added to maps from attribute data associated with ARC/INFO coverages. Numerous small, simplified graphs can also become a source of complex map "symbols." These additions extend the dimensions of GIS graphics to include time, details of the thematic composition, distribution, and interrelationships. 7 refs., 3 figs.
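A loose modern analogue of "graphs as map symbols" can be sketched with small inset chart axes drawn at feature coordinates; this is not the ARC/INFO-SAS/GRAPH workflow itself, and the site coordinates and attribute values below are hypothetical.

```python
import matplotlib.pyplot as plt

sites = {                      # hypothetical coverage attributes: (x, y, values)
    "A": (0.2, 0.3, [3, 5, 2]),
    "B": (0.7, 0.6, [1, 4, 6]),
}

fig, base = plt.subplots()
base.set_xlim(0, 1)
base.set_ylim(0, 1)
base.set_title("Thematic map with chart symbols")
for name, (x, y, vals) in sites.items():
    base.plot(x, y, "k.")                          # the mapped feature
    inset = base.inset_axes([x, y, 0.15, 0.15])    # small chart at the site
    inset.bar(range(len(vals)), vals)
    inset.set_xticks([])
    inset.set_yticks([])
plt.savefig("map_symbols.png")
```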
Barton, Jennifer Kehlet; Guzman, Francisco; Tumlinson, Alexandre
2004-01-01
We develop a dual-modality device that combines the anatomical imaging capabilities of optical coherence tomography (OCT) with the functional capabilities of laser-induced fluorescence (LIF) spectroscopy. OCT provides cross-sectional images of tissue structure to a depth of up to 2 mm with approximately 10-microm resolution. LIF spectroscopy provides histochemical information in the form of emission spectra from a given tissue location. The OCT subsystem utilizes a superluminescent diode with a center wavelength of 1300 nm, whereas a helium cadmium laser provides the LIF excitation source at wavelengths of 325 and 442 nm. Preliminary data are obtained on eight postmortem aorta samples, each 10 mm in length. OCT images and LIF spectra give complementary information from normal and atherosclerotic portions of aorta wall. OCT images show structures such as intima, media, internal elastic lamina, and fibrotic regions. Emission spectra ratios of 520/490 (325-nm excitation) and 595/635 (442-nm excitation) could be used to identify normal and plaque regions with 97 and 91% correct classification rates, respectively. With miniaturization of the delivery probe and improvements in system speed, this dual-modality device could provide a valuable tool for identification and characterization of atherosclerotic plaques. (c) 2004 Society of Photo-Optical Instrumentation Engineers.
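The ratio-based discrimination described above reduces to a simple rule; the sketch below applies it to a toy spectrum. The threshold and the synthetic spectrum are assumptions for illustration, not the study's fitted values.

```python
import numpy as np

def classify(spectrum_wl, spectrum_i, wl_a=520, wl_b=490, thresh=1.0):
    """Label a spectrum 'plaque' or 'normal' from an emission intensity
    ratio (e.g., 520/490 nm under 325 nm excitation). Threshold is a
    hypothetical placeholder."""
    i_a = np.interp(wl_a, spectrum_wl, spectrum_i)
    i_b = np.interp(wl_b, spectrum_wl, spectrum_i)
    return "plaque" if i_a / i_b > thresh else "normal"

wl = np.linspace(450, 650, 201)
normal = np.exp(-((wl - 490) / 40) ** 2)   # toy emission spectrum peaked at 490 nm
print(classify(wl, normal))                # -> "normal"
```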
Jiang, Jiefeng; Egner, Tobias
2014-01-01
Resolving conflicting sensory and motor representations is a core function of cognitive control, but it remains uncertain to what degree control over different sources of conflict is implemented by shared (domain general) or distinct (domain specific) neural resources. Behavioral data suggest conflict–control to be domain specific, but results from neuroimaging studies have been ambivalent. Here, we employed multivoxel pattern analyses that can decode a brain region's informational content, allowing us to distinguish incidental activation overlap from actual shared information processing. We trained independent sets of “searchlight” classifiers on functional magnetic resonance imaging data to decode control processes associated with stimulus-conflict (Stroop task) and ideomotor-conflict (Simon task). Quantifying the proportion of domain-specific searchlights (capable of decoding only one type of conflict) and domain-general searchlights (capable of decoding both conflict types) in each subject, we found both domain-specific and domain-general searchlights, though the former were more common. When mapping anatomical loci of these searchlights across subjects, neural substrates of stimulus- and ideomotor-specific conflict–control were found to be anatomically consistent across subjects, whereas the substrates of domain-general conflict–control were not. Overall, these findings suggest a hybrid neural architecture of conflict–control that entails both modular (domain specific) and global (domain general) components. PMID:23402762
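The searchlight logic, counting regions whose local classifiers decode one conflict type or both, can be sketched on synthetic stand-in patterns; the classifier choice, the above-chance margin, and the data below are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def decodes(X, y, chance=0.5, margin=0.1):
    """True if cross-validated accuracy is comfortably above chance."""
    acc = cross_val_score(LinearSVC(dual=False), X, y, cv=5).mean()
    return acc > chance + margin

n_trials, n_voxels = 100, 30
y = rng.integers(0, 2, n_trials)                   # conflict vs no-conflict labels
X_stroop = rng.normal(size=(n_trials, n_voxels)) + y[:, None] * 0.8
X_simon = rng.normal(size=(n_trials, n_voxels)) + y[:, None] * 0.8

stroop_ok, simon_ok = decodes(X_stroop, y), decodes(X_simon, y)
label = ("domain general" if (stroop_ok and simon_ok)
         else "domain specific" if (stroop_ok or simon_ok)
         else "uninformative")
print(label)   # this toy "searchlight" carries both signals -> domain general
```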
Koetsenruijter, Jan; van Eikelenboom, Nathalie; van Lieshout, Jan; Vassilev, Ivo; Lionis, Christos; Todorova, Elka; Portillo, Mari Carmen; Foss, Christina; Serrano Gil, Manuel; Roukova, Poli; Angelaki, Agapi; Mujika, Agurtzane; Knutsen, Ingrid Ruud; Rogers, Anne; Wensing, Michel
2016-04-01
The objective of this study was to explore which aspects of social networks are related to self-management capabilities and whether these networks have the potential to reduce the adverse health effects of deprivation. In a cross-sectional study we recruited type 2 diabetes patients in six European countries. Data on self-management capabilities were gathered through written questionnaires, and data on social network characteristics and social support through subsequent personal/telephone interviews. We used regression modelling to assess the effect of social support and education on self-management capabilities. In total, 1692 respondents completed the questionnaire and the interview. Extensive informational networks, emotional networks, and attendance of community organisations were linked to better self-management capabilities. The association of self-management capabilities with informational support was especially strong in the low education group, whereas the association with emotional support was stronger in the high education group. Some of the social network characteristics showed a positive relation to self-management capabilities. The effect of informational support was strongest in low education populations and may therefore provide a possibility to reduce the adverse impact of low education on self-management capabilities. Self-management support interventions that take informational support in patients' networks into account may be most effective, especially in deprived populations. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
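The education-dependent association reported above is naturally modelled with a support-by-education interaction term; the sketch below shows that regression form on hypothetical data (the variable names and values are invented for illustration).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical records: self-management score, informational support,
# and education level.
df = pd.DataFrame({
    "self_mgmt":    [3.1, 4.2, 2.8, 4.9, 3.5, 4.4],
    "info_support": [1.0, 3.0, 0.5, 3.5, 2.0, 3.0],
    "education":    ["low", "high", "low", "high", "low", "high"],
})

# The interaction term lets the support effect differ by education group.
model = smf.ols("self_mgmt ~ info_support * C(education)", data=df).fit()
print(model.params)
```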
Hussey, Peter S.; Ringel, Jeanne S.; Ahluwalia, Sangeeta; Price, Rebecca Anhang; Buttorff, Christine; Concannon, Thomas W.; Lovejoy, Susan L.; Martsolf, Grant R.; Rudin, Robert S.; Schultz, Dana; Sloss, Elizabeth M.; Watkins, Katherine E.; Waxman, Daniel; Bauman, Melissa; Briscombe, Brian; Broyles, James R.; Burns, Rachel M.; Chen, Emily K.; DeSantis, Amy Soo Jin; Ecola, Liisa; Fischer, Shira H.; Friedberg, Mark W.; Gidengil, Courtney A.; Ginsburg, Paul B.; Gulden, Timothy; Gutierrez, Carlos Ignacio; Hirshman, Samuel; Huang, Christina Y.; Kandrack, Ryan; Kress, Amii; Leuschner, Kristin J.; MacCarthy, Sarah; Maksabedian, Ervant J.; Mann, Sean; Matthews, Luke Joseph; May, Linnea Warren; Mishra, Nishtha; Miyashiro, Lisa; Muchow, Ashley N.; Nelson, Jason; Naranjo, Diana; O'Hanlon, Claire E.; Pillemer, Francesca; Predmore, Zachary; Ross, Rachel; Ruder, Teague; Rutter, Carolyn M.; Uscher-Pines, Lori; Vaiana, Mary E.; Vesely, Joseph V.; Hosek, Susan D.; Farmer, Carrie M.
2016-01-01
The Veterans Access, Choice, and Accountability Act of 2014 addressed the need for access to timely, high-quality health care for veterans. Section 201 of the legislation called for an independent assessment of various aspects of veterans' health care. The RAND Corporation was tasked with an assessment of the Department of Veterans Affairs (VA) current and projected health care capabilities and resources. An examination of data from a variety of sources, along with a survey of VA medical facility leaders, revealed the breadth and depth of VA resources and capabilities: fiscal resources, workforce and human resources, physical infrastructure, interorganizational relationships, and information resources. The assessment identified barriers to the effective use of these resources and capabilities. Analysis of data on access to VA care and the quality of that care showed that almost all veterans live within 40 miles of a VA health facility, but fewer have access to VA specialty care. Veterans usually receive care within 14 days of their desired appointment date, but wait times vary considerably across VA facilities. VA has long played a national leadership role in measuring the quality of health care. The assessment showed that VA health care quality was as good or better on most measures compared with other health systems, but quality performance lagged at some VA facilities. VA will require more resources and capabilities to meet a projected increase in veterans' demand for VA care over the next five years. Options for increasing capacity include accelerated hiring, full nurse practice authority, and expanded use of telehealth. PMID:28083424
CIRSS vertical data integration, San Bernardino study
NASA Technical Reports Server (NTRS)
Hodson, W.; Christenson, J.; Michel, R. (Principal Investigator)
1982-01-01
The creation and use of a vertically integrated data base, including LANDSAT data, for local planning purposes in a portion of San Bernardino County, California are described. The project illustrates that a vertically integrated approach can benefit local users, can be used to identify and rectify discrepancies in various data sources, and that the LANDSAT component can be effectively used to identify change, perform initial capability/suitability modeling, update existing data, and refine existing data in a geographic information system. Local analyses were developed which produced data of value to planners in the San Bernardino County Planning Department and the San Bernardino National Forest staff.
High-Resolution Spectroscopy with the Chandra X-ray Observatory
Canizares, Claude R. [MIT, Cambridge, Massachusetts, United States
2017-12-09
The capabilities of the Chandra X-ray Observatory and XMM-Newton for high-resolution spectroscopy have brought traditional plasma diagnostic techniques to the study of cosmic plasmas. Observations have probed nearly every class of astronomical object, from young proto-stars through massive O stars and black hole binaries, to supernova remnants, active galactic nuclei, and the intergalactic medium. Many of these sources show remarkably rich spectra that reveal new physical information, such as emission measure distributions, elemental abundances, accretion disk and wind signatures, and time variability. This talk will present an overview of the Chandra instrumentation and selected examples of spectral observations of astrophysical and cosmological importance.
Space vehicle Viterbi decoder. [data converters, algorithms
NASA Technical Reports Server (NTRS)
1975-01-01
The design and fabrication of an extremely low-power, constraint-length 7, rate 1/3 Viterbi decoder brassboard capable of operating at information rates of up to 100 kb/s are presented. The brassboard is partitioned to facilitate a later transition to an LSI version requiring even less power. The effects of soft-decision thresholds, path memory lengths, and output selection algorithms on the bit error rate are evaluated. A branch synchronization algorithm is compared with a more conventional approach. The implementation of the decoder and its test set (including an all-digital noise source) is described, along with the results of various system tests and evaluations. Results and recommendations are presented.
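The decoding algorithm itself can be sketched compactly. The flight unit is constraint-length 7, rate 1/3 with soft decisions; for brevity the toy below is a hard-decision decoder for the classic constraint-length 3, rate 1/2 code (generators 7 and 5 octal), which exercises the same add-compare-select trellis recursion.

```python
G = [0b111, 0b101]                  # generator polynomials (octal 7, 5)
K = 3                               # toy constraint length (flight unit: 7)
NSTATES = 1 << (K - 1)

def branch(state, bit):
    """Next state and output symbol for one input bit."""
    reg = (bit << (K - 1)) | state
    out = [bin(reg & g).count("1") & 1 for g in G]
    return reg >> 1, out

def encode(bits):
    state, out = 0, []
    for b in bits:
        state, sym = branch(state, b)
        out += sym
    return out

def viterbi(received):
    INF = float("inf")
    metric = [0.0] + [INF] * (NSTATES - 1)     # start in the all-zeros state
    paths = [[] for _ in range(NSTATES)]
    for i in range(0, len(received), len(G)):
        sym = received[i:i + len(G)]
        new_metric = [INF] * NSTATES
        new_paths = [None] * NSTATES
        for s in range(NSTATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):                   # add-compare-select per branch
                ns, expect = branch(s, b)
                d = metric[s] + sum(e != r for e, r in zip(expect, sym))
                if d < new_metric[ns]:
                    new_metric[ns] = d
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(NSTATES), key=lambda s: metric[s])
    return paths[best]

msg = [1, 0, 1, 1, 0, 0]            # message plus two flush zeros
coded = encode(msg)
coded[3] ^= 1                       # inject a single channel bit error
print(viterbi(coded) == msg)        # True: the error is corrected
```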
Mosaic of coded aperture arrays
Fenimore, Edward E.; Cannon, Thomas M.
1980-01-01
The present invention pertains to a mosaic of coded aperture arrays which is capable of imaging off-axis sources with minimum detector size. Mosaics of the basic array pattern create a circular, or periodic, correlation of the object on a section of the picture plane. This section consists of elements of the central basic pattern as well as elements from neighboring patterns and is a cyclic version of the basic pattern. Since all object points contribute a complete cyclic version of the basic pattern, a section of the picture the size of the basic aperture pattern contains all the information necessary to image the object with no artifacts.
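The cyclic-correlation property can be demonstrated in one dimension with a quadratic-residue (URA-like) pattern, whose periodic correlation with its decoding array is a sharp peak on a flat pedestal; this is an illustrative construction, not the patent's specific mosaic.

```python
import numpy as np

p = 11
qr = {(i * i) % p for i in range(1, p)}                    # quadratic residues mod p
A = np.array([1.0 if i in qr else 0.0 for i in range(p)])  # aperture pattern
G = 2 * A - 1                                              # decoding array

def cconv(a, b):
    """Circular convolution via FFT (the object viewed through the mask)."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def ccorr(a, b):
    """Circular cross-correlation via FFT (the decoding step)."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))))

obj = np.zeros(p)
obj[2], obj[7] = 1.0, 0.5                  # two point sources
picture = cconv(obj, A)                    # one full-period section of the picture
recon = ccorr(picture, G)                  # sharp peaks on a flat pedestal
print(np.round(recon, 2))                  # maxima at indices 2 and 7
```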
Using Spatial Correlations of SPDC Sources for Increasing the Signal to Noise Ratio in Images
NASA Astrophysics Data System (ADS)
Ruíz, A. I.; Caudillo, R.; Velázquez, V. M.; Barrios, E.
2017-05-01
We experimentally show that, by using the spatial correlations of photon pairs produced by spontaneous parametric down-conversion (SPDC), it is possible to increase the signal-to-noise ratio in images of objects illuminated with those photons; in comparison, objects illuminated with laser light exhibit a lower ratio. Our simple experimental setup produced an average signal-to-noise-ratio improvement of 11 dB for parametric down-converted light over laser light. This simple method can be easily implemented to obtain high-contrast images of faint objects and to transmit information with low noise.
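For orientation, the quoted 11 dB figure converts to a linear signal-to-noise ratio of roughly 12.6x; the snippet below is just that arithmetic.

```python
improvement_db = 11.0                        # reported average improvement
linear_ratio = 10 ** (improvement_db / 10)   # dB -> linear power ratio
print(round(linear_ratio, 1))                # ~12.6
```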
NASA Astrophysics Data System (ADS)
Van Liew, Seth; Bertozzi, William; D'Olympia, Nathan; Franklin, Wilbur A.; Korbly, Stephen E.; Ledoux, Robert J.; Wilson, Cody M.
An x-ray inspection system utilizing a continuous-wave 9 MeV Rhodotron x-ray source for scanning cargo containers is presented. The system scans trucks and towed cargo containers for contraband, anomalies, stowaway passengers, and nuclear threats. A transmission image is generated concurrently with a 3D image of the cargo, the latter presenting material information in the form of atomic number and density. Neutrons from photofission are also detected during each scan. In addition, nuclear resonance fluorescence detectors are capable of identifying specific isotopes. This system has recently been deployed at the Port of Boston.
Fungal photobiology: visible light as a signal for stress, space and time
Fuller, Kevin K.; Loros, Jennifer J.; Dunlap, Jay C.
2014-01-01
Visible light is an important source of energy and information for much of life on this planet. Though fungi are neither photosynthetic nor capable of observing adjacent objects, it is estimated that the majority of fungal species display some form of light response, ranging from developmental decision making to metabolic reprogramming to pathogenesis. As such, advances in our understanding of fungal photobiology will likely reach the broad fields impacted by these organisms, including agriculture, industry and medicine. In this review, we will first describe the mechanisms by which fungi sense light and then discuss the selective advantages likely imparted by their ability to do so. PMID:25323429
NASA Technical Reports Server (NTRS)
Cornish, C. R.
1983-01-01
Following reception and analog-to-digital (A/D) conversion, atmospheric radar backscatter echoes need to be processed so as to obtain the desired information about atmospheric processes and to eliminate or minimize contaminating contributions from other sources. Various signal processing techniques have been implemented at mesosphere-stratosphere-troposphere (MST) radar facilities to estimate parameters of interest from received spectra. Such estimation techniques need to be both accurate and sufficiently efficient to be within the capabilities of the particular data-processing system. The various techniques used to parameterize the spectra of received signals are reviewed herein. Noise estimation, electromagnetic interference, data smoothing, correlation, and the Doppler effect are among the specific points addressed.
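A standard way to parameterize such spectra is via low-order moments of the noise-subtracted Doppler spectrum; the sketch below uses a crude median noise-floor estimate (operational systems often use Hildebrand-Sekhon or similar), and the toy spectrum is synthetic.

```python
import numpy as np

def moments(velocities, spectrum):
    """Echo power, Doppler velocity, and spectral width as the zeroth,
    first, and second moments of the noise-subtracted spectrum."""
    noise = np.median(spectrum)               # crude noise-floor estimate
    s = np.clip(spectrum - noise, 0, None)    # noise-subtracted spectrum
    p0 = s.sum()                              # power (zeroth moment)
    v_mean = (velocities * s).sum() / p0      # Doppler velocity (first moment)
    width = np.sqrt(((velocities - v_mean) ** 2 * s).sum() / p0)
    return p0, v_mean, width

v = np.linspace(-20, 20, 128)                     # velocity bins, m/s
spec = 0.05 + np.exp(-((v - 4.0) / 2.0) ** 2)     # toy echo at +4 m/s on a flat floor
print(moments(v, spec))
```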
Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition
NASA Astrophysics Data System (ADS)
LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.
2013-12-01
Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data acquisition is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains, which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery-powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage: multiple units can inject current simultaneously. The original purpose of using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery-powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, the multi-source approach adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity. Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno, Nevada; Pomarance, Italy; and Volterra, Italy; a mineral exploration site near Timmins, Quebec; and a landslide investigation near Vajont Dam in northern Italy. These sites presented a series of challenges in survey design and deployment, including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
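The four-transmitters-versus-one power argument follows from the received signal scaling with injected current and the current scaling with the square root of transmitter power (P = I²R); the check below runs that arithmetic with a hypothetical load.

```python
import math

R = 100.0                                  # hypothetical load resistance, ohms
i_single = math.sqrt(6400.0 / R)           # current from one 6400 W transmitter
i_multi = 4 * math.sqrt(400.0 / R)         # total from four parallel 400 W units
print(i_single, i_multi)                   # both 8.0 A -> equivalent signal
```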