Sample records for observer tool capable

  1. Expert systems tools for Hubble Space Telescope observation scheduling

    NASA Technical Reports Server (NTRS)

    Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark

    1987-01-01

    The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.

  2. Problem-Solving Environments (PSEs) to Support Innovation Clustering

    NASA Technical Reports Server (NTRS)

    Gill, Zann

    1999-01-01

    This paper argues that there is a need for high-level concepts to inform the development of Problem-Solving Environment (PSE) capability. A traditional approach to PSE implementation is to: (1) assemble a collection of tools; (2) integrate the tools; and (3) assume that collaborative work begins after the PSE is assembled. I argue for starting from the opposite premise: promoting human collaboration and observing that process comes first, followed by the development of supporting tools, and finally by the evolution of PSE capability through input from collaborating project teams.

  3. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints which limited its adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO), is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to new missions and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or ongoing observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  4. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Astrophysics Data System (ADS)

    Boyer, Jeffrey S.

    1994-11-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints which limited its adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO), is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to new missions and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or ongoing observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  5. The Capabilities of the Graphical Observation Scheduling System (GROSS) as Used by the Astro-2 Spacelab Mission

    NASA Technical Reports Server (NTRS)

    Phillips, Shaun

    1996-01-01

    The functionality and editing capabilities of the Graphical Observation Scheduling System (GROSS) are reported. GROSS was developed to replace a suite of existing programs and associated processes, with two aims: to combine the functionality of several of the existing programs in a single software tool, and to provide a Graphical User Interface (GUI) giving greater data visibility and editing capability. The improved editing capability provided by this approach is considered to have enhanced the efficiency of mission planning for the second astronomical Spacelab mission (ASTRO-2).

  6. Bridging the Gap Between NASA Earth Observations and Decision Makers Through the NASA Develop National Program

    NASA Astrophysics Data System (ADS)

    Remillard, C. M.; Madden, M.; Favors, J.; Childs-Gleason, L.; Ross, K. W.; Rogers, L.; Ruiz, M. L.

    2016-06-01

    The NASA DEVELOP National Program bridges the gap between NASA Earth Science and society by building capacity in both participants and partner organizations that collaborate to conduct projects. These rapid feasibility projects highlight the capabilities of satellite and aerial Earth observations. Immersion of decision and policy makers in these feasibility projects increases awareness of the capabilities of Earth observations and contributes to the tools and resources available to support enhanced decision making. This paper will present the DEVELOP model, best practices, and two case studies, the Colombia Ecological Forecasting project and the Miami-Dade County Ecological Forecasting project, that showcase the successful adoption of tools and methods for decision making. Through over 90 projects each year, DEVELOP is always striving for the innovative, practical, and beneficial use of NASA Earth science data.

  7. JWST NIRCam Time Series Observations

    NASA Technical Reports Server (NTRS)

    Greene, Tom; Schlawin, E.

    2017-01-01

    We explain how to make time-series observations with the Near-Infrared camera (NIRCam) science instrument of the James Webb Space Telescope. Both photometric and spectroscopic observations are described. We present the basic capabilities and performance of NIRCam and show examples of how to set its observing parameters using the Space Telescope Science Institute's Astronomer's Proposal Tool (APT).

  8. The APIS service : a tool for accessing value-added HST planetary auroral observations over 1997-2015

    NASA Astrophysics Data System (ADS)

    Lamy, L.; Henry, F.; Prangé, R.; Le Sidaner, P.

    2015-10-01

    The Auroral Planetary Imaging and Spectroscopy (APIS) service (http://obspm.fr/apis/) provides open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric physics, and heliospheric physics. APIS consists of (i) a high-level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most-used far-ultraviolet spectro-imagers, (ii) a dedicated search interface aimed at browsing this database efficiently through relevant conditional search criteria (Figure 1), and (iii) the ability to work interactively with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. The service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them by relevant physical criteria should in particular facilitate statistical studies on long time scales and/or combined multi-instrument, multispectral analyses [1,2]. We will present the updated capabilities of APIS with several examples. Several tutorials are available online.

  9. Evaluation of meteorological airborne Doppler radar

    NASA Technical Reports Server (NTRS)

    Hildebrand, P. H.; Mueller, C. K.

    1984-01-01

    This paper will discuss the capabilities of airborne Doppler radar for atmospheric sciences research. The evaluation is based on airborne and ground based Doppler radar observations of convective storms. The capability of airborne Doppler radar to measure horizontal and vertical air motions is evaluated. Airborne Doppler radar is shown to be a viable tool for atmospheric sciences research.

  10. The Importance of Earth Observations and Data Collaboration within Environmental Intelligence Supporting Arctic Research

    NASA Technical Reports Server (NTRS)

    Casas, Joseph

    2017-01-01

    Within the IARPC Collaboration Team activities of 2016, Arctic in-situ and remote Earth observations advanced topics such as: 1) exploring the role for new and innovative autonomous observing technologies in the Arctic; 2) advancing catalytic national and international community-based observing efforts in support of the National Strategy for the Arctic Region; and 3) enhancing the use of discovery tools for observing-system collaboration, such as the U.S. National Oceanic and Atmospheric Administration (NOAA) Arctic Environmental Response Management Application (ERMA) and the U.S. National Aeronautics and Space Administration (NASA) Arctic Collaborative Environment (ACE) project's geo-referenced visualization, decision support, and exploitation internet-based tools. Critical to the success of these Earth observations, for both in-situ and remote systems, is the emergence of new and innovative data collection technologies and comprehensive modeling, as well as enhanced communications and cyber-infrastructure capabilities which effectively assimilate and disseminate many environmental intelligence products in a timely manner. The Arctic Collaborative Environment (ACE) project is well positioned to greatly enhance user capabilities for accessing, organizing, visualizing, sharing, and producing collaborative knowledge for the Arctic.

  11. A Web-based Google-Earth Coincident Imaging Tool for Satellite Calibration and Validation

    NASA Astrophysics Data System (ADS)

    Killough, B. D.; Chander, G.; Gowda, S.

    2009-12-01

    The Group on Earth Observations (GEO) is coordinating international efforts to build a Global Earth Observation System of Systems (GEOSS) to meet the needs of its nine “Societal Benefit Areas”, of which the most demanding, in terms of accuracy, is climate. To accomplish this vision, satellite on-orbit and ground-based calibration and validation (Cal/Val) of Earth observation measurements are critical to our scientific understanding of the Earth system. Existing tools supporting space mission Cal/Val are often developed for specific campaigns or events, with little regard for broad application. This paper describes a web-based Google Earth tool for the calculation of coincident satellite observations, intended to support a diverse international group of satellite missions and to improve data continuity, interoperability, and data fusion. The Committee on Earth Observing Satellites (CEOS), which includes 28 space agencies and 20 other national and international organizations, is currently operating and planning over 240 Earth observation satellites over the next 15 years. The technology described here will better enable the use of multiple sensors and promote the increased coordination needed for a GEOSS. The CEOS Systems Engineering Office (SEO) and the Working Group on Calibration and Validation (WGCV) support the development of the CEOS Visualization Environment (COVE) tool to enhance international coordination of data exchange, mission planning, and Cal/Val events. The objective is a simple and intuitive application that leverages the capabilities of Google Earth on the web to display satellite sensor coverage areas and identify coincident scene locations, with dynamic menus for flexible content display. Key features include user-defined evaluation periods (start and end dates), regions of interest (rectangular areas), and multi-user collaboration.
    Users can select two or more CEOS missions from a database of Satellite Tool Kit (STK)-generated orbit information and perform rapid calculations to identify coincident scenes, where the ground tracks of the CEOS mission instrument fields of view intersect. Calculated results are displayed on a customized Google Earth web interface showing location and time information, with optional output to Excel table format. In addition, multiple viewports can be used for comparisons. COVE was first introduced to the CEOS WGCV community in May 2009. Since that time, the development of a prototype version has progressed. It is anticipated that the capabilities and applications of COVE can support a variety of international Cal/Val activities as well as provide general information on Earth observation coverage for education and societal benefit. This project demonstrates the utility of a systems engineering tool with broad international appeal for enhanced communication and data evaluation opportunities among international CEOS agencies. The COVE tool is publicly accessible via NASA servers.
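At its core, the coincident-scene calculation described above reduces to testing whether two ground tracks pass near the same location at some point. A minimal brute-force sketch in Python (the function names, point format, and 50 km threshold are illustrative assumptions, not COVE's actual STK-based field-of-view intersection algorithm):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def coincident_scenes(track_a, track_b, max_sep_km=50.0):
    """Return pairs of (time, lat, lon) samples from two ground tracks that
    fall within max_sep_km of each other: a brute-force stand-in for a
    coincidence test between two missions' instrument footprints."""
    hits = []
    for ta, la, lo_a in track_a:
        for tb, lb, lo_b in track_b:
            if haversine_km(la, lo_a, lb, lo_b) <= max_sep_km:
                hits.append(((ta, la, lo_a), (tb, lb, lo_b)))
    return hits
```

A real implementation would also constrain the time difference between the two overpasses and use the actual swath geometry rather than a point-to-point distance.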

  12. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  13. ISOON + SOLIS: Merging the Data Products

    NASA Astrophysics Data System (ADS)

    Radick, R.; Dalrymple, N.; Mozer, J.; Wiborg, P.; Harvey, J.; Henney, C.; Neidig, D.

    2005-05-01

    The combination of AFRL's ISOON and NSO's SOLIS offers significantly greater capability than the individual instruments. We are working toward merging the SOLIS and ISOON data products in a single central facility. The ISOON system currently includes both an observation facility and a remote analysis center (AC). The AC is capable of receiving data from the ISOON observation facility as well as from external sources. It archives the data and displays corrected images and time-lapse animations. The AC has a large number of digital tools that can be applied to solar images to provide quantitative information quickly and easily. Because of its convenient tools and ready archival capability, the ISOON AC is a natural place to merge products from SOLIS and ISOON. We have completed a preliminary integration of the ISOON and SOLIS data products. Eventually, we intend to distribute viewing stations to various users and academic institutions, install the AC software tools at a number of user locations, and publish ISOON/SOLIS data products jointly on a common web page. In addition, SOLIS data products, separately, are and will continue to be fully available on the NSO's Digital Library and SOLIS web pages, and via the Virtual Solar Observatory. This work is being supported by the National Science Foundation and the Air Force Office of Scientific Research.

  14. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools, and calibration capacity.
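The import-and-preprocess workflow described above can be sketched in plain Python. The CSV layout, function names, and index-based linear interpolation below are illustrative assumptions, not the actual OAT/FREEWAT API:

```python
import csv
import io
from datetime import datetime

def load_series(csv_text):
    """Parse a 'timestamp,value' CSV (empty value marks a gap) into a list
    of (datetime, float-or-None) pairs: a stand-in for a CSV import path."""
    rows = []
    for ts, val in csv.reader(io.StringIO(csv_text)):
        rows.append((datetime.fromisoformat(ts), float(val) if val else None))
    return rows

def fill_gaps(series):
    """Linearly interpolate interior gaps by sample index, a typical
    pre-processing step before handing observations to a calibration tool.
    Assumes the first and last samples are not gaps."""
    out = list(series)
    for i, (ts, v) in enumerate(out):
        if v is None:
            prev = next(j for j in range(i - 1, -1, -1) if out[j][1] is not None)
            nxt = next(j for j in range(i + 1, len(out)) if out[j][1] is not None)
            frac = (i - prev) / (nxt - prev)
            out[i] = (ts, out[prev][1] + frac * (out[nxt][1] - out[prev][1]))
    return out
```

A production tool would interpolate on the actual timestamps rather than the sample index, and would flag rather than silently fill long gaps.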

  15. NASA Tools for Climate Impacts on Water Resources

    NASA Technical Reports Server (NTRS)

    Toll, David; Doorn, Brad

    2010-01-01

    Climate and environmental change are expected to fundamentally alter the nation's hydrological cycle and water availability. Satellite instruments provide global or near-global coverage, allowing for consistent, well-calibrated, equivalent-quality data of the Earth system. A major goal for NASA climate and environmental change research is to create multi-instrument data sets that span the multi-decadal time scales of climate change, and to combine these data with those from modeling and surface-based observing systems to improve process understanding and predictions. NASA Earth science data and analyses will ultimately enable more accurate climate prediction and characterization of uncertainties. NASA's Applied Sciences Program works with other groups, including other federal agencies, to transition demonstrated observational capabilities to operational capabilities. A summary of some NASA tools for improved water resources management will be presented.

  16. New 30-50 Ghz Wideband Receiver for Nobeyama 45-M Telescope with Capability to Observe Three Zeeman

    NASA Astrophysics Data System (ADS)

    Huang, Yau De

    2018-01-01

    Zeeman measurement is the only tool that probes magnetic field strengths directly. A new receiver covering the 30-50 GHz frequency range is proposed for the Nobeyama 45-m telescope, based on the design of the ALMA Band 1 receiver. With a dual linear polarization feed, wide IF bandwidth, and state-of-the-art noise performance, it is capable of observing three Zeeman transitions (SO at 30.0 GHz and CCS at 33.7 and 45.4 GHz) toward pre-protostellar cores simultaneously. This feature will not only increase survey efficiency but also provide a reliable means to calibrate unwanted instrumental cross-polarization. The slim receiver layout also allows easy expansion to form a focal plane array. We will present the receiver design and the current status of the pro

  17. Comparison of Artificial Immune System and Particle Swarm Optimization Techniques for Error Optimization of Machine Vision Based Tool Movements

    NASA Astrophysics Data System (ADS)

    Mahapatra, Prasant Kumar; Sethi, Spardha; Kumar, Amod

    2015-10-01

    In the conventional tool positioning technique, sensors embedded in the motion stages provide accurate tool position information. In this paper, a machine vision based system and image processing technique for motion measurement of a lathe tool, from two-dimensional sequential images captured using a charge-coupled device (CCD) camera with a resolution of 250 microns, is described. An algorithm was developed to calculate the observed distance travelled by the tool from the captured images. As expected, error was observed in the value of the distance traversed by the tool calculated from these images. Optimization of errors due to the machine vision system, calibration, environmental factors, etc. in lathe tool movement was carried out using two soft computing techniques: artificial immune system (AIS) and particle swarm optimization (PSO). The results show the better capability of AIS over PSO.
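As a rough illustration of the PSO technique the paper benchmarks, here is a minimal one-dimensional particle swarm minimizer. The hyperparameters and structure are generic textbook choices, not the authors' implementation:

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimization over the interval [lo, hi].
    Each particle's velocity is pulled toward its personal best (c1 term)
    and the swarm's global best (c2 term), damped by inertia weight w."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]  # positions
    vs = [0.0] * n_particles                                # velocities
    pbest = list(xs)                                        # personal bests
    gbest = min(xs, key=f)                                  # global best
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i] + c1 * r1 * (pbest[i] - xs[i])
                     + c2 * r2 * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))         # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest
```

The paper's error-optimization problem would plug a calibration-error objective into `f`; an AIS variant replaces the velocity update with clonal selection and mutation of candidate solutions.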

  18. Distributed Observer Network (DON), Version 3.0, User's Guide

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    The Distributed Observer Network (DON) is a data presentation tool developed by the National Aeronautics and Space Administration (NASA) to distribute and publish simulation results. Leveraging the display capabilities inherent in modern gaming technology, DON places users in a fully navigable 3-D environment containing graphical models and allows the users to observe how those models evolve and interact over time in a given scenario. Each scenario is driven with data that has been generated by authoritative NASA simulation tools and exported in accordance with a published data interface specification. This decoupling of the data from the source tool enables DON to faithfully display a simulator's results and ensure that every simulation stakeholder will view the exact same information every time.

  19. HiRel - Reliability/availability integrated workstation tool

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Dugan, Joanne B.

    1992-01-01

    The HiRel software tool is described and demonstrated by application to the mission avionics subsystem of the Advanced System Integration Demonstrations (ASID) system that utilizes the PAVE PILLAR approach. HiRel marks another accomplishment toward the goal of producing a totally integrated computer-aided design (CAD) workstation design capability. Since a reliability engineer generally represents a reliability model graphically before it can be solved, the use of a graphical input description language increases productivity and decreases the incidence of error. The graphical postprocessor module HARPO makes it possible for reliability engineers to quickly analyze huge amounts of reliability/availability data to observe trends due to exploratory design changes. The addition of several powerful HARP modeling engines provides the user with a reliability/availability modeling capability for a wide range of system applications all integrated under a common interactive graphical input-output capability.

  20. Integrated Measurements and Characterization | Photovoltaic Research | NREL

    Science.gov Websites

    The Integrated Measurements and Characterization cluster tool offers powerful capabilities with integrated tools. Basic cluster tool capabilities include sample handling: through ultra-high-vacuum connections, a sample can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.

  1. Observer's Interface for Solar System Target Specification

    NASA Astrophysics Data System (ADS)

    Roman, Anthony; Link, Miranda; Moriarty, Christopher; Stansberry, John A.

    2016-10-01

    When observing an asteroid or comet with HST, it has been necessary for the observer to manually enter the target's orbital elements into the Astronomer's Proposal Tool (APT). This manual entry left room for copy/paste transcription errors from the observer's source of orbital element data. To address this issue, APT has now been improved with the capability to identify targets in, and download orbital elements from, JPL Horizons. The observer first uses a target name resolver to choose the intended target from the Horizons database, and then downloads the orbital elements from Horizons directly into APT. A manual entry option is still retained if the observer does not wish to use elements from Horizons. This new capability is available for HST observing, and it will also be supported for JWST observing. The poster shows examples of this new interface.

  2. Observer's Interface for Solar System Target Specification

    NASA Astrophysics Data System (ADS)

    Roman, Anthony; Link, Miranda; Moriarty, Christopher; Stansberry, John A.

    2016-01-01

    When observing an asteroid or comet with HST, it has been necessary for the observer to manually enter the target's orbital elements into the Astronomer's Proposal Tool (APT). This manual entry left room for copy/paste transcription errors from the observer's source of orbital element data. To address this issue, APT has now been improved with the capability to identify targets in, and download orbital elements from, JPL Horizons. The observer first uses a target name resolver to choose the intended target from the Horizons database, and then downloads the orbital elements from Horizons directly into APT. A manual entry option is still retained if the observer does not wish to use elements from Horizons. This new capability is available for HST observing, and it will also be supported for JWST observing. The poster shows examples of this new interface.

  3. Users Guide for the Anvil Threat Corridor Forecast Tool V1.7.0 for AWIPS

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2007-01-01

    The Applied Meteorology Unit (AMU) originally developed the Anvil Threat Sector Tool for the Meteorological Interactive Data Display System (MIDDS) and delivered the capability in three phases, beginning with a feasibility study in 2000 and delivering the final operational product in December 2003. This tool is currently used operationally by the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) and Spaceflight Meteorology Group (SMG) forecasters. Phase I of the task established the technical feasibility of developing an objective, observations-based tool for short-range anvil forecasting. The AMU was subsequently tasked to develop short-term anvil forecasting tools to improve predictions of the threat of triggered lightning to space launch and landing vehicles. Under the Phase II effort, the AMU developed a nowcasting anvil threat sector tool, which provided the user with a threat sector based on the most current radiosonde upper wind data from a co-located or upstream station. The Phase II Anvil Threat Sector Tool computes the average wind speed and direction in the layer between 300 and 150 mb from the latest radiosonde for a user-designated station. The following threat sector properties are consistent with the propagation and lifetime characteristics of thunderstorm anvil clouds observed over Florida and its coastal waters (Short et al. 2002): a) 20 n mi standoff circle, b) 30 degree sector width, c) Orientation given by 300 to 150 mb average wind direction, d) 1-, 2-, and 3-hour arcs in upwind direction, and e) Arc distances given by 300 to 150 mb average wind speed. Figure 1 is an example of the MIDDS Anvil Threat Sector tool overlaid on a visible satellite image at 2132 UTC 13 May 2001. Space Launch Complex 39A was selected as the center point and the Anvil Threat Sector was determined from upper-level wind data at 1500 UTC in the preconvective environment.
Narrow thunderstorm anvil clouds extend from central Florida to the space launch and landing facilities at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) and beyond. The anvil clouds were generated around 1930 UTC (1430 EDT) by thunderstorm activity over central Florida and transported 90 n mi east-northeastward within 2 hours, as diagnosed by the anvil forecast tool. Phase III, delivered in February 2003, built upon the results of Phase II by enhancing the Anvil Threat Sector Tool with the capability to use national model forecast winds for depiction of potential anvil lengths and orientations over the KSC/CCAFS area with lead times from 3 through 168 hours (7 days). In September 2003, AMU customers requested the capability to use data from the KSC 50 MHz Doppler Radar Wind Profiler (DRWP) in the Anvil Threat Sector Tool and this capability was delivered by the AMU in December 2003. In March 2005, the AMU was tasked to migrate the MIDDS Anvil Threat Sector Tool capabilities onto the Advanced Weather Interactive Processing System (AWIPS) as the Anvil Threat Corridor Forecast Tool.
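The layer-average wind computation at the heart of the tool can be sketched as follows. The sounding format and function names are illustrative, not the MIDDS/AWIPS implementation:

```python
from math import atan2, cos, degrees, hypot, radians, sin

def layer_average_wind(levels):
    """Vector-average the wind over sounding levels given as
    (pressure_mb, speed_kt, direction_deg) tuples, keeping only the
    300-150 mb layer used by the Anvil Threat Sector Tool."""
    us, vs = [], []
    for p, spd, ddir in levels:
        if 150 <= p <= 300:
            # meteorological convention: direction is where wind comes FROM
            us.append(-spd * sin(radians(ddir)))
            vs.append(-spd * cos(radians(ddir)))
    u, v = sum(us) / len(us), sum(vs) / len(vs)
    speed = hypot(u, v)
    direction = degrees(atan2(-u, -v)) % 360.0
    return speed, direction

def arc_distances_nmi(avg_speed_kt, hours=(1, 2, 3)):
    """Upwind arc distances in nautical miles: average speed (kt) x lead time (h)."""
    return [avg_speed_kt * h for h in hours]
```

The threat sector itself is then drawn as a 30 degree wedge oriented along the average wind direction, with a 20 n mi standoff circle and these arcs marking the 1-, 2-, and 3-hour anvil transport distances.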

  4. A Few Observations and Remarks on Time Effectiveness of Interactive Electronic Testing

    ERIC Educational Resources Information Center

    Magdin, Martin; Turcáni, Milan

    2015-01-01

    In the paper, we point out several observations and remarks on time effectiveness of electronic testing, in particular of its new form (interactive tests). A test is often used as an effective didactic tool for evaluating the extent of gained cognitive capabilities. According to authors Rudman (1989) and Wang (2003) it is provable that the…

  5. SOAP-T: a tool to study the light curve and radial velocity of a system with a transiting planet and a rotating spotted star

    NASA Astrophysics Data System (ADS)

    Oshagh, M.; Boisse, I.; Boué, G.; Montalto, M.; Santos, N. C.; Bonfils, X.; Haghighipour, N.

    2013-01-01

    We present an improved version of SOAP named "SOAP-T", which can generate the radial velocity variations and light curves for systems consisting of a rotating spotted star with a transiting planet. This tool can be used to study the anomalies inside transit light curves and the Rossiter-McLaughlin effect, to better constrain the orbital configuration and properties of planetary systems and the active zones of their host stars. Tests of the code are presented to illustrate its performance and to validate its capability when compared with analytical models and real data. Finally, we apply SOAP-T to the active star, HAT-P-11, observed by the NASA Kepler space telescope and use this system to discuss the capability of this tool in analyzing light curves for the cases where the transiting planet overlaps with the star's spots. The tool's public interface is available at http://www.astro.up.pt/resources/soap-t/
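The spot-crossing anomaly SOAP-T models numerically can be illustrated with a deliberately crude box model. All parameter names and values here are hypothetical; SOAP-T itself integrates over a pixellated, limb-darkened stellar disk:

```python
def toy_transit_flux(t, t0=0.0, dur=0.1, depth=0.01,
                     spot_phase=0.02, spot_width=0.005, spot_contrast=0.5):
    """Schematic light curve: a flat-bottomed transit of the given depth,
    with a brightening 'bump' when the planet crosses a dark spot.
    While occulting a dark spot the planet blocks less light than it would
    over the unspotted photosphere, so the observed flux rises."""
    flux = 1.0
    if abs(t - t0) < dur / 2:          # planet in transit
        flux -= depth
        if abs(t - (t0 + spot_phase)) < spot_width / 2:  # planet over spot
            flux += depth * spot_contrast
    return flux
```

It is exactly this in-transit bump, superimposed on the smooth transit shape, that SOAP-T fits in cases like HAT-P-11 where the planet repeatedly crosses active regions.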

  6. Providing Observation Context via Kernel Visualization and Informatics for Planning and Data Analysis

    NASA Astrophysics Data System (ADS)

    Kidd, J. N.; Selznick, S.; Hergenrother, C. W.

    2018-04-01

    From our lessons learned and SPICE expertise, we lay out the features and capabilities of a new web-based tool to provide an accessible platform to obtain context and informatics from a planetary mission's SPICE kernels.

  7. Computing Linear Mathematical Models Of Aircraft

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.

    1991-01-01

    Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides user with powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aerodynamics of aircraft. Intended for use in software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in linear model of system. Designed to provide easy selection of state, control, and observation variables used in particular model. Also provides flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.

  8. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
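Any transmittance calculation of this kind ultimately rests on the Beer-Lambert relation between optical depth and transmitted fraction. The sketch below illustrates only that basic relation, not ATRAN's actual line-by-line radiative-transfer method; the function name and the airmass parameterization are assumptions for illustration.

```python
import numpy as np

def transmittance(optical_depth, airmass=1.0):
    """Beer-Lambert transmittance for a given zenith optical depth.

    airmass scales the path length, roughly 1/cos(zenith angle) for
    observations away from the zenith.
    """
    return np.exp(-airmass * np.asarray(optical_depth, dtype=float))

# Illustrative zenith optical depths (not real atmospheric values)
tau = np.array([0.05, 0.5, 2.0])
print(transmittance(tau))
```

A real tool such as ATRAN additionally integrates molecular absorption line data over altitude-dependent pressure, temperature, and abundance profiles before arriving at an effective optical depth per wavelength.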

  9. Real-time algorithm for acoustic imaging with a microphone array.

    PubMed

    Huang, Xun

    2009-05-01

    Acoustic phased array has become an important testing tool in aeroacoustic research, where the conventional beamforming algorithm has been adopted as a classical processing technique. The computation, however, has to be performed off-line because of its high cost. An innovative algorithm with real-time capability is proposed in this work. The algorithm is similar to a classical observer in the time domain, but is extended to the frequency domain for array processing. The observer-based algorithm is beneficial mainly for its capability of operating over sampling blocks recursively. Expensive experimental time can therefore be reduced extensively, since any defect in a test can be corrected instantaneously.
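The paper's observer-based algorithm is not reproduced here, but the block-recursive idea it exploits can be sketched against the conventional frequency-domain beamformer it improves upon. In this hedged sketch the function names, the free-field steering model, and the exponential-forgetting update of the cross-spectral matrix (CSM) are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def steering_vectors(mic_xyz, grid_xyz, freq, c=343.0):
    """Free-field steering vectors, shape (n_grid, n_mics)."""
    dists = np.linalg.norm(grid_xyz[:, None, :] - mic_xyz[None, :, :], axis=2)
    return np.exp(-2j * np.pi * freq * dists / c) / dists

def update_csm(csm, block_spectra, alpha=0.1):
    """Recursive CSM update from one sampling block's Fourier coefficients."""
    inst = np.outer(block_spectra, block_spectra.conj())
    return (1 - alpha) * csm + alpha * inst

def beamform_map(csm, steer):
    """Conventional beamformer output power w^H C w at each grid point."""
    return np.real(np.einsum('gm,mn,gn->g', steer.conj(), csm, steer))
```

Because `update_csm` needs only the current block, a source map can be refreshed after every block rather than after the whole recording, which is the property the abstract highlights for real-time operation.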

  10. Mirage: a visible signature evaluation tool

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.; Meehan, Alaster J.; Shao, Q. T.; Richards, Noel

    2017-10-01

    This paper presents the Mirage visible signature evaluation tool, designed to provide a visible signature evaluation capability that appropriately reflects the effect of scene content on the detectability of targets, providing a capability to assess visible signatures in the context of the environment. Mirage is based on a parametric evaluation of input images, assessing the value of a range of image metrics and combining them using the boosted decision tree machine learning method to produce target detectability estimates. It has been developed using experimental data from photosimulation experiments, where human observers search for vehicle targets in a variety of digital images. The images used for tool development are synthetic (computer generated) images, showing vehicles in many different scenes and exhibiting a wide variation in scene content. A preliminary validation has been performed using k-fold cross validation, where 90% of the image data set was used for training and 10% of the image data set was used for testing. The results of the k-fold validation from 200 independent tests show a correlation between Mirage predictions of detection probability and the observed probability of detection of r(262) = 0.63, p < 0.0001 (Pearson correlation), and a mean absolute error (MAE) of 0.21.
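The two validation metrics reported above are standard and easy to state precisely. A minimal sketch of both, computed between predicted and observed detection probabilities (the function names are assumed for illustration; the Mirage image metrics and its boosted-decision-tree model are not shown):

```python
import numpy as np

def pearson_r(pred, obs):
    """Pearson correlation coefficient between two equal-length sequences."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    pc, oc = pred - pred.mean(), obs - obs.mean()
    return (pc @ oc) / np.sqrt((pc @ pc) * (oc @ oc))

def mae(pred, obs):
    """Mean absolute error between predictions and observations."""
    return float(np.mean(np.abs(np.asarray(pred, float) - np.asarray(obs, float))))
```

In a k-fold setting such as the one described, these would be evaluated on each held-out 10% test split and the results pooled across the 200 independent tests.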

  11. PST and PARR: Plan specification tools and a planning and resource reasoning shell for use in satellite mission planning

    NASA Technical Reports Server (NTRS)

    Mclean, David; Yen, Wen

    1989-01-01

    Plan Specification Tools (PST) are tools that allow the user to specify satellite mission plans in terms of satellite activities, relevant orbital events, and targets for observation. The output of these tools is a set of knowledge bases and environmental events which can then be used by a Planning And Resource Reasoning (PARR) shell to build a schedule. PARR is a reactive planning shell which is capable of reasoning about actions in the satellite mission planning domain. Each of the PST tools and PARR is described, as well as the use of PARR for scheduling computer usage in the multisatellite operations control center at Goddard Space Flight Center.

  12. Design of capability measurement instruments pedagogic content knowledge (PCK) for prospective mathematics teachers

    NASA Astrophysics Data System (ADS)

    Aminah, N.; Wahyuni, I.

    2018-05-01

    The purpose of this study is to find out how to design a valid and practical instrument for measuring the Pedagogical Content Knowledge (PCK) capability of prospective mathematics teachers. The design study of this measurement instrument uses a modified Plomp development process, which consists of (1) an initial assessment stage; (2) a design stage, in which the researchers design the specification grid of the PCK capability measure; (3) a realization stage, in which the PCK capability measurement tool is constructed; and (4) a test, evaluation, and revision stage, in which the validity of the measurement tool is assessed by experts. The results show that the design of the PCK capability measurement tool is valid, as indicated by the expert validators' assessments, and that teachers and lecturers, as its intended users, strongly agree that the PCK measurement tool can be used.

  13. MODA: A Framework for Memory Centric Performance Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shrestha, Sunil; Su, Chun-Yi; White, Amanda M.

    2012-06-29

    In the age of massive parallelism, the focus of performance analysis has switched from the processor and related structures to the memory and I/O resources. Adapting to this new reality, a performance analysis tool has to provide a way to analyze resource usage to pinpoint existing and potential problems in a given application. This paper provides an overview of the Memory Observant Data Analysis (MODA) tool, a memory-centric tool first implemented on the Cray XMT supercomputer. Throughout the paper, MODA's capabilities have been showcased with experiments done on matrix multiply and Graph-500 application codes.

  14. Modeling AWSoM CMEs with EEGGL: A New Approach for Space Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Jin, M.; Manchester, W.; van der Holst, B.; Sokolov, I.; Toth, G.; Vourlidas, A.; de Koning, C. A.; Gombosi, T. I.

    2015-12-01

    The major source of destructive space weather is coronal mass ejections (CMEs). However, our understanding of CMEs and their propagation in the heliosphere is limited by insufficient observations. Therefore, the development of first-principles numerical models plays a vital role both in theoretical investigation and in providing space weather forecasts. Here, we present results of the simulation of CME propagation from the Sun to 1 AU by combining the analytical Gibson & Low (GL) flux rope model with the state-of-the-art solar wind model AWSoM. We also provide an approach for transferring this research model to a space weather forecasting tool by demonstrating how the free parameters of the GL flux rope can be prescribed based on remote observations via the new Eruptive Event Generator by Gibson-Low (EEGGL) toolkit. This capability allows us to predict the long-term evolution of the CME in interplanetary space. We perform proof-of-concept case studies to show the capability of the model to capture the physical processes that determine CME evolution while also reproducing many observed features both in the corona and at 1 AU. We discuss the potential and limitations of this model as a future space weather forecasting tool.

  15. MODIS Interactive Subsetting Tool (MIST)

    NASA Astrophysics Data System (ADS)

    McAllister, M.; Duerr, R.; Haran, T.; Khalsa, S. S.; Miller, D.

    2008-12-01

    In response to requests from the user community, NSIDC has teamed with the Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and the Moderate Resolution Data Center (MrDC) to provide time series subsets of satellite data covering stations in the Greenland Climate Network (GC-Net) and the International Arctic Systems for Observing the Atmosphere (IASOA) network. To serve these data NSIDC created the MODIS Interactive Subsetting Tool (MIST). MIST works with 7 km by 7 km subset time series of certain Version 5 (V005) MODIS products over GC-Net and IASOA stations. User-selected data are delivered in a text Comma Separated Value (CSV) file format. MIST also provides online analysis capabilities that include generating time series and scatter plots. Currently, MIST is a Beta prototype and NSIDC intends that user requests will drive future development of the tool. The intent of this poster is to introduce MIST to the MODIS data user audience and illustrate some of the online analysis capabilities.

  16. Extending the Virtual Solar Observatory (VSO) to Incorporate Data Analysis Capabilities (III)

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Etesi, L.; Dennis, B.; Zarro, D.; Schwartz, R.; Tolbert, K.

    2008-12-01

    We will present a progress report on our activities to extend the data analysis capabilities of the VSO. Our efforts to date have focused on three areas: 1. Extending the data retrieval capabilities by developing a centralized data processing server. The server is built with Java, IDL (Interactive Data Language), and the SSW (Solar SoftWare) package with all SSW-related instrument libraries and required calibration data. When a user requests VSO data that require preprocessing, the data are transparently sent to the server, processed, and returned to the user's IDL session for viewing and analysis. It is possible to have any Java or IDL client connect to the server. An IDL prototype for preparing and calibrating SOHO/EIT data will be demonstrated. 2. Improving the solar data search in SHOW SYNOP, a graphical user tool connected to the VSO in IDL. We introduce the Java-IDL interface that allows a flexible, dynamic, and extendable way of searching the VSO, where all communication with the VSO is managed dynamically by standard Java tools. 3. Improving image overlay capability to support coregistration of solar disk observations obtained from different orbital view angles, position angles, and distances - such as from the twin STEREO spacecraft.

  17. Chandra mission scheduling on-orbit experience

    NASA Astrophysics Data System (ADS)

    Bucher, Sabina; Williams, Brent; Pendexter, Misty; Balke, David

    2008-07-01

    Scheduling observatory time to maximize both day-to-day science target integration time and the lifetime of the observatory is a formidable challenge. Furthermore, it is not a static problem. Of course, every schedule brings a new set of observations, but the boundaries of the problem change as well. As a spacecraft ages, its capabilities may degrade. As in-flight experience grows, capabilities may expand. As observing programs are completed, the needs and expectations of the science community may evolve. Changes such as these impact the rules by which a mission is scheduled. In eight years on orbit, the Chandra X-Ray Observatory Mission Planning process has adapted to meet the challenge of maximizing day-to-day and mission-lifetime science return, despite a consistently evolving set of scheduling constraints. The success of the planning team has been achieved, not through the use of complex algorithms and optimization routines, but through processes and home-grown tools that help individuals make smart short-term and long-term Mission Planning decisions. This paper walks through the processes and tools used to plan and produce mission schedules for the Chandra X-Ray Observatory. Nominal planning and scheduling, target-of-opportunity response, and recovery from on-board autonomous safing actions are all addressed. Evolution of tools and processes, best practices, and lessons learned are highlighted along the way.

  18. A novel form of spontaneous tool use displayed by several captive greater vasa parrots (Coracopsis vasa).

    PubMed

    Lambert, Megan L; Seed, Amanda M; Slocombe, Katie E

    2015-12-01

    Parrots are frequently cited for their sophisticated problem-solving abilities, but cases of habitual tool use among psittacines are scarce. We report the first evidence, to our knowledge, of tool use by greater vasa parrots (Coracopsis vasa). Several members of a captive population spontaneously adopted a novel tool-using technique by using pebbles and date pits either (i) to scrape on the inner surface of seashells, subsequently licking the resulting calcium powder from the tool, or (ii) as a wedge to break off smaller pieces of the shell for ingestion. Tool use occurred most frequently just prior to the breeding season, during which time numerous instances of tool transfer were also documented. These observations provide new insights into the tool-using capabilities of parrots and highlight the greater vasa parrot as a species of interest for studies of physical cognition. © 2015 The Author(s).

  19. Semantic markup of sensor capabilities: how simple is too simple?

    NASA Astrophysics Data System (ADS)

    Rueda-Velasquez, C. A.; Janowicz, K.; Fredericks, J.

    2016-12-01

    Semantics plays a key role for the publication, retrieval, integration, and reuse of observational data across the geosciences. In most cases, one can safely assume that the providers of such data, e.g., individual scientists, understand the observation context in which their data are collected, e.g., the observation procedure used, the sampling strategy, the feature of interest being studied, and so forth. However, can we expect that the same is true for the technical details of the sensors used, and especially the nuanced changes that can impact observations in often unpredictable ways? Should the burden of annotating the sensor capabilities, firmware, operation ranges, and so forth really be part of a scientist's responsibility? Ideally, semantic annotations should be provided by the parties that understand these details and have a vested interest in maintaining these data. With manufacturers providing semantically-enabled metadata for their sensors and instruments, observations could more easily be annotated and thereby enriched using this information. Unfortunately, today's sensor ontologies and tool chains developed for the Semantic Web community require expertise beyond the knowledge and interest of most manufacturers. Consequently, knowledge engineers need to better understand the sweet spot between simple ontologies/vocabularies and sufficient expressivity, as well as the tools required to enable manufacturers to share data about their sensors. Here, we report on the current results of EarthCube's X-Domes project that aims to address the questions outlined above.

  20. Versatile Friction Stir Welding/Friction Plug Welding System

    NASA Technical Reports Server (NTRS)

    Carter, Robert

    2006-01-01

    A proposed system of tooling, machinery, and control equipment would be capable of performing any of several friction stir welding (FSW) and friction plug welding (FPW) operations. These operations would include the following: Basic FSW; FSW with automated manipulation of the length of the pin tool in real time [the so-called auto-adjustable pin-tool (APT) capability]; Self-reacting FSW (SRFSW); SR-FSW with APT capability and/or real-time adjustment of the distance between the front and back shoulders; and Friction plug welding (FPW) [more specifically, friction push plug welding] or friction pull plug welding (FPPW) to close out the keyhole of, or to repair, an FSW or SR-FSW weld. Prior FSW and FPW systems have been capable of performing one or two of these operations, but none has thus far been capable of performing all of them. The proposed system would include a common tool that would have APT capability for both basic FSW and SR-FSW. Such a tool was described in Tool for Two Types of Friction Stir Welding (MFS- 31647-1), NASA Tech Briefs, Vol. 30, No. 10 (October 2006), page 70. Going beyond what was reported in the cited previous article, the common tool could be used in conjunction with a plug welding head to perform FPW or FPPW. Alternatively, the plug welding head could be integrated, along with the common tool, into a FSW head that would be capable of all of the aforementioned FSW and FPW operations. Any FSW or FPW operation could be performed under any combination of position and/or force control.

  1. Earth: Earth Science and Health

    NASA Technical Reports Server (NTRS)

    Maynard, Nancy G.

    2001-01-01

    A major new NASA initiative on environmental change and health has been established to promote the application of Earth science remote sensing data, information, observations, and technologies to issues of human health. NASA's Earth Sciences suite of Earth observing instruments is now providing improved observations, science data, and advanced technologies about the Earth's land, atmosphere, and oceans. These new space-based resources are being combined with other agency and university resources, data integration and fusion technologies, geographic information systems (GIS), and the spectrum of tools available from the public health community, making it possible to better understand how the environment and climate are linked to specific diseases, to improve outbreak prediction, and to minimize disease risk. This presentation is an overview of NASA's tools, capabilities, and research advances in this initiative.

  2. MAGIC: Model and Graphic Information Converter

    NASA Technical Reports Server (NTRS)

    Herbert, W. C.

    2009-01-01

    MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.

  3. Use of real-time tools to support field operations of NSF's Lower Atmosphere Observing Facilities

    NASA Astrophysics Data System (ADS)

    Daniels, M.; Stossmeister, G.; Johnson, E.; Martin, C.; Webster, C.; Dixon, M.; Maclean, G.

    2012-12-01

    NCAR's Earth Observing Laboratory (EOL) operates Lower Atmosphere Observing Facilities (LAOF) for the scientific community, under sponsorship of the National Science Foundation. In order to obtain the highest quality dataset during field campaigns, real-time decision-making critically depends on the availability of timely data and reliable communications between field operations staff and instrument operators. EOL incorporates the latest technologies to monitor instrument health, facilitate remote operation of instrumentation, and keep project participants abreast of changing conditions in the field. As the availability of bandwidth on mobile communication networks and the capabilities of their associated devices (smart phones, tablets, etc.) have improved, so has the ability of researchers to respond to rapidly changing conditions and coordinate ever more detailed measurements from multiple remote fixed, portable and airborne platforms. This presentation will describe several new tools that EOL is making available to project investigators and how these tools are being used in a mobile computing environment to support enhanced data collection during field campaigns. LAOF platforms such as radars, aircraft, sondes, balloons and surface stations all rely on displays of real-time data for their operations. Data from sondes are ingested into the Global Telecommunications System (GTS) for assimilation into regional forecasting models that help guide project operations. Since many of EOL's projects occur around the globe, and instrument complexity has increased at the same time, automated monitoring of instrumentation platforms and systems has become essential. Tools are being developed to allow remote instrument control of our suite of observing systems where feasible. The Computing, Data and Software (CDS) Facility of EOL has developed and supported a Field Catalog that has been used in field campaigns for nearly two decades. 
Today, the Field Catalog serves as a hub for the collection and browsing of field research products, related operational and forecast imagery, project documentation as well as tools for real-time decision-making, communication, mission planning and post analysis. Incorporation of new capabilities into the Field Catalog to support the mobile computing environment and devices has led to the development of new tools which will be described. EOL/CDS has also developed a customized Internet Relay Chat (IRC) chat system to enable communication between all project participants distributed across various land-based, shipboard and airborne remote sites. The CDS chat system has incorporated aspects of fault tolerance in order to handle intermittent communications links. NOAA and NASA have used this chat system for their field missions as well. These new tools were recently deployed in support of the Deep Convective Clouds and Chemistry (DC3) field campaign that took place May - June 2012 in the Central United States. This presentation will show examples of these real-time tools from recent projects. We will also describe some of the challenges, problems and surprises, as well as improvements that have been made to the tools. The capabilities of this system continue to advance, taking advantage of new technology and guided by our experience and feedback from users participating in field campaigns.

  4. Process Simulation and Modeling for Advanced Intermetallic Alloys.

    DTIC Science & Technology

    1994-06-01

    calorimetry, using a Stanton Redfera/Omnitherm DOC 1500 thermal analysis system, was the primary experimental tool for this investigation...samples during both heating and cooling in a high purity argon atmosphere at a rate of 20K/min. The DSC instrumental baseline was obtained using both empty...that is capable of fitting the observed data to given cell structures using a least squares procedure. RESULTS The results of the DOC observations are

  5. Development of Creativity: The Influence of Varying Levels of Implementation of the DISCOVER Curriculum Model, a Non-Traditional Pedagogical Approach

    ERIC Educational Resources Information Center

    Maker, C. June; Jo, Sonmi; Muammar, Omar M.

    2008-01-01

    Development of creativity is influenced by multiple factors, including the environment, developmental changes, and measurement tools. In this study, we investigated the relationship between creativity development and implementation of the Discovering Intellectual Strengths and Capabilities while Observing Varied Ethnic Responses (DISCOVER)…

  6. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology, implemented via software, that serves as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and help in resolving test issues. DOEPOD relies directly on observed detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit-miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model-optimization approaches to generate a POD curve are not required. An application of DOEPOD to supporting inspector qualifications is included.
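The 90/95 POD criterion reduces, for hit/miss data, to an exact binomial confidence bound. The sketch below shows only that underlying calculation (a one-sided Clopper-Pearson lower bound found by bisection), not DOEPOD's own implementation; the function name and structure are assumptions for illustration.

```python
from math import comb

def pod_lower_bound(hits, trials, confidence=0.95, tol=1e-10):
    """One-sided lower confidence bound on probability of detection."""
    if hits == 0:
        return 0.0
    alpha = 1.0 - confidence

    def upper_tail(p):  # P(X >= hits) for X ~ Binomial(trials, p)
        return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
                   for k in range(hits, trials + 1))

    lo, hi = 0.0, 1.0
    while hi - lo > tol:           # bisect: upper_tail is increasing in p
        mid = (lo + hi) / 2
        if upper_tail(mid) < alpha:
            lo = mid
        else:
            hi = mid
    return lo

# Classic zero-miss result: 29 hits in 29 trials just demonstrates 90/95 POD.
print(pod_lower_bound(29, 29))
```

In the zero-miss case this bound has the closed form alpha**(1/n), which is why 29 consecutive hits is the smallest sample demonstrating 90% POD at 95% confidence.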

  7. Validation of the first peoples cultural capability measurement tool with undergraduate health students: A descriptive cohort study.

    PubMed

    West, Roianne; Mills, Kyly; Rowland, Dale; Creedy, Debra K

    2018-05-01

    Health professional graduates require the capacity to work safely, both clinically and culturally, when delivering care to Indigenous peoples worldwide. In the Australian context, the Aboriginal and Torres Strait Islander Health Curriculum Framework (The Framework) provides guidance for health professional programs to integrate, teach and assess Aboriginal and Torres Strait Islander peoples' (First Peoples) health content. There is, however, a lack of validated tools that measure the development of students' cultural capabilities. The aim was to validate the Cultural Capability Measurement Tool with a cohort of health professional students. A descriptive cohort design was used. All students (N = 753) enrolled in a discrete First Peoples Health course at an Australian university were invited to complete the Cultural Capability Measurement Tool. The tool was tested for reliability, for content and construct validity using confirmatory factor analysis, and for concurrent validity using the Cultural Understanding Self-Assessment Tool. A sample of 418 (73% response rate) was recruited. Most participants were enrolled in the Bachelor of Nursing program (n = 369, 82%). The Cultural Capability Measurement Tool had a Cronbach's alpha coefficient of 0.86. A five-factor solution was confirmed which reflected the cultural capability domains and accounted for 51% of the variance. Scores correlated with students' cultural understanding (r = 0.28, p < 0.001). Successful implementation of The Framework requires instruments to measure changes in students' cultural capabilities. Measuring nursing students' cultural capabilities can inform their development, identify areas of strengths and deficits for educators, and will ultimately contribute to the development of a culturally safe nursing workforce. Copyright © 2018 Elsevier Ltd. All rights reserved.
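The reported Cronbach's alpha of 0.86 is a standard internal-consistency statistic. A minimal sketch of its computation from an items-by-respondents score matrix (the instrument's items and data are not reproduced; the function name is an assumption):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha; scores is 2-D, rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1).sum() # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Alpha approaches 1 when items covary strongly (total-score variance dwarfs the summed item variances) and falls toward 0 when items are unrelated, which is why 0.86 is usually read as good internal consistency.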

  8. Multiscale Data Assimilation for Large-Eddy Simulations

    NASA Astrophysics Data System (ADS)

    Li, Z.; Cheng, X.; Gustafson, W. I., Jr.; Xiao, H.; Vogelmann, A. M.; Endo, S.; Toto, T.

    2017-12-01

    Large-eddy simulation (LES) is a powerful tool for understanding atmospheric turbulence, boundary layer physics and cloud development, and there is a great need for developing data assimilation methodologies that can constrain LES models. The U.S. Department of Energy Atmospheric Radiation Measurement (ARM) User Facility has been developing the capability to routinely generate ensembles of LES. The LES ARM Symbiotic Simulation and Observation (LASSO) project (https://www.arm.gov/capabilities/modeling/lasso) is generating simulations for shallow convection days at the ARM Southern Great Plains site in Oklahoma. One of the major objectives of LASSO is to develop the capability to observationally constrain LES using a hierarchy of ARM observations. We have implemented a multiscale data assimilation (MSDA) scheme, which allows data assimilation to be implemented separately for distinct spatial scales, so that localized observations can be effectively assimilated to constrain the mesoscale fields in the LES area of about 15 km in width. The MSDA analysis is used to produce forcing data that drive LES. With this LES workflow we have examined 13 days with shallow convection selected from the period May-August 2016. We will describe the implementation of MSDA, present LES results, and address challenges and opportunities for applying data assimilation to LES studies.

  9. Trajectory-Based Takeoff Time Predictions Applied to Tactical Departure Scheduling: Concept Description, System Design, and Initial Observations

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn A.; Capps, Alan

    2011-01-01

    Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.

  10. NEON's Mobile Deployment Platform: A research tool for integrating ecological processes across scales

    NASA Astrophysics Data System (ADS)

    Sanclements, M.

    2016-12-01

    Here we provide an update on construction of the five NEON Mobile Deployment Platforms (MDPs) as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e. timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g. fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for time periods up to 1 year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measures of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux, and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
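    The write-buffering idea described in this abstract can be sketched in a few lines. This is a hypothetical illustration, not the actual IOFSL/Vampir implementation; the class name and threshold are invented for the example:

```python
from io import BytesIO

class BufferedTraceWriter:
    """Aggregate small trace records in memory and flush them to the
    backing store in large batches, so the file system sees a few
    large writes instead of many small ones."""

    def __init__(self, backend, flush_threshold=64 * 1024):
        self.backend = backend            # any object with a write(bytes) method
        self.flush_threshold = flush_threshold
        self.buffer = BytesIO()

    def write_record(self, record: bytes):
        self.buffer.write(record)
        if self.buffer.tell() >= self.flush_threshold:
            self.flush()

    def flush(self):
        data = self.buffer.getvalue()
        if data:
            self.backend.write(data)      # one large write replaces many small ones
        self.buffer = BytesIO()
```

    The design trades a small, bounded amount of memory per writer for far fewer small I/O operations hitting the shared parallel file system.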

  12. Introducing PLIA: Planetary Laboratory for Image Analysis

    NASA Astrophysics Data System (ADS)

    Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.

    2005-08-01

    We present a graphical software tool developed under IDL to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allows image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as several image-processing procedures. Being written in IDL, it is modular and easy to modify and extend with new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: image navigation, photometric corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.

  13. A combined scanning tunneling microscope-atomic layer deposition tool.

    PubMed

    Mack, James F; Van Stockum, Philip B; Iwadate, Hitoshi; Prinz, Fritz B

    2011-12-01

    We have built a combined scanning tunneling microscope-atomic layer deposition (STM-ALD) tool that performs in situ imaging of deposition. It operates from room temperature up to 200 °C, and at pressures from 1 × 10⁻⁶ Torr to 1 × 10⁻² Torr. The STM-ALD system has a complete passive vibration isolation system that counteracts both seismic and acoustic excitations. The instrument can be used as an observation tool to monitor the initial growth phases of ALD in situ, as well as a nanofabrication tool by applying an electric field with the tip to laterally pattern deposition. In this paper, we describe the design of the tool and demonstrate its capability for atomic resolution STM imaging, atomic layer deposition, and the combination of the two techniques for in situ characterization of deposition.

  14. Going beyond the NASA Earthdata website: Reaching out to new audiences via social media and webinars

    NASA Astrophysics Data System (ADS)

    Bagwell, R.; Wong, M. M.; Brennan, J.; Murphy, K. J.; Behnke, J.

    2014-12-01

    This poster will introduce and explore the various social media efforts and monthly webinar series recently established by the National Aeronautics and Space Administration (NASA) Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. Some of the capabilities include twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), a data discovery and service access client (Reverb), dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative, and a host of other discipline specific data discovery, data access, data subsetting and visualization tools and services. We have embarked on these efforts to reach out to new audiences and potential new users and to engage our diverse end user communities world-wide. One of the key objectives is to increase awareness of the breadth of Earth science data information, services, and tools that are publicly available while also highlighting how these data and technologies enable scientific research.

  15. Future missions for observing Earth's changing gravity field: a closed-loop simulation tool

    NASA Astrophysics Data System (ADS)

    Visser, P. N.

    2008-12-01

    The GRACE mission has successfully demonstrated the observation from space of the changing Earth's gravity field at length and time scales of typically 1000 km and 10-30 days, respectively. Many scientific communities strongly advertise the need for continuity of observing Earth's gravity field from space. Moreover, a strong interest is being expressed in gravity missions that allow a more detailed sampling of the Earth's gravity field in both time and space. Designing a gravity field mission for the future is a complicated process that involves many trade-offs, such as those among spatial resolution, temporal resolution, and financial budget. Moreover, it involves the optimization of many parameters, such as orbital parameters (height, inclination), the distinction between which gravity sources to observe or correct for (for example, are gravity changes due to ocean currents a nuisance or a signal to be retrieved?), observation techniques (low-low satellite-to-satellite tracking, satellite gravity gradiometry, accelerometers), and satellite control systems (drag-free?). A comprehensive tool has been developed and implemented that allows the closed-loop simulation of gravity field retrievals for different satellite mission scenarios. This paper provides a description of this tool. Moreover, its capabilities are demonstrated by a few case studies. Acknowledgments. The research that is being done with the closed-loop simulation tool is partially funded by the European Space Agency (ESA). An important component of the tool is the GEODYN software, kindly provided by NASA Goddard Space Flight Center in Greenbelt, Maryland.

  16. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... report presents a series of short case studies designed to illustrate the capabilities of these tools for... change impacts on water. This report presents a series of short case studies using the BASINS and WEPP climate assessment tools. The case studies are designed to illustrate the capabilities of these tools for...

  17. 76 FR 4708 - Agency Information Collection Activities: Submission for OMB Review; Comment Request, OMB No...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... assess disaster logistics planning and response capabilities and identify areas of relative strength and...; Logistics Capability Assessment Tool (LCAT) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice...: Collection of Information Title: Logistics Capability Assessment Tool (LCAT). Type of Information Collection...

  18. Development of the SOFIA Image Processing Tool

    NASA Technical Reports Server (NTRS)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere, anytime, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area-of-interest locations. A tool that could both extract and process data from the archive files was developed.

  19. New Tools for New Missions - Unmanned Aircraft Systems Offer Exciting Capabilities

    NASA Astrophysics Data System (ADS)

    Bland, G.; Miles, T.; Pieri, D. C.; Coronado, P. L.; Fladeland, M. M.; Diaz, J. A.; Cione, J.; Maslanik, J. A.; Roman, M. O.; de Boer, G.; Argrow, B. M.; Novara, J.; Stachura, M.; Neal, D.; Moisan, J. R.

    2015-12-01

    There are numerous emerging possibilities for utilizing unmanned aircraft systems (UAS) to investigate a variety of natural hazards, both for prediction and for analysis of specific events. Additionally, quick-response capabilities will provide affordable, low-risk support for emergency management teams. NASA's partnerships with commercial, university, and other government agency teams are bringing new capabilities to research and emergency management communities. New technology platforms and instrument systems are gaining momentum for stand-off remote sensing observations, as well as penetration and detailed in-situ examination of natural and anthropogenic phenomena. Several pioneering investigations have provided the foundation for this development, including NASA projects with Aerosonde, Dragon Eye, and SIERRA platforms. With miniaturized instrument and platform technologies, these experiments demonstrated that previously unobtainable observations may significantly aid in the understanding, prediction, and assessment of natural hazards such as storms, volcanic eruptions, floods, and the potential impact of environmental changes. Remote sensing observations of storms and fires have also been successfully demonstrated through NASA's efforts with larger UAS such as the Global Hawk and Ikhana platforms. The future may unfold with new high-altitude and/or long-endurance capabilities, in some cases with smaller size and cost as payload capacity requirements are reduced through further miniaturization, and alternatively with expanded instrumentation and mission profiles. Several new platform and instrument development projects are underway that will enable affordable, quick-response observations. Additionally, distributed measurements that provide near-simultaneous coverage at multiple locations will be possible - an exciting new mission concept that will greatly aid many observation scenarios.
Partnerships with industry, academia, and other government agencies are all making significant contributions to these new capabilities.

  20. Modeling and Visualizing Flow of Chemical Agents Across Complex Terrain

    NASA Technical Reports Server (NTRS)

    Kao, David; Kramer, Marc; Chaderjian, Neal

    2005-01-01

    Release of chemical agents across complex terrain presents a real threat to homeland security. Modeling and visualization tools are being developed that capture fluid flow-terrain interaction as well as point-source dispersal along downstream flow paths. These analytic tools, when coupled with UAV atmospheric observations, provide predictive capabilities that allow for rapid emergency response as well as for developing a comprehensive preemptive counter-threat evacuation plan. The visualization tools involve high-end computing and massively parallel processing combined with texture mapping. We demonstrate our approach across a mountainous portion of Northern California under two contrasting meteorological conditions. Animations depicting flow over this geographical location provide immediate assistance in decision support and crisis management.

  1. Tool making, hand morphology and fossil hominins.

    PubMed

    Marzke, Mary W

    2013-11-19

    Was stone tool making a factor in the evolution of human hand morphology? Is it possible to find evidence in fossil hominin hands for this capability? These questions are being addressed with increasingly sophisticated studies that are testing two hypotheses: (i) that humans have unique patterns of grip and hand movement capabilities compatible with effective stone tool making and use of the tools and, if this is the case, (ii) that there exist unique patterns of morphology in human hands that are consistent with these capabilities. Comparative analyses of human stone tool behaviours and chimpanzee feeding behaviours have revealed a distinctive set of forceful pinch grips by humans that are effective in the control of stones by one hand during manufacture and use of the tools. Comparative dissections, kinematic analyses and biomechanical studies indicate that humans do have a unique pattern of muscle architecture and joint surface form and functions consistent with the derived capabilities. A major remaining challenge is to identify skeletal features that reflect the full morphological pattern, and therefore may serve as clues to fossil hominin manipulative capabilities. Hominin fossils are evaluated for evidence of patterns of derived human grip and stress-accommodation features.

  2. Tool making, hand morphology and fossil hominins

    PubMed Central

    Marzke, Mary W.

    2013-01-01

    Was stone tool making a factor in the evolution of human hand morphology? Is it possible to find evidence in fossil hominin hands for this capability? These questions are being addressed with increasingly sophisticated studies that are testing two hypotheses: (i) that humans have unique patterns of grip and hand movement capabilities compatible with effective stone tool making and use of the tools and, if this is the case, (ii) that there exist unique patterns of morphology in human hands that are consistent with these capabilities. Comparative analyses of human stone tool behaviours and chimpanzee feeding behaviours have revealed a distinctive set of forceful pinch grips by humans that are effective in the control of stones by one hand during manufacture and use of the tools. Comparative dissections, kinematic analyses and biomechanical studies indicate that humans do have a unique pattern of muscle architecture and joint surface form and functions consistent with the derived capabilities. A major remaining challenge is to identify skeletal features that reflect the full morphological pattern, and therefore may serve as clues to fossil hominin manipulative capabilities. Hominin fossils are evaluated for evidence of patterns of derived human grip and stress-accommodation features. PMID:24101624

  3. Increases in efficiency and enhancements to the Mars Observer non-stored commanding process

    NASA Technical Reports Server (NTRS)

    Brooks, Robert N., Jr.; Torgerson, J. Leigh

    1994-01-01

    The Mars Observer team was, until the untimely loss of the spacecraft on August 21, 1993, performing flight operations with greater efficiency and speed than any previous JPL mission of its size. This level of throughput was made possible by a mission operations system composed of skilled personnel using sophisticated sequencing and commanding tools. During cruise flight operations, however, the project realized that this commanding level was not going to be sufficient to support the activities planned for mapping operations. The project had committed to providing the science instrument principal investigators with a much higher level of commanding during mapping. Thus, the project began taking steps to enhance the capabilities of the flight team. One mechanism used by project management was a tool available from total quality management (TQM), known as a process action team (PAT). The Mars Observer PAT was tasked to increase the capacity of the flight team's non-stored commanding process by fifty percent, with no increase in staffing and a minimal increase in risk. The outcome of this effort was, in fact, to increase the capacity by a factor of 2.5 rather than the desired fifty percent, and actually to reduce risk. The majority of these improvements came from the automation of the existing command process. These results required very few changes to the existing mission operations system. Rather, the PAT was able to take advantage of automation capabilities inherent in the existing system and make changes to the existing flight team procedures.

  4. Interoperability science cases with the CDPP tools

    NASA Astrophysics Data System (ADS)

    Nathanaël, J.; Cecconi, B.; André, N.; Bouchemit, M.; Gangloff, M.; Budnik, E.; Jacquey, C.; Pitout, F.; Durand, J.; Rouillard, A.; Lavraud, B.; Genot, V. N.; Popescu, D.; Beigbeder, L.; Toniutti, J. P.; Caussarieu, S.

    2017-12-01

    Data exchange protocols are never as efficient as when they are invisible for the end user who is then able to discover data, to cross compare observations and modeled data and finally to perform in depth analysis. Over the years these protocols, including SAMP from IVOA, EPN-TAP from the Europlanet 2020 RI community, backed by standard web-services, have been deployed in tools designed by the French Centre de Données de la Physique des Plasmas (CDPP) including AMDA, the Propagation Tool, 3DView, ... . This presentation will focus on science cases which show the capability of interoperability in the planetary and heliophysics contexts, involving both CDPP and companion tools. Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.

  5. Providing Common Access Mechanisms for Dissimilar Network Interconnection Nodes

    DTIC Science & Technology

    1991-02-01

    Network management involves both maintaining adequate data transmission capabilities in the face of growing and changing needs and keeping the network...Display Only tools are able to obtain information from an IN or a set of INs and display this information, but are not able to change the...configuration or state of an IN. 2. Display and Control tools have the same capabilities as Display Only tools, but in addition are capable of changing the

  6. Geospatial Data as a Service: The GEOGLAM Rangelands and Pasture Productivity Map Experience

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Antony, J.; Guerschman, J. P.; Larraondo, P. R.; Richards, C. J.

    2017-12-01

    Empowering end-users like pastoralists, land management specialists, and land policy makers in the use of earth observation data, for both day-to-day and seasonal planning, requires interactive delivery of multiple geospatial datasets and the capability of supporting on-the-fly dynamic queries, while simultaneously fostering a community around the effort. The use and wide adoption of large data archives, like those produced by earth observation missions, are often limited by the compute and storage capabilities of the remote user. We demonstrate that wide-scale use of large data archives can be facilitated by end-users dynamically requesting value-added products using open standards (WCS, WMS, WPS), with compute running in the cloud or dedicated data-centres and visualizing outputs on web front-ends. As an example, we will demonstrate how a tool called GSKY can empower a remote end-user by providing the data delivery and analytics capabilities for the GEOGLAM Rangelands and Pasture Productivity (RAPP) Map tool. The GEOGLAM RAPP initiative from the Group on Earth Observations (GEO) and its Agricultural Monitoring subgroup aims at providing practical tools to end-users, focusing on the important role of rangelands and pasture systems in providing food production security from both agricultural crops and animal protein. Figure 1 is a screen capture from the RAPP Map interface for an important pasture area in the Namibian rangelands. The RAPP Map has been in production for six months and has garnered significant interest from groups and users all over the world. GSKY, formulated around the theme of Open Geospatial Data-as-a-Service capabilities, uses distributed computing and storage to facilitate this. It works behind the scenes, accepting OGC standard requests in WCS, WMS and WPS. Results from these requests are rendered on a web front-end. In this way, the complexities of data locality and compute execution are masked from an end user.
    On-the-fly computation of products such as NDVI, Leaf Area Index, vegetation cover, and others from original source data, including MODIS, is achieved, with Landsat and Sentinel-2 on the horizon. Innovative use of cloud computing and storage, along with flexible front-ends, allows the democratization of data dissemination and, we hope, better outcomes for the planet.
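    The open-standards access pattern this record describes amounts to assembling OGC requests. A minimal sketch of building a WMS 1.3.0 GetMap URL follows; the endpoint, layer name, and parameter values are placeholders for illustration, not GSKY's actual service addresses:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   time=None, crs="EPSG:4326", fmt="image/png"):
    """Assemble an OGC WMS 1.3.0 GetMap request URL.
    `base_url` and `layer` are hypothetical placeholders."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # min/max coords per the CRS axis order
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    if time:
        params["TIME"] = time                    # time-enabled layers only
    return base_url + "?" + urlencode(params)

# example request for a hypothetical "ndvi" layer over a small region
url = wms_getmap_url("https://example.org/ows", "ndvi",
                     (-30.0, 10.0, -20.0, 25.0), 512, 512,
                     time="2017-06-01")
```

    Because the request is a plain URL, any web front-end can fetch and render the result without the end user downloading or processing the source archive.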

  7. Development and Application of a Tool for Optimizing Composite Matrix Viscoplastic Material Parameters

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.

    2018-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC), and its application. MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed utilizing the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. The illustrative examples shown are for a specific braided composite system wherein the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, are left to the user's choice. Three different optimization algorithms are included: (1) gradient-based optimization, (2) genetic algorithm (GA) based optimization, and (3) particle swarm optimization (PSO). The user can mix and match the three algorithms; for example, one can start optimization with either (2) or (3) and then use the optimized solution to further fine-tune with approach (1). The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (at constituent and composite levels) at different rates, temperatures and/or loading conditions utilizing the Generalized Method of Cells.
After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is presented wherein the combined effects of temperature and loading rate on the predicted response of a braided composite are investigated.
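    As an illustration of the third option mentioned above, a bare-bones particle swarm optimizer can fit model parameters to an observed curve by minimizing a least-squares loss. This is a generic sketch, not MAC/GMC code; the seven-parameter viscoplastic law is replaced by a toy two-parameter power law, and all names and coefficients are invented:

```python
import random

def pso_fit(loss, bounds, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer: minimize loss(params) over a
    box given by bounds = [(lo, hi), ...]. Standard inertia/cognitive/
    social update with positions clamped to the bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# toy usage: fit sigma = a * eps**b to a synthetic "observed" curve
eps = [0.01 * i for i in range(1, 11)]
obs = [3.0 * e ** 0.5 for e in eps]
loss = lambda p: sum((p[0] * e ** p[1] - o) ** 2 for e, o in zip(eps, obs))
params, err = pso_fit(loss, [(0.1, 10.0), (0.1, 2.0)])
```

    Weighting individual stress-strain responses, as the tool allows, would simply multiply each squared-error term by a user-chosen importance factor.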

  8. Developing Toolsets for AirBorne Data (TAD): Overview of Design Concept

    NASA Astrophysics Data System (ADS)

    Parker, L.; Perez, J.; Chen, G.; Benson, A.; Peeters, M. C.

    2013-12-01

    NASA has conducted airborne tropospheric chemistry studies for about three decades. These field campaigns have generated a great wealth of observations, including a wide range of the trace gases and aerosol properties. Even though the spatial and temporal coverage is limited, the aircraft data offer high resolution and comprehensive simultaneous coverage of many variables, e.g. ozone precursors, intermediate photochemical species, and photochemical products. The recent NASA Earth Venture Program has generated an unprecedented amount of aircraft observations in terms of the sheer number of measurements and data volume. The ASDC Toolset for Airborne Data (TAD) is being designed to meet the user community needs for aircraft data for scientific research on climate change and air quality relevant issues, particularly: 1) Provide timely access to a broad user community, 2) Provide an intuitive user interface to facilitate quick discovery of the variables and data, 3) Provide data products and tools to facilitate model assessment activities, e.g., merge files and data subsetting capabilities, 4) Provide simple utility 'calculators', e.g., unit conversion and aerosol size distribution processing, and 5) Provide Web Coverage Service capable tools to enhance the data usability. The general strategy and design of TAD will be presented.
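    One of the simple utility "calculators" mentioned above, unit conversion, might look like the following sketch. The function name and defaults are illustrative assumptions, not TAD's actual API; the formula itself is the standard ideal-gas conversion from a ppbv mixing ratio to a mass concentration:

```python
def ppb_to_ugm3(ppb, molar_mass_g_mol, temp_k=298.15, pressure_pa=101325.0):
    """Convert a trace-gas mixing ratio (ppbv) to a mass concentration
    (micrograms per cubic metre) using the ideal gas law."""
    R = 8.314462618                               # universal gas constant, J/(mol K)
    mol_air_per_m3 = pressure_pa / (R * temp_k)   # moles of air per cubic metre
    # ppbv -> mole fraction -> mol of gas per m^3 -> g/m^3 -> ug/m^3
    return ppb * 1e-9 * mol_air_per_m3 * molar_mass_g_mol * 1e6
```

    For example, 50 ppbv of ozone (48 g/mol) at 25 °C and 1 atm corresponds to roughly 98 µg/m³.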

  9. Reducing the Schizophrenia Stigma: A New Approach Based on Augmented Reality

    PubMed Central

    Silva, Rafael D. de C.; Albuquerque, Saulo G. C.; Muniz, Artur de V.; Filho, Pedro P. Rebouças; Ribeiro, Sidarta

    2017-01-01

    Schizophrenia is a chronic mental disease that usually manifests psychotic symptoms and affects an individual's functionality. The stigma related to this disease is a serious obstacle to an adequate approach to its treatment. Stigma can, for example, delay the start of treatment, and it creates difficulties in interpersonal and professional relationships. This work proposes a new tool based on augmented reality to reduce the stigma related to schizophrenia. The tool is capable of simulating the psychotic symptoms typical of schizophrenia and simulates sense perception changes in order to create an immersive experience capable of generating the pathological experiences of a patient with schizophrenia. The integration into the proposed environment occurs through immersion glasses and an embedded camera. Audio and visual effects can also be applied in real time. To validate the proposed environment, medical students experienced the virtual environment and then answered three questionnaires to assess (i) stigmas related to schizophrenia, (ii) the efficiency and effectiveness of the tool, and, finally, (iii) stigma after simulation. The analysis of the questionnaires showed that the proposed model is a robust and quite realistic tool and, thus, very promising in reducing the stigma associated with schizophrenia by instilling in the observer a greater comprehension of a person during a schizophrenic episode, whether a patient or a family member. PMID:29317860

  10. Exploration Medical System Trade Study Tools Overview

    NASA Technical Reports Server (NTRS)

    Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.

    2018-01-01

    ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. Initially, the presentation's focus will address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information about how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, the talk will cover capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., a periodic physical exam), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements, and of per-capability contributions toward achieving requirements.

  11. High-resolution imaging of living mammalian cells bound by nanobeads-connected antibodies in a medium using scanning electron-assisted dielectric microscopy

    NASA Astrophysics Data System (ADS)

    Okada, Tomoko; Ogura, Toshihiko

    2017-02-01

    Nanometre-scale-resolution imaging technologies for liquid-phase specimens are indispensable tools in various scientific fields. In biology, observing untreated living cells in a medium is essential for analysing cellular functions. However, nanoparticles that bind living cells in a medium are hard to detect directly using traditional optical or electron microscopy. Therefore, we previously developed a novel scanning electron-assisted dielectric microscope (SE-ADM) capable of nanoscale observations. This method enables observation of intact cells in aqueous conditions. Here, we use this SE-ADM system to clearly observe antibody-bound nanobeads in the liquid phase. We also report the successful direct detection of streptavidin-conjugated nanobeads binding to untreated cells in a medium via a biotin-conjugated anti-CD44 antibody. Our system is capable of obtaining clear images of cellular organelles and beads on the cells at the same time. The direct observation of living cells with nanoparticles in a medium allowed by our system may contribute to the development of carriers for drug delivery systems (DDS).

  12. KARMA: the observation preparation tool for KMOS

    NASA Astrophysics Data System (ADS)

    Wegner, Michael; Muschielok, Bernard

    2008-08-01

    KMOS is a multi-object integral field spectrometer working in the near infrared which is currently being built for the ESO VLT by a consortium of UK and German institutes. It is capable of selecting up to 24 target fields for integral field spectroscopy simultaneously by means of 24 robotic pick-off arms. For the preparation of observations with KMOS a dedicated preparation tool KARMA ("KMOS Arm Allocator") will be provided which optimizes the assignment of targets to these arms automatically, thereby taking target priorities and several mechanical and optical constraints into account. For this purpose two efficient algorithms, both being able to cope with the underlying optimization problem in a different way, were developed. We present the concept and architecture of KARMA in general and the optimization algorithms in detail.
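    The arm-allocation problem KARMA solves can be illustrated with a greedy sketch: assign targets in priority order to the first free arm whose constraints allow it. This is a simplification for illustration only; KARMA's actual algorithms and mechanical/optical constraint checks are more sophisticated, and every name below is invented:

```python
def assign_targets(targets, arms, reachable):
    """Greedy allocation: visit targets in descending priority and give
    each the first free arm that can reach it. `reachable(arm, target)`
    stands in for the real mechanical and optical constraint checks."""
    free = set(arms)
    assignment = {}
    for t in sorted(targets, key=lambda t: -t["priority"]):
        for a in arms:
            if a in free and reachable(a, t):
                assignment[t["name"]] = a
                free.discard(a)
                break           # target assigned; move to the next one
    return assignment
```

    A real allocator must also handle arm-arm collision constraints and may need backtracking or global optimization, since a greedy pass can strand reachable targets; that is precisely why KARMA implements dedicated optimization algorithms.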

  13. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. A survey of the tools currently available will be assessed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  14. VizieR Online Data Catalog: James Clerk Maxwell Telescope Science Archive (CADC, 2003)

    NASA Astrophysics Data System (ADS)

    Canadian Astronomy Data Centre

    2018-01-01

    The JCMT Science Archive (JSA), a collaboration between the CADC and EOA, is the official distribution site for observational data obtained with the James Clerk Maxwell Telescope (JCMT) on Mauna Kea, Hawaii. The JSA search interface is provided by the CADC Search tool, which provides generic access to the complete set of telescopic data archived at the CADC. Help on the use of this tool is provided via tooltips. For additional information on instrument capabilities and data reduction, please consult the SCUBA-2 and ACSIS instrument pages provided on the JAC maintained JCMT pages. JCMT-specific help related to the use of the CADC AdvancedSearch tool is available from the JAC. (1 data file).

  15. The nature and evaluation of commercial expert system building tools, revision 1

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1987-01-01

    This memorandum reviews the factors that constitute an Expert System Building Tool (ESBT) and evaluates current tools in terms of these factors. Evaluation of these tools is based on their structure and their alternative forms of knowledge representation, inference mechanisms and developer end-user interfaces. Next, functional capabilities, such as diagnosis and design, are related to alternative forms of mechanization. The characteristics and capabilities of existing commercial tools are then reviewed in terms of these criteria.

  16. Using synchrotron light to accelerate EUV resist and mask materials learning

    NASA Astrophysics Data System (ADS)

    Naulleau, Patrick; Anderson, Christopher N.; Baclea-an, Lorie-Mae; Denham, Paul; George, Simi; Goldberg, Kenneth A.; Jones, Gideon; McClinton, Brittany; Miyakawa, Ryan; Mochi, Iacopo; Montgomery, Warren; Rekawa, Seno; Wallow, Tom

    2011-03-01

    As commercialization of extreme ultraviolet lithography (EUVL) progresses, direct industry activities are being focused on near-term concerns. The question of long-term extendibility of EUVL, however, remains crucial given the magnitude of the investments yet required to make EUVL a reality. Extendibility questions are best addressed using advanced research tools such as the SEMATECH Berkeley microfield exposure tool (MET) and actinic inspection tool (AIT). Utilizing Lawrence Berkeley National Laboratory's Advanced Light Source facility as the light source, these tools benefit from the unique properties of synchrotron light, enabling research at nodes generations ahead of what is possible with commercial tools. The MET, for example, uses extremely bright undulator radiation to enable a lossless, fully programmable coherence illuminator. Using such a system, resolution-enhancing illuminations achieving k1 factors of 0.25 can readily be attained. Given the MET numerical aperture of 0.3, this translates to an ultimate resolution capability of 12 nm. Using such methods, the SEMATECH Berkeley MET has demonstrated resolution in resist to 16-nm half pitch and below in an imageable spin-on hard mask. At a half pitch of 16 nm, this material achieves a line-edge roughness of 2 nm with a correlation length of 6 nm. These new results demonstrate that the observed stall in ultimate resolution progress in chemically amplified resists is a materials issue rather than a tool limitation. With a resolution limit of 20-22 nm, the CAR champion from 2008 remains the highest-performing CAR tested to date. To enable continued advanced learning in EUV resists, SEMATECH has initiated a plan to implement a 0.5 NA microfield tool at the Advanced Light Source synchrotron facility. This tool will be capable of printing down to 8-nm half pitch.
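
    The quoted numbers follow from the standard lithographic half-pitch relation hp = k1 · λ / NA (one assumption here: the 13.5 nm EUV wavelength, which the abstract does not state explicitly):

```python
# Half-pitch resolution: hp = k1 * wavelength / NA.
# Assumes the standard 13.5 nm EUV wavelength (not stated in the abstract).

def half_pitch_nm(k1, wavelength_nm=13.5, na=0.3):
    return k1 * wavelength_nm / na

current = half_pitch_nm(0.25)          # NA = 0.3: ~11 nm, consistent with the quoted 12 nm
future = half_pitch_nm(0.25, na=0.5)   # planned 0.5 NA tool: ~7 nm half pitch
```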

  17. Building oceanographic and atmospheric observation networks by composition: unmanned vehicles, communication networks, and planning and execution control frameworks

    NASA Astrophysics Data System (ADS)

    Sousa, J. T.; Pinto, J.; Martins, R.; Costa, M.; Ferreira, F.; Gomes, R.

    2014-12-01

    The problem of developing mobile oceanographic and atmospheric observation networks (MOAO) with coordinated air and ocean vehicles is discussed in the framework of the communications and control software tool chain developed at the Underwater Systems and Technologies Laboratory (LSTS) of Porto University. This is done with reference to field experiments that illustrate key capabilities and allow an assessment of future MOAO operations. First, the motivation for building MOAO by "composition" of air and ocean vehicles, communication networks, and planning and execution control frameworks is discussed: in networked vehicle systems, information and commands are exchanged among multiple vehicles and operators, and the roles, relative positions, and dependencies of these vehicles and operators change during operations. Second, the planning and execution control framework developed at LSTS for multi-vehicle systems is discussed with reference to key concepts such as autonomy, mixed-initiative interactions, and layered organization. Third, the LSTS software tool chain is presented to show how to develop MOAO by composition. The tool chain comprises the Neptus command and control framework for mixed-initiative interactions, the underlying IMC messaging protocol, and the DUNE on-board software. Fourth, selected LSTS operational deployments illustrate MOAO capability building. In 2012 we demonstrated the use of UAS to "ferry" data from UUVs located beyond line of sight (BLOS). In 2013 we demonstrated coordinated observations of coastal fronts with small UAS and UUVs, "bent" BLOS through the use of UAS as communication relays, and UAS tracking of juvenile hammerhead sharks. In 2014 we demonstrated UUV adaptive sampling with the closed-loop controller of the UUV residing on a UAS; this was done with the help of a Wave Glider ASV with a communications gateway. The results from these experiments provide a background for assessing potential future UAS operations in a compositional MOAO.

  18. Field observations using an AOTF polarimetric imaging spectrometer

    NASA Technical Reports Server (NTRS)

    Cheng, Li-Jen; Hamilton, Mike; Mahoney, Colin; Reyes, George

    1993-01-01

    This paper reports preliminary results of recent field observations using a prototype acousto-optic tunable filter (AOTF) polarimetric imaging spectrometer. The data illustrate potential applications in geoscience. The operating principle of this instrument differs from that of current airborne multispectral imaging instruments, such as AVIRIS. The AOTF instrument takes two orthogonally polarized images at a desired wavelength at one time, whereas AVIRIS takes a spectrum over a predetermined wavelength range at one pixel at a time, with the image constructed later. AVIRIS has no polarization measuring capability, so the AOTF instrument could serve as a complementary tool to AVIRIS. Polarization measurement is a desired capability for many applications in remote sensing. It is well known that natural light is often polarized due to various scattering phenomena in the atmosphere. Also, scattered light from canopies is reported to have a polarized component. Correctly characterizing objects of interest requires a remote sensing imaging spectrometer capable of measuring object signal and background radiation in both intensity and polarization so that the characteristics of the object can be determined. The AOTF instrument has this capability. The AOTF instrument has other unique properties. For example, it can provide spectral images immediately after the observation. The instrument also allows observations to be tailored in real time to perform the desired experiments and to collect only the required data. Consequently, the performance of each mission can be increased with minimal resources. The prototype instrument was completed at the beginning of this year. A number of outdoor field experiments were performed with the objective of evaluating the capability of this new technology for remote sensing applications and determining issues for further improvements.

  19. A propagation tool to connect remote-sensing observations with in-situ measurements of heliospheric structures

    NASA Astrophysics Data System (ADS)

    Rouillard, A. P.; Lavraud, B.; Génot, V.; Bouchemit, M.; Dufourg, N.; Plotnikov, I.; Pinto, R. F.; Sanchez-Diaz, E.; Lavarra, M.; Penou, M.; Jacquey, C.; André, N.; Caussarieu, S.; Toniutti, J.-P.; Popescu, D.; Buchlin, E.; Caminade, S.; Alingery, P.; Davies, J. A.; Odstrcil, D.; Mays, L.

    2017-11-01

    The remoteness of the Sun and the harsh conditions prevailing in the solar corona have so far limited the observational data used in the study of solar physics to remote-sensing observations taken either from the ground or from space. In contrast, the 'solar wind laboratory' is directly measured in situ by a fleet of spacecraft measuring the properties of the plasma and magnetic fields at specific points in space. Since 2007, the solar-terrestrial relations observatory (STEREO) has been providing images of the solar wind that flows between the solar corona and spacecraft making in-situ measurements. This has allowed scientists to directly connect processes imaged near the Sun with the subsequent effects measured in the solar wind. This new capability prompted the development of a series of tools and techniques to track heliospheric structures through space. This article presents one of these tools, a web-based interface called the 'Propagation Tool' that offers an integrated research environment to study the evolution of coronal and solar wind structures, such as Coronal Mass Ejections (CMEs), Corotating Interaction Regions (CIRs) and Solar Energetic Particles (SEPs). These structures can be propagated outwards from the Sun to planets and spacecraft situated in the inner and outer heliosphere, or alternatively propagated inwards from them. In this paper, we present the global architecture of the tool, discuss some of the assumptions made to simulate the evolution of the structures and show how the tool connects to different databases.
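
    As a minimal sketch of what such propagation amounts to in its simplest form, consider a constant-speed radial (ballistic) estimate; this is illustrative only, as the Propagation Tool itself supports more elaborate kinematic models:

```python
# Constant-speed radial propagation from the Sun to a target heliocentric
# distance. Purely illustrative of the simplest possible assumption.

AU_KM = 1.496e8  # mean Sun-Earth distance in km

def arrival_time_hours(distance_au, speed_km_s):
    """Travel time (hours) of a structure moving radially at constant speed."""
    return distance_au * AU_KM / speed_km_s / 3600.0

slow_cme = arrival_time_hours(1.0, 500.0)    # ~83 hours (~3.5 days) to 1 AU
fast_cme = arrival_time_hours(1.0, 2000.0)   # ~21 hours
```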

  20. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, J. P.; Atkinson, M.; Spinuso, A.; Rietbrock, A.; Michelini, A.; Igel, H.; Frank, A.; Carpené, M.; Schwichtenberg, H.; Casarotti, E.; Filgueira, R.; Garth, T.; Germünd, A.; Klampanos, I.; Krause, A.; Krischer, L.; Leong, S. H.; Magnoni, F.; Matser, J.; Moguilny, G.

    2015-12-01

    Seismology addresses both fundamental problems in understanding the Earth's internal wave sources and structures and societal applications, such as earthquake and tsunami hazard assessment and risk mitigation; it therefore puts a premium on open data accessible through the Federated Digital Seismological Networks. The VERCE project, "Virtual Earthquake and seismology Research Community e-science environment in Europe", has initiated a virtual research environment to support complex orchestrated workflows combining state-of-the-art wave simulation codes and data analysis tools on distributed computing and data infrastructures (DCIs), along with multiple sources of observational data and new capabilities to combine simulation results with observational data. The VERCE Science Gateway provides a view of all the available resources, supporting collaboration with shared data and methods, with data access controls. The mapping to DCIs handles identity management, authority controls, transformations between representations and controls, and access to resources. The framework for computational science that provides simulation codes, like SPECFEM3D, democratizes their use by getting data from multiple sources, managing Earth models and meshes, distilling them as input data, and capturing results with metadata. The dispel4py data-intensive framework allows data-analysis applications to be developed using Python and the ObsPy library and executed on different DCIs. A set of tools allows coupling with seismology and external data services. Provenance-driven tools validate results and show relationships between data to facilitate method improvement. Lessons learned from VERCE training lead us to conclude that solid-Earth scientists could make significant progress by using the VERCE e-science environment. VERCE has already contributed to the European Plate Observation System (EPOS) and is part of the EPOS implementation phase, for which its cross-disciplinary capabilities are being extended.

  1. AstroNet: A Tool Set for Simultaneous, Multi-Site Observations of Astronomical Objects

    NASA Technical Reports Server (NTRS)

    Chakrabarti, Supriya

    1995-01-01

    Earth-based, fully automatic "robotic" telescopes have been in routine operation for a number of years. As their number grows and their distribution becomes global, increasing attention is being given to forming networks of various sorts that will allow them, as a group, to make observations 24 hours a day in both hemispheres. We have suggested that telescopes based in space be part of this network. We further suggested that any telescope on this network be capable of asking, almost in real time, that other robotic telescopes perform support observations for it. When a target of opportunity required support observations, the system would determine which telescope(s) in the network would be most appropriate to make the observations and formulate a request to do so. Because the network would comprise telescopes located in widely distributed regions, this system would guarantee continuity of observations. This report summarizes our efforts under this contract. We proposed to develop a set of data collection and display tools to aid simultaneous observation of astronomical targets from a number of observing sites. We planned to demonstrate the usefulness of this toolset for simultaneous multi-site observation of astronomical targets. Possible candidates for the proposed demonstration included the Extreme Ultraviolet Explorer (EUVE), the International Ultraviolet Explorer (IUE), ALEXIS, and sounding rocket experiments. Ground-based observatories operated by the University of California, Berkeley, the Jet Propulsion Laboratory, and the Fairborn Observatory in Mesa, Arizona were to be used to demonstrate the proposed concept. Although the demonstration was to have involved astronomical investigations, the tools were to have been applicable to a large number of scientific disciplines. The software tools and systems developed as a result of the work were to have been made available to the scientific community.

  2. Building a Massive Volcano Archive and the Development of a Tool for the Science Community

    NASA Technical Reports Server (NTRS)

    Linick, Justin

    2012-01-01

    The Jet Propulsion Laboratory has traditionally housed one of the world's largest databases of volcanic satellite imagery, the ASTER Volcano Archive (10 TB), making these data accessible online for public and scientific use. However, a series of changes in how satellite imagery is housed by the Earth Observing System (EOS) Data Information System has meant that JPL has been unable to systematically maintain its database for the last several years. We have provided a fast, transparent, machine-to-machine client that has updated JPL's database and will keep it current in near real-time. The development of this client has also given us the capability to retrieve any data provided by NASA's Earth Observing System Clearinghouse (ECHO) that covers a volcanic event reported by the U.S. Air Force Weather Agency (AFWA). We will also provide a publicly available tool that interfaces with ECHO and provides functionality not available in any of ECHO's Earth science discovery tools.

  3. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
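
    For a flavor of the parametric techniques such tools build on, one common ingredient (a generic example, not necessarily the model used in this paper) is a learning-curve adjustment for producing many near-identical spacecraft:

```python
import math

# Generic learning-curve production cost, a standard ingredient of
# parametric space-cost models (e.g. SMAD-style estimates); illustrative
# only, not the specific tool described in the paper. With slope
# `learning`, the cost of the 2N-th unit is `learning` times the cost
# of the N-th unit.

def constellation_cost(first_unit_cost, n_units, learning=0.95):
    b = math.log(learning) / math.log(2.0)  # b <= 0 for learning < 1
    return first_unit_cost * sum(i ** b for i in range(1, n_units + 1))
```

    A single-spacecraft model applied naively would charge `n_units` times the first-unit cost; the learning-curve sum is one reason constellation costs do not simply scale linearly, which motivates the constellation-level considerations the paper surveys.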

  4. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications.
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  5. An enhanced MMW and SMMW/THz imaging system performance prediction and analysis tool for concealed weapon detection and pilotage obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Murrill, Steven R.; Jacobs, Eddie L.; Franck, Charmaine C.; Petkie, Douglas T.; De Lucia, Frank C.

    2015-10-01

    The U.S. Army Research Laboratory (ARL) has continued to develop and enhance a millimeter-wave (MMW) and submillimeter-wave (SMMW)/terahertz (THz)-band imaging system performance prediction and analysis tool for both the detection and identification of concealed weaponry, and for pilotage obstacle avoidance. The details of the MATLAB-based model which accounts for the effects of all critical sensor and display components, for the effects of atmospheric attenuation, concealment material attenuation, and active illumination, were reported on at the 2005 SPIE Europe Security and Defence Symposium (Brugge). An advanced version of the base model that accounts for both the dramatic impact that target and background orientation can have on target observability as related to specular and Lambertian reflections captured by an active-illumination-based imaging system, and for the impact of target and background thermal emission, was reported on at the 2007 SPIE Defense and Security Symposium (Orlando). Further development of this tool that includes a MODTRAN-based atmospheric attenuation calculator and advanced system architecture configuration inputs that allow for straightforward performance analysis of active or passive systems based on scanning (single- or line-array detector element(s)) or staring (focal-plane-array detector elements) imaging architectures was reported on at the 2011 SPIE Europe Security and Defence Symposium (Prague). This paper provides a comprehensive review of a newly enhanced MMW and SMMW/THz imaging system analysis and design tool that now includes an improved noise sub-model for more accurate and reliable performance predictions, the capability to account for post-capture image contrast enhancement, and the capability to account for concealment material backscatter with active-illumination-based systems. Present plans for additional expansion of the model's predictive capabilities are also outlined.
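
    The atmospheric-attenuation term such a model must capture can be sketched under the usual assumption of exponential (Beer-Lambert) extinction, with a specific attenuation quoted in dB/km (the values below are illustrative; the ARL model derives its attenuation figures from MODTRAN):

```python
# Beer-Lambert style path loss with a specific attenuation in dB/km.
# Illustrative values only; the ARL tool obtains attenuation from MODTRAN.

def transmitted_fraction(atten_db_per_km, path_km):
    """Fraction of power surviving propagation over the path."""
    total_db = atten_db_per_km * path_km
    return 10.0 ** (-total_db / 10.0)

f = transmitted_fraction(3.0, 1.0)  # 3 dB total loss -> about half the power
```

    Because specific attenuation rises steeply through the MMW and SMMW/THz bands, this single term can dominate the achievable range of an imaging system, which is why a MODTRAN-based calculator is built into the tool.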

  6. Development of an Adaptable Display and Diagnostic System for the Evaluation of Tropical Cyclone Forecasts

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Burek, T.; Halley-Gotway, J.

    2015-12-01

    NCAR's Joint Numerical Testbed Program (JNTP) focuses on the evaluation of experimental forecasts of tropical cyclones (TCs), with the goal of developing new research tools and diagnostic evaluation methods that can be transitioned to operations. Recent activities include the development of new TC forecast verification methods and the development of an adaptable TC display and diagnostic system. The next generation display and diagnostic system is being developed to support evaluation needs of the U.S. National Hurricane Center (NHC) and the broader TC research community. The new hurricane display and diagnostic capabilities allow forecasters and research scientists to more deeply examine the performance of operational and experimental models. The system is built upon modern and flexible technology, including platform-independent OpenLayers mapping tools. The forecast track and intensity, along with associated observed track information, are stored in an efficient MySQL database. The system provides an easy-to-use interactive display and diagnostic tools to examine forecast tracks stratified by intensity. Consensus forecasts can be computed and displayed interactively. The system is designed to display information for both real-time and historical TCs. The display configurations are easily adaptable to meet end-user preferences. Ongoing enhancements include improving capabilities for stratification and evaluation of historical best tracks, development and implementation of additional methods to stratify and compute consensus hurricane track and intensity forecasts, and improved graphical display tools. The display is also being enhanced to incorporate gridded forecast, satellite, and sea surface temperature fields. The presentation will provide an overview of the display and diagnostic system development and a demonstration of its current capabilities.

  7. Exploring JWST's Capability to Constrain Habitability on Simulated Terrestrial TESS Planets

    NASA Astrophysics Data System (ADS)

    Tremblay, Luke; Britt, Amber; Batalha, Natasha; Schwieterman, Edward; Arney, Giada; Domagal-Goldman, Shawn; Mandell, Avi; Planetary Systems Laboratory; Virtual Planetary Laboratory

    2017-01-01

    In the following, we have worked to develop a flexible "observability" scale of biologically relevant molecules in the atmospheres of newly discovered exoplanets for the instruments aboard NASA's next flagship mission, the James Webb Space Telescope (JWST). We sought to create such a scale in order to provide the community with a tool with which to optimize target selection for JWST observations based on detections by the upcoming Transiting Exoplanet Survey Satellite (TESS). Current literature has laid the groundwork for defining both biologically relevant molecules as well as what characteristics would make a new world "habitable", but it has so far lacked a cohesive analysis of JWST's capabilities to observe these molecules in exoplanet atmospheres and thereby constrain habitability. In developing our Observability Scale, we utilized a range of hypothetical planets (over planetary radii and stellar insolation) and generated three self-consistent atmospheric models (of different molecular compositions) for each of our simulated planets. With these planets and their corresponding atmospheres, we utilized the most accurate JWST instrument simulator, created specifically to process transiting exoplanet spectra. Through careful analysis of these simulated outputs, we were able to determine the relevant parameters that affected JWST's ability to constrain each individual molecular band with statistical accuracy and therefore generate a scale based on those key parameters. As a preliminary test of our Observability Scale, we have also applied it to the list of TESS candidate stars in order to determine JWST's observational capabilities for any soon-to-be-detected planet in those solar systems.

  8. Global Snow from Space: Development of a Satellite-based, Terrestrial Snow Mission Planning Tool

    NASA Astrophysics Data System (ADS)

    Forman, B. A.; Kumar, S.; LeMoigne, J.; Nag, S.

    2017-12-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary - or perhaps contradictory - information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  9. Towards the Development of a Global, Satellite-based, Terrestrial Snow Mission Planning Tool

    NASA Technical Reports Server (NTRS)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical orbital configuration. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: 1. What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? 2. How might observations be coordinated (in space and time) to maximize utility? 3. What is the additional utility associated with an additional observation? 4. How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  10. Towards the Development of a Global, Satellite-Based, Terrestrial Snow Mission Planning Tool

    NASA Technical Reports Server (NTRS)

    Forman, Bart; Kumar, Sujay; Le Moigne, Jacqueline; Nag, Sreeja

    2017-01-01

    A global, satellite-based, terrestrial snow mission planning tool is proposed to help inform experimental mission design with relevance to snow depth and snow water equivalent (SWE). The idea leverages the capabilities of NASA's Land Information System (LIS) and the Tradespace Analysis Tool for Constellations (TAT-C) to harness the information content of Earth science mission data across a suite of hypothetical sensor designs, orbital configurations, data assimilation algorithms, and optimization and uncertainty techniques, including cost estimates and risk assessments of each hypothetical permutation. One objective of the proposed observing system simulation experiment (OSSE) is to assess the complementary or perhaps contradictory information content derived from the simultaneous collection of passive microwave (radiometer), active microwave (radar), and LIDAR observations from space-based platforms. The integrated system will enable a true end-to-end OSSE that can help quantify the value of observations based on their utility towards both scientific research and applications as well as to better guide future mission design. Science and mission planning questions addressed as part of this concept include: What observational records are needed (in space and time) to maximize terrestrial snow experimental utility? How might observations be coordinated (in space and time) to maximize this utility? What is the additional utility associated with an additional observation? How can future mission costs be minimized while ensuring Science requirements are fulfilled?

  11. An Introduction to Exoplanets and the Kepler Mission

    NASA Technical Reports Server (NTRS)

    Lissauer, Jack

    2014-01-01

    A quarter century ago, the only planets known to humanity were the familiar objects that orbit our Sun. But improved observational techniques allowed astronomers to begin detecting planets around other stars in the 1990s. The first extrasolar planets (often referred to as exoplanets) to be discovered were quite exotic and unfamiliar objects. Most were giant objects that are hundreds of times as massive as the Earth and orbit so close to their star that they are hotter than pizza ovens. But as observational capabilities improved, smaller and cooler planets were found. The most capable planet-hunting tool developed to date is NASA's Kepler telescope, which was launched in 2009. Kepler has found that planets similar in size to our Earth are quite abundant within our galaxy. Results of Kepler's research will be summarized and placed into context within the new and growing discipline of exoplanet studies.

  12. James Webb Space Telescope (JWST) and Star Formation

    NASA Technical Reports Server (NTRS)

    Greene, Thomas P.

    2010-01-01

    The 6.5-m aperture James Webb Space Telescope (JWST) will be a powerful tool for studying and advancing numerous areas of astrophysics. Its Fine Guidance Sensor, Near-Infrared Camera, Near-Infrared Spectrograph, and Mid-Infrared Instrument will be capable of making very sensitive, high angular resolution imaging and spectroscopic observations spanning 0.7 to 28 μm in wavelength. These capabilities are very well suited for probing the conditions of star formation in the distant and local Universe. Indeed, JWST has been designed to detect first light objects as well as to study the fine details of jets, disks, chemistry, envelopes, and the central cores of nearby protostars. We will be able to use its cameras, coronagraphs, and spectrographs (including multi-object and integral field capabilities) to study many aspects of star-forming regions throughout the galaxy, the Local Group, and more distant regions. I will describe the basic JWST scientific capabilities and illustrate a few ways they can be applied to star formation issues and conditions, with a focus on Galactic regions.

  13. Guidance for Mitigating Environmental Concerns During Range Siting

    DTIC Science & Technology

    2006-12-01

    facilities, processing plants, and landfills. CERCLA authorizes two kinds of response actions: (1) Short-term removals, where actions may be taken to...on a slightly higher elevation than the positions to minimize earthwork. Avoid siting berms on crests of high hills, mesas, or ridges, as these...Observing existing vegetation on potential sites is also an important assessment tool. Soils with thick vegetative cover are capable of supporting

  14. Implementation of structure-mapping inference by event-file binding and action planning: a model of tool-improvisation analogies.

    PubMed

    Fields, Chris

    2011-03-01

    Structure-mapping inferences are generally regarded as dependent upon relational concepts that are understood and expressible in language by subjects capable of analogical reasoning. However, tool-improvisation inferences are executed by members of a variety of non-human primate and other species. Tool improvisation requires correctly inferring the motion and force-transfer affordances of an object; hence tool improvisation requires structure mapping driven by relational properties. Observational and experimental evidence can be interpreted to indicate that structure-mapping analogies in tool improvisation are implemented by multi-step manipulation of event files by binding and action-planning mechanisms that act in a language-independent manner. A functional model of language-independent event-file manipulations that implement structure mapping in the tool-improvisation domain is developed. This model provides a mechanism by which motion and force representations commonly employed in tool-improvisation structure mappings may be sufficiently reinforced to be available to inwardly directed attention and hence conceptualization. Predictions and potential experimental tests of this model are outlined.

  15. JOVIAL J73 Automated Verification System - Study Phase

    DTIC Science & Technology

    1980-08-01

    capabilities for the tool, and the high-level design of the tool are also described. Future capabilities for the tool are identified...Both JOVIAL languages are primarily designed for command and control system programming. They are especially well suited to large systems requiring

  16. Behavioral and functional strategies during tool use tasks in bonobos.

    PubMed

    Bardo, Ameline; Borel, Antony; Meunier, Hélène; Guéry, Jean-Pascal; Pouydebat, Emmanuelle

    2016-09-01

    Different primate species have developed extensive capacities for grasping and manipulating objects. However, the manual abilities of primates remain poorly known from a dynamic point of view. The aim of the present study was to quantify the functional and behavioral strategies used by captive bonobos (Pan paniscus) during tool use tasks. The study was conducted on eight captive bonobos, which we observed during two tool use tasks: food extraction from a large piece of wood and food recovery from a maze. We focused on grasping postures, in-hand movements, the sequences of grasp postures used, which have not been studied in bonobos, and the kinds of tools selected. Bonobos used a great variety of grasping postures during both tool use tasks. They were capable of in-hand movement, demonstrated complex sequences of contacts, and showed more dynamic manipulation during the maze task than during the extraction task. They arrived at the location of the task with the tool already modified and used different kinds of tools according to the task. We also observed individual manual strategies. Bonobos were thus able to develop in-hand movements similar to those of humans and chimpanzees, demonstrated dynamic manipulation, and responded to task constraints by selecting and modifying tools appropriately, usually before they started the tasks. These results show the necessity of quantifying object manipulation in different species to better understand their real manual specificities, which is essential for reconstructing the evolution of primate manual abilities. © 2016 Wiley Periodicals, Inc.

  17. Transitioning Human, Social, Cultural Behavior (HSCB) Models and Simulations to the Operational User1

    DTIC Science & Technology

    2009-10-01

    current M&S covering support to operations, representation of human behavior, asymmetric warfare, defense against terrorism and...methods, tools, data, intellectual capital, and processes to address these capability requirements. Fourth, there is a need to compare capability...requirements to current capabilities to identify gaps that may be addressed with DoD HSCB methods, tools, data, intellectual capital, and process

  18. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  19. Utilization of extended bayesian networks in decision making under uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Eeckhout, Edward M; Leishman, Deborah A; Gibson, William L

    2009-01-01

    A Bayesian network tool (called IKE, for Integrated Knowledge Engine) has been developed to assess the probability of undesirable events. The tool allows indications and observables from sensors and/or intelligence to feed directly into hypotheses of interest, thus allowing one to quantify the probability and uncertainty of these events resulting from very disparate evidence. For example, the probability that a facility is processing nuclear fuel or assembling a weapon can be assessed by examining the processes required, establishing the observables that should be present, then assembling information from intelligence, sensors, and other information sources related to the observables. IKE also has the capability to determine tasking plans, that is, prioritize which observable should be collected next to most quickly ascertain the 'true' state and drive the probability toward 'zero' or 'one.' This optimization capability is called 'evidence marshaling.' One example to be discussed is a denied facility monitoring situation; there is concern that certain processes are being executed at the site (due to some intelligence or other data). We will show how additional pieces of evidence then ascertain, with some degree of certainty, the likelihood of these processes as each piece of evidence is obtained. This example shows how both intelligence and sensor data can be incorporated into the analysis. A second example involves real-time perimeter security. For this demonstration we used seismic, acoustic, and optical sensors linked back to IKE. We show how these sensors identified and assessed the likelihood of 'intruder' versus friendly vehicles.

  20. Dream project: Applications of earth observations to disaster risk management

    NASA Astrophysics Data System (ADS)

    Dyke, G.; Gill, S.; Davies, R.; Betorz, F.; Andalsvik, Y.; Cackler, J.; Dos Santos, W.; Dunlop, K.; Ferreira, I.; Kebe, F.; Lamboglia, E.; Matsubara, Y.; Nikolaidis, V.; Ostoja-Starzewski, S.; Sakita, M.; Verstappen, N.

    2011-01-01

    The field of disaster risk management is relatively new and takes a structured approach to managing uncertainty related to the threat of natural and man-made disasters. Disaster risk management consists primarily of risk assessment and the development of strategies to mitigate disaster risk. This paper will discuss how increasing both Earth observation data and information technology capabilities can contribute to disaster risk management, particularly in Belize. The paper presents the results and recommendations of a project conducted by an international and interdisciplinary team of experts at the 2009 session of the International Space University in NASA Ames Research Center (California, USA). The aim is to explore the combination of current, planned and potential space-aided, airborne, and ground-based Earth observation tools, the emergence of powerful new web-based and mobile data management tools, and how this combination can support and improve the emerging field of disaster risk management. The starting point of the project was the World Bank's Comprehensive Approach to Probabilistic Risk Assessment (CAPRA) program, focused in Central America. This program was used as a test bed to analyze current space technologies used in risk management and develop new strategies and tools to be applied in other regions around the world.

  1. 78 FR 73872 - Agency Information Collection Activities: Proposed Collection; Comment Request; Logistics...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-09

    ..., and tribal entities to evaluate their current disaster logistics readiness, identify areas for...; Logistics Capability Assistance Tool (LCAT) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice... Reduction Act of 1995, this notice seeks comments concerning the Logistics Capability Assistance Tool (LCAT...

  2. CADMIO: computer aided design for medical information objects.

    PubMed

    Minarelli, D V; Ferri, F; Pisanelli, D M; Ricci, F L; Tittarelli, F

    1995-01-01

    The growth of computational capability and of graphic software tools, nowadays available in an integrated manner within development environments, permits the realization of tool kits capable of handling complex information of different kinds, such as typical medical information. This has given great impulse to the creation of electronic medical folders, which join new and stimulating functionality to the usual paper document [1]. In the present work, we propose a tool capable of defining a multimedia electronic medical folder and representing its architecture through a layout formed on the basis of the particular data types to be handled. This tool is capable of providing an integrated view of data that, even though close in a cognitive sense, are often stored and represented separately in practice. Different approaches to browsing are given within the system, so the user can personalize the way of viewing the information stored in the folder or let the system guide the browsing.
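
A layout driven by the data types in the folder, as described above, can be sketched with a simple mapping from each item's type to a viewer component. The class names, data types, and viewer names below are hypothetical, invented for illustration rather than taken from CADMIO.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a folder whose layout is derived from the data types
# it contains (all names here are hypothetical, not CADMIO's actual design).

@dataclass
class FolderItem:
    name: str
    data_type: str      # e.g. "text", "image", "signal"
    content: object = None

@dataclass
class MedicalFolder:
    patient_id: str
    items: list = field(default_factory=list)

    # Map each data type to a viewer component so the layout follows the data.
    VIEWERS = {"text": "RichTextPane", "image": "ImageViewer", "signal": "WaveformPlot"}

    def layout(self):
        """Return one (item name, viewer) pair per item, grouping items of the
        same data type side by side instead of in separate applications."""
        return [(item.name, self.VIEWERS.get(item.data_type, "GenericPane"))
                for item in sorted(self.items, key=lambda i: i.data_type)]

folder = MedicalFolder("patient-001", [
    FolderItem("discharge summary", "text"),
    FolderItem("chest x-ray", "image"),
    FolderItem("ECG trace", "signal"),
])
print(folder.layout())
```

The point of the sketch is the indirection: the folder definition never hard-codes a screen layout, only data types, so an integrated view falls out of the type-to-viewer mapping.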

  3. Who uses NASA Earth Science Data? Connecting with Users through the Earthdata website and Social Media

    NASA Astrophysics Data System (ADS)

    Wong, M. M.; Brennan, J.; Bagwell, R.; Behnke, J.

    2015-12-01

    This poster will introduce and explore the various social media efforts, monthly webinar series and a redesigned website (https://earthdata.nasa.gov) established by National Aeronautics and Space Administration's (NASA) Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It is comprised of twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access client (Reverb and Earthdata Search), dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative and a host of other discipline specific data discovery, data access, data subsetting and visualization tools. We have embarked on these efforts to reach out to new audiences and potential new users and to engage our diverse end user communities world-wide. One of the key objectives is to increase awareness of the breadth of Earth science data information, services, and tools that are publicly available while also highlighting how these data and technologies enable scientific research.

  4. EBUS-STAT Subscore Analysis to Predict the Efficacy and Assess the Validity of Virtual Reality Simulation for EBUS-TBNA Training Among Experienced Bronchoscopists.

    PubMed

    Scarlata, Simone; Palermo, Patrizio; Candoli, Piero; Tofani, Ariela; Petitti, Tommasangelo; Corbetta, Lorenzo

    2017-04-01

    Linear endobronchial ultrasound transbronchial needle aspiration (EBUS-TBNA) represents a pivotal innovation in interventional pulmonology; determining the best approach to guarantee systematic and efficient training is expected to become a main issue in the forthcoming years. Virtual reality simulators have been proposed as potential EBUS-TBNA training instruments, to avoid unskilled beginners practicing directly in real-life settings. A validated and perfected simulation program could be used before allowing beginners to practice on patients. Our goal was to test the reliability of the EBUS-Skills and Task Assessment Tool (STAT) and its subscores for measuring the competence of experienced bronchoscopists approaching EBUS-guided TBNA, using only the virtual reality simulator as both a training and an assessment tool. Fifteen experienced bronchoscopists, with poor or no experience in EBUS-TBNA, participated in this study. They were all administered the Italian version of the EBUS-STAT evaluation tool, during a high-fidelity virtual reality simulation. This was followed by a single 7-hour theoretical and practical (on simulators) session on EBUS-TBNA, at the end of which their skills were reassessed by EBUS-STAT. An overall, significant improvement in EBUS-TBNA skills was observed, thereby confirming that (a) virtual reality simulation can facilitate practical learning among practitioners, and (b) EBUS-STAT is capable of detecting these improvements. The test's overall ability to detect differences was negatively influenced by the minimal variation of the scores relating to items 1 and 2, was not influenced by the training, and improved significantly when the 2 items were not considered. Apart from these 2 items, all the remaining subscores were equally capable of revealing improvements in the learner. 
Lastly, we found that trainees with presimulation EBUS-STAT scores above 79 did not show any significant improvement after virtual reality training, suggesting that this score represents a cutoff value capable of predicting the likelihood that simulation can be beneficial. Virtual reality simulation is capable of providing a practical learning tool for practitioners with previous experience in flexible bronchoscopy, and the EBUS-STAT questionnaire is capable of detecting these changes. A pretraining EBUS-STAT score below 79 is a good indicator of those candidates who will benefit from the simulation training. Further studies are needed to verify whether a modified version of the questionnaire would be capable of improving its performance among experienced bronchoscopists.

  5. Invasive Species Forecasting System: A Decision Support Tool for the U.S. Geological Survey: FY 2005 Benchmarking Report v.1.6

    NASA Technical Reports Server (NTRS)

    Stohlgren, Tom; Schnase, John; Morisette, Jeffrey; Most, Neal; Sheffner, Ed; Hutchinson, Charles; Drake, Sam; Van Leeuwen, Willem; Kaupp, Verne

    2005-01-01

    The National Institute of Invasive Species Science (NIISS), through collaboration with NASA's Goddard Space Flight Center (GSFC), recently began incorporating NASA observations and predictive modeling tools to fulfill its mission. These enhancements, labeled collectively as the Invasive Species Forecasting System (ISFS), are now in place in the NIISS in their initial state (V1.0). The ISFS is the primary decision support tool of the NIISS for the management and control of invasive species on Department of Interior and adjacent lands. The ISFS is the backbone for a unique information services line-of-business for the NIISS, and it provides the means for delivering advanced decision support capabilities to a wide range of management applications. This report describes the operational characteristics of the ISFS, a decision support tool of the United States Geological Survey (USGS). Recent enhancements to the performance of the ISFS, attained through the integration of observations, models, and systems engineering from the NASA are benchmarked; i.e., described quantitatively and evaluated in relation to the performance of the USGS system before incorporation of the NASA enhancements. This report benchmarks Version 1.0 of the ISFS.

  6. The IUE Science Operations Ground System

    NASA Technical Reports Server (NTRS)

    Pitts, Ronald E.; Arquilla, Richard

    1994-01-01

    The International Ultraviolet Explorer (IUE) Science Operations System provides full realtime operations capabilities and support to the operations staff and astronomer users. The components of this very diverse and extremely flexible hardware and software system have played a major role in maintaining the scientific efficiency and productivity of the IUE. The software provides the staff and user with all the tools necessary for pre-visit and real-time planning and operations analysis for any day of the year. Examples of such tools include the effects of spacecraft constraints on target availability, maneuver times between targets, availability of guide stars, target identification, coordinate transforms, e-mail transfer of Observatory forms and messages, and quick-look analysis of image data. Most of this extensive software package can also be accessed remotely by individual users for information, scheduling of shifts, pre-visit planning, and actual observing program execution. Astronomers, with a modest investment in hardware and software, may establish remote observing sites. We currently have over 20 such sites in our remote observers' network.

  7. The Auroral Planetary Imaging and Spectroscopy (APIS) service

    NASA Astrophysics Data System (ADS)

    Lamy, L.; Prangé, R.; Henry, F.; Le Sidaner, P.

    2015-06-01

    The Auroral Planetary Imaging and Spectroscopy (APIS) service, accessible online, provides open and interactive access to processed auroral observations of the outer planets and their satellites. Such observations are of interest for a wide community at the interface between planetology, magnetospheric and heliospheric physics. APIS consists of (i) a high-level database, built from planetary auroral observations acquired by the Hubble Space Telescope (HST) since 1997 with its most used far-ultraviolet spectro-imagers, (ii) a dedicated search interface aimed at browsing this database efficiently through relevant conditional search criteria, and (iii) the ability to work interactively with the data online through plotting tools developed by the Virtual Observatory (VO) community, such as Aladin and Specview. This service is VO compliant and can therefore also be queried by external search tools of the VO community. The diversity of available data and the capability to sort them by relevant physical criteria shall in particular facilitate statistical studies on long-term scales and/or multi-instrument, multi-spectral combined analyses.

  8. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  9. Final report for the endowment of simulator agents with human-like episodic memory LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth; Lippitt, Carl Edward; Thomas, Edward Victor

    This report documents work undertaken to endow the cognitive framework currently under development at Sandia National Laboratories with a human-like memory for specific life episodes. Capabilities have been demonstrated within the context of three separate problem areas. The first year of the project developed a capability whereby simulated robots were able to utilize a record of shared experience to perform surveillance of a building to detect a source of smoke. The second year focused on simulations of social interactions providing a queriable record of interactions such that a time series of events could be constructed and reconstructed. The third year addressed tools to promote desktop productivity, creating a capability to query episodic logs in real time, allowing the model of a user to build on itself based on observations of the user's behavior.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, William I.; Vogelmann, Andrew M.; Cheng, Xiaoping

    The Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility began a pilot project in May 2015 to design a routine, high-resolution modeling capability to complement ARM's extensive suite of measurements. This modeling capability has been named the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project. The initial focus of LASSO is on shallow convection at the ARM Southern Great Plains (SGP) Climate Research Facility. The availability of LES simulations with concurrent observations will serve many purposes. LES helps bridge the scale gap between DOE ARM observations and models, and the use of routine LES adds value to observations. It provides a self-consistent representation of the atmosphere and a dynamical context for the observations. Further, it elucidates unobservable processes and properties. LASSO will generate a simulation library for researchers that enables statistical approaches beyond a single-case mentality. It will also provide tools necessary for modelers to reproduce the LES and conduct their own sensitivity experiments. Many different uses are envisioned for the combined LASSO LES and observational library. For an observationalist, LASSO can help inform instrument remote sensing retrievals, conduct Observation System Simulation Experiments (OSSEs), and test implications of radar scan strategies or flight paths. For a theoretician, LASSO will help calculate estimates of fluxes and co-variability of values, and test relationships without having to run the model oneself. For a modeler, LASSO will help one know ahead of time which days have good forcing, have co-registered observations at high-resolution scales, and have simulation inputs and corresponding outputs to test parameterizations. Further details on the overall LASSO project are available at https://www.arm.gov/capabilities/modeling/lasso.

  11. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used to satisfy the information needs of software maintainers. Especially in the case of maintaining large-scale legacy systems, tool support is essential. Reverse engineering tools offer various kinds of capabilities for delivering the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  12. Spallation and fracture resulting from reflected and intersecting stress waves.

    NASA Technical Reports Server (NTRS)

    Kinslow, R.

    1973-01-01

    Discussion of the effects of stress waves produced in solids by explosions or high-velocity impacts. These waves rebound from free surfaces in the form of tensile waves that are capable of causing internal fractures or spallation of the material. The high-speed framing camera is shown to be an important tool for observing stress waves and fracture in transparent targets, and its photographs provide valuable information on the mechanics of fracture.

  13. Didactic Time, Epistemic Gain and Consistent Tool: Taking Care of Teachers' Needs for Classroom Use of CAS--A Reaction to Barzel's "New Technology? New Ways of Teaching--No Time Left for That!"

    ERIC Educational Resources Information Center

    Lagrange, Jean-Baptiste

    2007-01-01

    The relationship between teachers and Computer Algebra Systems is generally problematic. The extensive capabilities of CAS provide opportunities for learning but also bring a new complexity that makes it difficult for teachers to take advantage of these opportunities. Barzel's paper contrasts with this observation: in a "Lernwerkstatt"…

  14. Enabling Research Tools for Sustained Climate Assessment

    NASA Technical Reports Server (NTRS)

    Leidner, Allison K.; Bosilovich, Michael G.; Jasinski, Michael F.; Nemani, Ramakrishna R.; Waliser, Duane Edward; Lee, Tsengdar J.

    2016-01-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  15. Rapid Prototyping: State of the Art Review

    DTIC Science & Technology

    2003-10-23

    Steel, H13 Tool Steel, CP Ti, Ti-6Al-4V Titanium, Tungsten, Copper, Aluminum, Nickel...The company's LENS 750 and LENS 850 machines (both $440,000 to $640,000) are capable of producing parts in 16 stainless steel, H13 tool steel...machining. The Arcam EBM S12 model sells for $500,000 and is capable of processing two materials. One is H13 tool steel, while the other

  16. Certified Satisfiability Modulo Theories (SMT) Solving for System Verification

    DTIC Science & Technology

    2017-01-01

    the compositionality of trustworthiness is also a critical capability: tools must be able to trust and use the results of other tools. One approach for...multiple reasoners to work together. ...level of confidence in the results returned by the underlying SMT solver. Unfortunately, obtaining the high level of trust required for, e.g., safety

  17. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers during design and validation. 3D visualization and motion simulation, plus finite element analysis (FEA), give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers currently use IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis examines only the displacement, velocity, and acceleration of a mechanism, without considering the effects of component masses. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, the software tools were compared in the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. A Propulsion Test Article (PTA) with a Fastrac engine model exported from IGRIP and an office chair mechanism were used as example simulations.
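The kinematics-versus-dynamics distinction in the abstract above can be made concrete with a short sketch (a toy motion profile and an assumed mass, not taken from any of the tools compared): kinematic analysis yields displacement, velocity, and acceleration alone, while the dynamic step adds the mass-dependent driving force.

```python
import numpy as np

# Toy motion profile: x(t) = 0.5 * sin(2*pi*t) metres over one second.
t = np.linspace(0.0, 1.0, 1001)
x = 0.5 * np.sin(2 * np.pi * t)

# Kinematics: differentiate the motion profile numerically.
v = np.gradient(x, t)          # velocity (m/s)
a = np.gradient(v, t)          # acceleration (m/s^2)

# Dynamics: Newton's second law brings the component mass into play.
m = 2.0                        # assumed mass of the moving body (kg)
f = m * a                      # driving force (N) -- invisible to pure kinematics

print(f"peak velocity     = {v.max():.2f} m/s")
print(f"peak acceleration = {a.max():.2f} m/s^2")
print(f"peak force        = {f.max():.2f} N")
```

The same motion profile thus answers two different questions: the kinematic pass describes how the body moves, and only the dynamic pass, which needs the mass, says what load the joint must carry.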

  18. XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò

    2017-08-01

We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast yet realistic observation simulations, given two basic inputs: (i) an arbitrary source model including morphological, temporal, spectral, and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function, and the modulation factor. The format of the response files is OGIP compliant, and the framework can produce output files that can be fed directly into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it useful not only for simulating physical systems but also for developing and testing end-to-end analysis chains.
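The role of the modulation factor mentioned above can be illustrated with a standalone NumPy sketch (this is not the XIMPOL API; the instrument and source values are invented): photon azimuthal angles follow a modulation curve N(phi) ∝ 1 + mu·P·cos(2(phi − phi0)), and the polarization degree and angle can be recovered from Stokes-like estimators.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed instrument/source values: modulation factor mu, polarization
# degree P, polarization angle phi0.
mu, P, phi0 = 0.3, 0.5, np.deg2rad(30.0)

# Rejection-sample photon azimuthal angles from the modulation curve
# N(phi) proportional to 1 + mu*P*cos(2*(phi - phi0)) on [0, pi).
n = 200_000
phi = rng.uniform(0.0, np.pi, 4 * n)
keep = rng.uniform(0.0, 1.0 + mu * P, phi.size) < 1.0 + mu * P * np.cos(2 * (phi - phi0))
phi = phi[keep][:n]

# Recover polarization from standard Stokes-like estimators.
q = 2.0 * np.mean(np.cos(2 * phi))
u = 2.0 * np.mean(np.sin(2 * phi))
p_hat = np.hypot(q, u) / mu               # estimated polarization degree
phi0_hat = 0.5 * np.arctan2(u, q)         # estimated polarization angle

print(f"P ≈ {p_hat:.3f}, phi0 ≈ {np.degrees(phi0_hat):.1f} deg")
```

Dividing by the modulation factor is the key step: it converts the measured modulation amplitude into the source's polarization degree, which is why the modulation factor is one of the response functions a polarimetric simulator must carry.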

  19. Integrating thematic web portal capabilities into the NASA Earthdata Web Infrastructure

    NASA Astrophysics Data System (ADS)

    Wong, M. M.; McLaughlin, B. D.; Huang, T.; Baynes, K.

    2015-12-01

The National Aeronautics and Space Administration (NASA) acquires and distributes an abundance of Earth science data on a daily basis to a diverse user community worldwide. To assist the scientific community and general public in achieving a greater understanding of the interdisciplinary nature of Earth science and of key environmental and climate change topics, the NASA Earthdata web infrastructure is integrating new methods of presenting and providing access to Earth science information, data, research and results. This poster will present the process of integrating thematic web portal capabilities into the NASA Earthdata web infrastructure, with examples from the Sea Level Change Portal. The Sea Level Change Portal will be a source of current NASA research, data and information regarding sea level change. The portal will provide sea level change information through articles, graphics, videos and animations, an interactive tool to view and access sea level change data and a dashboard showing sea level change indicators. Earthdata is a part of the Earth Observing System Data and Information System (EOSDIS) project. EOSDIS is a key core capability in NASA's Earth Science Data Systems Program. It provides end-to-end capabilities for managing NASA's Earth science data from various sources - satellites, aircraft, field measurements, and various other programs. It comprises twelve Distributed Active Archive Centers (DAACs), Science Computing Facilities (SCFs), data discovery and service access clients (Reverb and Earthdata Search), a dataset directory (Global Change Master Directory - GCMD), near real-time data (Land Atmosphere Near real-time Capability for EOS - LANCE), Worldview (an imagery visualization interface), Global Imagery Browse Services, the Earthdata Code Collaborative and a host of other discipline-specific data discovery, data access, data subsetting and visualization tools.

  20. Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Sheng; Santamarina, J. Carlos

Reservoir characterization and simulation require reliable parameters to anticipate the response and production rates of hydrate deposits. Acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, all of which are challenged by testing capabilities and innate sampling disturbances. The project reviews hydrate-bearing sediments, their properties, and inherent sampling effects, albeit lessened by developments in pressure core technology, in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that recognizes past developments and characterization experience and draws inspiration from nature and sensor miniaturization.

  1. Intelligent On-Board Processing in the Sensor Web

    NASA Astrophysics Data System (ADS)

    Tanner, S.

    2005-12-01

    Most existing sensing systems are designed as passive, independent observers. They are rarely aware of the phenomena they observe, and are even less likely to be aware of what other sensors are observing within the same environment. Increasingly, intelligent processing of sensor data is taking place in real-time, using computing resources on-board the sensor or the platform itself. One can imagine a sensor network consisting of intelligent and autonomous space-borne, airborne, and ground-based sensors. These sensors will act independently of one another, yet each will be capable of both publishing and receiving sensor information, observations, and alerts among other sensors in the network. Furthermore, these sensors will be capable of acting upon this information, perhaps altering acquisition properties of their instruments, changing the location of their platform, or updating processing strategies for their own observations to provide responsive information or additional alerts. Such autonomous and intelligent sensor networking capabilities provide significant benefits for collections of heterogeneous sensors within any environment. They are crucial for multi-sensor observations and surveillance, where real-time communication with external components and users may be inhibited, and the environment may be hostile. In all environments, mission automation and communication capabilities among disparate sensors will enable quicker response to interesting, rare, or unexpected events. Additionally, an intelligent network of heterogeneous sensors provides the advantage that all of the sensors can benefit from the unique capabilities of each sensor in the network. The University of Alabama in Huntsville (UAH) is developing a unique approach to data processing, integration and mining through the use of the Adaptive On-Board Data Processing (AODP) framework. 
AODP is a key foundation technology for autonomous internetworking capabilities to support situational awareness by sensors and their on-board processes. The two primary research areas for this project are (1) the on-board processing and communications framework itself, and (2) data mining algorithms targeted to the needs and constraints of the on-board environment. The team is leveraging its experience in on-board processing, data mining, custom data processing, and sensor network design. Several unique UAH-developed technologies are employed in the AODP project, including EVE, an EnVironmEnt for on-board processing, and the data mining tools included in the Algorithm Development and Mining (ADaM) toolkit.

  2. Specification and Error Pattern Based Program Monitoring

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Johnson, Scott; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2001-01-01

We briefly present Java PathExplorer (JPAX), a tool developed at NASA Ames for monitoring the execution of Java programs. JPAX can be used not only during program testing to reveal subtle errors, but also during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program in order to properly observe its execution. The instrumentation can be performed either at the bytecode level or at the source level when the source code is available. JPAX is an instance of a more general project, called PathExplorer (PAX), which is a basis for experiments rather than a fixed system, capable of monitoring various programming languages and of experimenting with other logics and analysis techniques.

  3. Sensitivity Observing System Experiment (SOSE)-a new effective NWP-based tool in designing the global observing system

    NASA Astrophysics Data System (ADS)

    Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan

    2008-03-01

Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). For example, since the GOS is non-uniform, it is of interest to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), SOSE can be applied to real extreme events that were badly forecast operationally, and it only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS. These observation requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future space-borne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.
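The core SOSE idea, defining a pseudo-true state, simulating the prospective instrument against it, and measuring the added value when its synthetic observations are combined with the existing network, can be sketched in one dimension (a deliberately simplified toy, not the operational NWP system; the error values and the pointwise update rule are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Pseudo "true" state and a noisy model background (first guess).
n = 100
truth = np.sin(np.linspace(0.0, 2 * np.pi, n))
background = truth + rng.normal(0.0, 0.5, n)

# Existing GOS observes every 10th point; the prospective instrument
# fills the gaps (offset by 5 points), each with its own error.
idx_real = np.arange(0, n, 10)
idx_new = np.arange(5, n, 10)
obs_real = truth[idx_real] + rng.normal(0.0, 0.1, idx_real.size)
obs_new = truth[idx_new] + rng.normal(0.0, 0.2, idx_new.size)

def analyse(bg, idx, obs, sigma_b=0.5, sigma_o=0.15):
    """Pointwise optimal-interpolation-style update at observed locations."""
    x = bg.copy()
    w = sigma_b**2 / (sigma_b**2 + sigma_o**2)   # scalar Kalman-like gain
    x[idx] = bg[idx] + w * (obs - bg[idx])
    return x

control = analyse(background, idx_real, obs_real)        # existing GOS only
with_new = analyse(control, idx_new, obs_new)            # GOS + new instrument

rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
print(f"background RMSE: {rmse(background):.3f}")
print(f"GOS only RMSE:   {rmse(control):.3f}")
print(f"GOS + new RMSE:  {rmse(with_new):.3f}")
```

The drop in RMSE when the synthetic observations of the prospective instrument are added is the toy analogue of SOSE's "potential added value" of a new observing system.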

  4. Chimpanzees routinely fish for algae with tools during the dry season in Bakoun, Guinea.

    PubMed

    Boesch, Christophe; Kalan, Ammie K; Agbor, Anthony; Arandjelovic, Mimi; Dieguez, Paula; Lapeyre, Vincent; Kühl, Hjalmar S

    2017-03-01

Wild chimpanzees regularly use tools, made from sticks, leaves, or stone, to find flexible solutions to the ecological challenges of their environment. Nevertheless, some studies suggest strong limitations in the tool-using capabilities of chimpanzees. In this context, we present the discovery of a newly observed tool-use behavior in a population of chimpanzees (Pan troglodytes verus) living in the Bakoun Classified Forest, Guinea, where a temporary research site was established for 15 months. Bakoun chimpanzees of every age-sex class were observed to fish for freshwater green algae, Spirogyra sp., from rivers, streams, and ponds using long sticks and twigs, ranging from 9 cm up to 4.31 m in length. Using remote camera trap footage from 11 different algae fishing sites within an 85-km² study area, we found that algae fishing occurred frequently during the dry season and was non-existent during the rainy season. Chimpanzees were observed algae fishing for as little as 1 min to just over an hour, with an average duration of 9.09 min. We estimate that 364 g of Spirogyra algae could be retrieved in this time, based on human trials in the field. Only one other chimpanzee population, living in Bossou, Guinea, has been described to customarily scoop algae from the surface of the water using primarily herbaceous tools. Here, we describe the new behavior found at Bakoun and compare it to the algae scooping observed in Bossou chimpanzees and the occasional variant reported in Odzala, Republic of the Congo. As these algae are reported to be high in protein, carbohydrates, and minerals, we hypothesize that chimpanzees are obtaining a nutritional benefit from this seasonally available resource. © 2016 Wiley Periodicals, Inc.

  5. An optimized method to calculate error correction capability of tool influence function in frequency domain

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Hou, Xi; Wan, Yongjian; Shi, Chunyan

    2017-10-01

An optimized method to calculate the error correction capability of a tool influence function (TIF) under given polishing conditions is proposed, based on a smoothing spectral function. The basic mathematical model for the method is established in theory. A set of polishing experimental data obtained with a rigid conformal tool is used to validate the optimized method. The calculated results quantitatively indicate the error correction capability of the TIF for different spatial frequency errors under the given polishing conditions. A comparative analysis shows that the optimized method is simpler in form and achieves the same accuracy as the previous method with less computing time.
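The notion of per-spatial-frequency error correction capability can be illustrated with a toy spectral-ratio computation (this is not the paper's smoothing spectral function; the surface profile, the moving-average stand-in for the TIF, and the band split are all assumptions): comparing the surface-error power spectrum before and after a simulated polishing pass shows which spatial frequencies the tool corrects well.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D surface error: low-frequency figure error + fine ripple.
n = 1024
dx = 1.0e-3                                    # assumed sample spacing (m)
x = np.arange(n) * dx
before = (np.sin(2 * np.pi * 5 * x / (n * dx))  # low-frequency figure error
          + 0.2 * rng.normal(size=n))           # high-frequency ripple

# Stand-in for the tool pass: subtract a moving average, so low-frequency
# error is removed efficiently while fine ripple is barely touched.
kernel = np.ones(65) / 65
after = before - np.convolve(before, kernel, mode="same")

# Per-frequency ratio of error power after/before: near 0 means the pass
# corrects that spatial frequency well, near 1 means it barely acts on it.
psd = lambda e: np.abs(np.fft.rfft(e)) ** 2
freq = np.fft.rfftfreq(n, dx)                  # cycles per metre
sf = psd(after) / np.maximum(psd(before), 1e-12)

low = freq < 10.0                              # low-frequency band (cycles/m)
print(f"mean ratio, low  spatial frequencies: {sf[low].mean():.2f}")
print(f"mean ratio, high spatial frequencies: {sf[~low].mean():.2f}")
```

The gap between the two band averages is the kind of quantitative, frequency-resolved statement of correction capability that the optimized method formalizes.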

  6. Multimodality hard-x-ray imaging of a chromosome with nanoscale spatial resolution

    DOE PAGES

    Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth R.; ...

    2016-02-05

Here, we developed a scanning hard x-ray microscope using a new class of x-ray nano-focusing optic called a multilayer Laue lens and imaged a chromosome with nanoscale spatial resolution. The combination of the hard x-ray's superior penetration power, high sensitivity to elemental composition, high spatial resolution, and quantitative analysis creates a unique tool with capabilities that other microscopy techniques cannot provide. Using this microscope, we simultaneously obtained absorption-, phase-, and fluorescence-contrast images of Pt-stained human chromosome samples. The high spatial resolution of the microscope and its multi-modality imaging capabilities enabled us to observe the internal ultra-structures of a thick chromosome without sectioning it.

  7. Multimodality hard-x-ray imaging of a chromosome with nanoscale spatial resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth R.

Here, we developed a scanning hard x-ray microscope using a new class of x-ray nano-focusing optic called a multilayer Laue lens and imaged a chromosome with nanoscale spatial resolution. The combination of the hard x-ray's superior penetration power, high sensitivity to elemental composition, high spatial resolution, and quantitative analysis creates a unique tool with capabilities that other microscopy techniques cannot provide. Using this microscope, we simultaneously obtained absorption-, phase-, and fluorescence-contrast images of Pt-stained human chromosome samples. The high spatial resolution of the microscope and its multi-modality imaging capabilities enabled us to observe the internal ultra-structures of a thick chromosome without sectioning it.

  8. Configuration Analysis Tool (CAT). System Description and users guide (revision 1)

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.

    1982-01-01

    A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.

  9. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    DTIC Science & Technology

    2011-05-10

...concert with existing surveillance applications, or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility... health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular...

  10. Demonstration of CBR Modeling and Simulation Tool (CBRSim) Capabilities. Installation Technology Transfer Program

    DTIC Science & Technology

    2009-04-01

Capabilities. Construction Engineering Research Laboratory. Kathy L. Simunich, Timothy K. Perkins, David M. Bailey, David Brown, and... inversion height in convective conditions is estimated with a one-dimensional model of the atmospheric boundary layer based on the Driedonks slab model... tool and its capabilities. Installation geospatial data, in CAD format, were obtained for select buildings, roads, and topographic features in...

  11. Lunar Extravehicular Activity Program

    NASA Technical Reports Server (NTRS)

    Heartsill, Amy Ellison

    2006-01-01

Extravehicular Activity (EVA) has proven an invaluable tool for space exploration since the inception of the space program. There are situations in which direct human intervention is the best means to evaluate, observe, explore, and troubleshoot space systems, and EVA provides this unique capability. Many aspects of technology are required to enable a "miniature spaceship" to support individuals in a hostile environment while they accomplish these tasks. This includes not only the space suit assembly itself, but also the tools, the design interfaces of the equipment on which EVA crews must work, and the specific vehicles required to transfer humans between habitation areas and the external world. This lunar mission program will require EVA support in three primary areas. The first is orbital-stage, or micro-gravity, EVA, which includes Low Earth Orbit (LEO), transfer, and lunar orbit EVA. The second is Lunar Lander EVA capability, i.e., lunar surface EVA, which carries slightly different requirements from micro-gravity EVA. The third and final area is Lunar Habitat based surface EVA, the final system supporting a long-term presence on the moon.

  12. The New Meteor Radar at Penn State: Design and First Observations

    NASA Technical Reports Server (NTRS)

    Urbina, J.; Seal, R.; Dyrud, L.

    2011-01-01

In an effort to provide new and improved meteor radar sensing capabilities, Penn State has been developing advanced instruments and technologies for future meteor radars, with the primary objective of making such instruments more capable and more cost effective in order to study the basic properties of the global meteor flux, such as average mass, velocity, and chemical composition. Using low-cost field programmable gate arrays (FPGAs) combined with open-source software tools, we describe a design methodology enabling one to develop state-of-the-art radar instrumentation by building a generalized instrumentation core that can be customized with specialized output-stage hardware. Furthermore, using object-oriented programming (OOP) techniques and open-source tools, we illustrate a technique for providing a cost-effective, generalized software framework in which an instrument's functionality is uniquely defined through a customizable interface implemented by the designer. The new instrument is intended to provide instantaneous profiles of atmospheric parameters and climatology on a daily basis throughout the year. An overview of the instrument design concepts and some of the emerging technologies developed for this meteor radar are presented.

  13. Improvement of Thermal Interruption Capability in Self-blast Interrupting Chamber for New 245kV-50kA GCB

    NASA Astrophysics Data System (ADS)

    Shinkai, Takeshi; Koshiduka, Tadashi; Mori, Tadashi; Uchii, Toshiyuki; Tanaka, Tsutomu; Ikeda, Hisatoshi

Current zero measurements are performed for 245kV-50kA-60Hz short line fault (L90) interruption tests with a self-blast interrupting chamber (double volume system) that has interrupting capability up to 245kV-50kA-50Hz L90. Lower L90 interruption capability is observed for longer arcing times, even though a very high pressure rise is obtained; this may be caused by higher blowing temperature and lower blowing density at longer arcing times. Interruption criteria and an optimization method for the chamber design are discussed to improve its L90 interruption capability. New chambers are designed at 245kV-50kA-60Hz to improve gas density in the thermal volume for long arcing times, and 245kV-50kA-60Hz L90 interruptions are performed with the new chamber. The suggested optimization method is an efficient tool for self-blast interrupting chamber design, although further study of computing methods is required to calculate arc conductance around current zero, a direct criterion for L90 interruption capability, with higher accuracy.

  14. Supporting New Missions by Observing Simulation Experiments in WACCM-X/GEOS-5 and TIME-GCM: Initial Design, Challenges and Perspectives

    NASA Astrophysics Data System (ADS)

    Yudin, V. A.; England, S.; Liu, H.; Solomon, S. C.; Immel, T. J.; Maute, A. I.; Burns, A. G.; Foster, B.; Wu, Q.; Goncharenko, L. P.

    2013-12-01

We examine the capability of novel configurations of community models, WACCM-X and TIME-GCM, to support current and forthcoming space-borne missions to monitor the dynamics and composition of the Mesosphere-Thermosphere-Ionosphere (MTI) system. In these configurations the lower atmosphere of WACCM-X is constrained by operational analyses and/or short-term forecasts provided by the Goddard Earth Observing System (GEOS-5) of the Global Modeling and Assimilation Office at NASA/GSFC. With the terrestrial weather of GEOS-5 and updated model physics, the MTI simulations are capable of reproducing observed signatures of the perturbed wave dynamics and ion-neutral coupling during recent stratospheric warming events, as well as short-term, annual, and year-to-year variability of prevailing flows, planetary waves, tides, and composition. These 'terrestrial-weather'-driven simulations with day-to-day variable solar and geomagnetic inputs can provide the background state (first guess) and error statistics for the inverse algorithms of the new NASA missions ICON and GOLD at the locations and times of observations in the MTI region. With their two different viewing geometries (sun-synchronous and geostationary), the ICON and GOLD instruments will provide complementary global observations of temperature, winds, and constituents to constrain first-principles forecast models. This paper discusses the initial design of Observing Simulation Experiments (OSE) in WACCM-X/GEOS-5 and TIME-GCM. OSEs represent an excellent learning tool for designing and evaluating the observing capabilities of novel sensors, and they can guide how to integrate and combine information from different instruments. The choice of assimilation schemes and of forecast and observational errors will be discussed, along with challenges and perspectives in constraining fast-varying tidal dynamics and their effects in models by combining the sun-synchronous and geostationary observations of ICON and GOLD.
We will also discuss how correlative space-borne and ground-based observations can verify OSE results in the observable and non-observable regions of the MTI.

  15. Tool for Two Types of Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Carter, Robert

    2006-01-01

A tool that would be usable in both conventional and self-reacting friction stir welding (FSW) has been proposed. The tool would embody both a prior tooling concept for self-reacting FSW and an auto-adjustable pin-tool (APT) capability developed previously as an augmentation for conventional FSW. Some definitions of terms are prerequisite to a meaningful description of the proposed tool. In conventional FSW, depicted in Figure 1, one uses a tool that includes (1) a rotating shoulder on top (or front) of the workpiece and (2) a rotating pin that protrudes from the shoulder into the depth of the workpiece. The main axial force exerted by the tool on the workpiece is reacted through a rigid backing anvil under (behind) the workpiece. When conventional FSW is augmented with an APT capability, the depth of penetration of the pin into the workpiece is varied in real time by a position- or force-control system that extends or retracts the pin as needed to obtain the desired effect. In self-reacting (also known as self-reacted) friction stir welding (SR-FSW), there are two rotating shoulders: one on top (or front) and one on the bottom (or back) of the workpiece. In this case, a threaded shaft protrudes from the tip of the pin to beyond the back surface of the workpiece. The back shoulder is held axially in place against tension by a nut on the threaded shaft. The main axial force exerted on the workpiece by the tool and front shoulder is reacted through the back shoulder and the threaded shaft, back into the FSW machine head, so that a backing anvil is no longer needed. A key transmits torque between the bottom shoulder and the threaded shaft, so that the bottom shoulder rotates with the shaft. A tool for SR-FSW embodying this concept was reported in "Mechanism for Self-Reacted Friction Stir Welding" (MFS-31914), NASA Tech Briefs, Vol. 28, No. 10 (October 2004), page 53.
In its outward appearance, the proposed tool (see Figure 2) would fit the above description of an SR-FSW tool. In this case, the FSW machine would have an APT capability and the pin would be modified to accept a bottom shoulder. The APT capability could be used to vary the distance between the front and back shoulders in real time to accommodate process and workpiece-thickness variations. The tool could readily be converted to a conventional FSW tool, with or without APT capability, by simply replacing the modified pin with a conventional FSW pin.

  16. Ball Bearing Analysis with the ORBIS Tool

    NASA Technical Reports Server (NTRS)

    Halpin, Jacob D.

    2016-01-01

Ball bearing design is critical to the success of aerospace mechanisms. Key bearing performance parameters, such as load capability, stiffness, torque, and life, all depend on accurate determination of the internal load distribution. Hence, a good analytical bearing tool that provides both comprehensive capabilities and reliable results becomes a significant asset to the engineer. This paper introduces the ORBIS bearing tool. A discussion of key modeling assumptions and a technical overview are provided. Numerous validation studies and case studies using the ORBIS tool are presented. All results suggest the ORBIS code correlates closely with independent predictions of bearing internal load distributions, stiffness, deflection, and stresses.
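To see why the internal load distribution matters, a classical back-of-envelope estimate is useful (this is the textbook Stribeck approximation, not ORBIS's rigorous solution; the load and ball count below are illustrative assumptions): for a radially loaded ball bearing with nominal clearance, the most heavily loaded ball carries roughly five times the load it would carry if the balls shared the load evenly.

```python
import numpy as np

def stribeck_max_ball_load(radial_load_n, n_balls, contact_angle_deg=0.0):
    """Classical Stribeck estimate of peak ball load:
    Q_max ≈ 5 * F_r / (Z * cos(alpha))."""
    alpha = np.radians(contact_angle_deg)
    return 5.0 * radial_load_n / (n_balls * np.cos(alpha))

# Example: 1 kN radial load shared by 12 balls, zero contact angle.
q_max = stribeck_max_ball_load(1000.0, 12)
even_share = 1000.0 / 12                    # naive even-load assumption
print(f"peak ball load ≈ {q_max:.1f} N (vs. {even_share:.1f} N if shared evenly)")
```

The factor-of-five concentration is exactly the kind of effect a tool like ORBIS resolves in detail, since stiffness, torque, and life all hinge on the true peak contact load rather than the average.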

  17. A probabilistic methodology for radar cross section prediction in conceptual aircraft design

    NASA Astrophysics Data System (ADS)

    Hines, Nathan Robert

System effectiveness has increasingly become the prime metric for the evaluation of military aircraft. As such, it is the decision maker's/designer's goal to maximize system effectiveness. Industry and government research documents indicate that all future military aircraft will incorporate signature reduction as an attempt to improve system effectiveness and reduce the cost of attrition. Today's operating environments demand low-observable aircraft that can reliably take out valuable, time-critical targets. Thus it is desirable to be able to design vehicles that are balanced for increased effectiveness. Previous studies have shown that shaping of the vehicle is one of the most important contributors to radar cross section, a measure of radar signature, and must be considered from the very beginning of the design process. Radar cross section estimation should therefore be incorporated into conceptual design to develop more capable systems. This research strives to meet these needs by developing a conceptual design tool that predicts radar cross section for parametric geometries. This tool predicts the absolute radar cross section of the vehicle as well as the impact of geometry changes, allowing for the simultaneous tradeoff of the aerodynamic, performance, and cost characteristics of the vehicle with the radar cross section. Furthermore, this tool can be linked to a campaign theater analysis code to demonstrate the changes in system and system-of-systems effectiveness due to changes in aircraft geometry. A general methodology was developed and implemented, and sample computer codes were applied to prototype the proposed process. Studies utilizing this radar cross section tool were subsequently performed to demonstrate the capabilities of this method and show the impact that various inputs have on the outputs of these models. 
The F/A-18 aircraft configuration was chosen as a case study vehicle to perform a design space exercise and to investigate the relative impact of shaping parameters on radar cross section. Finally, two unique low observable configurations were analyzed to examine the impact of shaping for stealthiness.

  18. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective of developing and analyzing automated verification tools. To enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility that analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on a cost-effectiveness assessment.

  19. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. To obtain this end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS ascent tool (POST2), and a new tool was developed that optimizes the full problem by operating both simulations simultaneously.

  20. Implementation of a low-cost, commercial orbit determination system

    NASA Technical Reports Server (NTRS)

    Corrigan, Jim

    1994-01-01

    This paper describes the implementation and potential applications of a workstation-based orbit determination system developed by Storm Integration, Inc. called the Precision Orbit Determination System (PODS). PODS is offered as a layered product to the commercially-available Satellite Tool Kit (STK) produced by Analytical Graphics, Inc. PODS also incorporates the Workstation/Precision Orbit Determination (WS/POD) product offered by Van Martin System, Inc. The STK graphical user interface is used to access and invoke the PODS capabilities and to display the results. WS/POD is used to compute a best-fit solution to user-supplied tracking data. PODS provides the capability to simultaneously estimate the orbits of up to 99 satellites based on a wide variety of observation types including angles, range, range rate, and Global Positioning System (GPS) data. PODS can also estimate ground facility locations, Earth geopotential model coefficients, solar pressure and atmospheric drag parameters, and observation data biases. All determined data is automatically incorporated into the STK data base, which allows storage, manipulation and export of the data to other applications. PODS is offered in three levels: Standard, Basic GPS and Extended GPS. Standard allows processing of non-GPS observation types for any number of vehicles and facilities. Basic GPS adds processing of GPS pseudo-ranging data to the Standard capabilities. Extended GPS adds the ability to process GPS carrier phase data.
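The "best-fit solution to user-supplied tracking data" described above is, at its core, a weighted batch least-squares estimate. The sketch below is not PODS code; it illustrates the idea on a deliberately simplified linear problem, with hypothetical range observations and a one-dimensional position/speed state standing in for a full orbital state:

```python
import numpy as np

def batch_least_squares(times, ranges, sigma):
    """Weighted batch least-squares fit of a straight-line range model
    rho(t) = r0 + v0 * t; returns the estimated state [r0, v0]."""
    H = np.column_stack([np.ones_like(times), times])  # observation partials
    W = np.eye(len(times)) / sigma**2                  # measurement weights
    # Normal equations: (H^T W H) x = H^T W y
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ ranges)

# Hypothetical range observations (km) at four epochs (s)
t = np.array([0.0, 10.0, 20.0, 30.0])
rho = np.array([1000.0, 1070.0, 1142.0, 1209.0])
r0, v0 = batch_least_squares(t, rho, sigma=1.0)
print(r0, v0)
```

A real orbit determination system replaces the straight-line model with nonlinear orbital dynamics and iterates this correction, but the normal-equations step is the same.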

  1. Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges

    NASA Technical Reports Server (NTRS)

    He, Matt; Hardin, Danny; Conover, Helen; Graves, Sara; Meyer, Paul; Blakeslee, Richard; Goodman, Michael

    2012-01-01

    Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis. The development of this waypoint tool is directly affected by technology advances in GIS/mapping. From the standalone Google Earth application and simple KML functionalities, to the Google Earth Plugin and Java Web Start/Applet on the web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. 
The newly innovated, cross-platform, modular JavaScript-controlled Waypoint Planning Tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development process of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with its real time aspect, its interactive nature, and the resultant benefits to the airborne science community.

  2. The Way Point Planning Tool: Real Time Flight Planning for Airborne Science

    NASA Technical Reports Server (NTRS)

    He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John

    2012-01-01

    Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns; the coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks successfully. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool (WPT), an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analyses during and after each campaign helped identify both issues and new requirements, initiating the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis. The development of this waypoint tool is directly affected by technology advances in GIS/mapping. From the standalone Google Earth application and simple KML functionalities to the Google Earth Plugin and Java Web Start/Applet on the web platform, as well as to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. 
The newly innovated, cross-platform, modular JavaScript-controlled Waypoint tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development process of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with its real time aspect, its interactive nature, and the resultant benefits to the airborne science community.

  3. Development of Way Point Planning Tool in Response to NASA Field Campaign Challenges

    NASA Astrophysics Data System (ADS)

    He, M.; Hardin, D. M.; Conover, H.; Graves, S. J.; Meyer, P.; Blakeslee, R. J.; Goodman, M. L.

    2012-12-01

    Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists, and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis. The development of this waypoint tool is directly affected by technology advances in GIS/mapping. From the standalone Google Earth application and simple KML functionalities, to the Google Earth Plugin and Java Web Start/Applet on the web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. 
The newly innovated, cross-platform, modular-designed JavaScript-controlled Waypoint Tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development process of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with its real time aspect, its interactive nature, and the resultant benefits to the airborne science community.

  4. Expanded Capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian; Melaina, Marc; Penev, Michael

    This presentation describes how NREL expanded the capabilities for the Hydrogen Financial Analysis Scenario Tool (H2FAST) in FY16. It was presented at the U.S. Department of Energy Hydrogen and Fuel Cells Program 2016 Annual Merit Review and Peer Evaluation Meeting on June 8, 2016, in Washington, D.C.

  5. Decision Support Methods and Tools

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.

    2006-01-01

    This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.

  6. Effects of random aspects of cutting tool wear on surface roughness and tool life

    NASA Astrophysics Data System (ADS)

    Nabil, Ben Fredj; Mabrouk, Mohamed

    2006-10-01

    The effects of random aspects of cutting tool flank wear on surface roughness and on tool lifetime, when turning AISI 1045 carbon steel, were studied in this investigation. It was found that the standard deviations of tool flank wear and of surface roughness increase exponentially with cutting time. Under cutting conditions corresponding to finishing operations, no significant differences were found between the values of the capability index Cp calculated at the steady-state region of tool flank wear using the best-fit method, the Box-Cox transformation, or the assumption that the surface roughness data are normally distributed. Hence, a method could be established for setting cutting tool lifetime that simultaneously respects the desired average surface roughness and the required capability index.
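The capability index mentioned above has the standard definition Cp = (USL - LSL) / 6&sigma;. A minimal sketch, with hypothetical roughness readings and specification limits (not data from the study):

```python
import statistics

def process_capability_index(samples, lsl, usl):
    """Cp = (USL - LSL) / (6 * sigma), with sigma the sample standard
    deviation; Cp >= 1.33 is a common threshold for a capable process."""
    return (usl - lsl) / (6 * statistics.stdev(samples))

# Hypothetical Ra readings (micrometres) from a finishing pass,
# with hypothetical specification limits of 0.6-1.0 um
ra = [0.82, 0.79, 0.85, 0.81, 0.80, 0.83, 0.78, 0.84]
cp = process_capability_index(ra, lsl=0.6, usl=1.0)
print(round(cp, 2))
```

As flank wear progresses and the roughness spread grows, Cp falls; a tool-life criterion of the kind the abstract describes would retire the tool once Cp drops below the required value.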

  7. Aspects of ultra-high-precision diamond machining of RSA 443 optical aluminium

    NASA Astrophysics Data System (ADS)

    Mkoko, Z.; Abou-El-Hossein, K.

    2015-08-01

    Optical aluminium alloys such as 6061-T6 are traditionally used in ultra-high precision manufacturing for making optical mirrors for aerospace and other applications. However, the optics industry has recently witnessed the development of more advanced optical aluminium grades that are capable of addressing some of the issues encountered when turning with single-point natural monocrystalline diamond cutters. The advent of rapidly solidified aluminium (RSA) grades has generally opened up new possibilities for ultra-high precision manufacturing of optical components. In this study, experiments were conducted with single-point diamond cutters on rapidly solidified aluminium RSA 443 material. The objective was to observe the effects of depth of cut and feed rate, at a fixed rotational speed, on the tool wear rate and the resulting surface roughness of diamond-turned specimens, in order to gain further understanding of the rate of wear on the diamond cutters versus the surface texture generated on the RSA 443 material. The diamond machining experiments yielded machined surfaces that are less reflective but have consistent surface roughness values. Cutting tools were examined for wear through scanning microscopy; a relatively low wear pattern was evident on the diamond tool edge. The highest tool wear was obtained at a higher depth of cut and an increased feed rate.

  8. The Education and Public Engagement (EPE) Component of the Ocean Observatories Initiative (OOI): Enabling Near Real-Time Data Use in Undergraduate Classrooms

    NASA Astrophysics Data System (ADS)

    Glenn, S. M.; Companion, C.; Crowley, M.; deCharon, A.; Fundis, A. T.; Kilb, D. L.; Levenson, S.; Lichtenwalner, C. S.; McCurdy, A.; McDonnell, J. D.; Overoye, D.; Risien, C. M.; Rude, A.; Wieclawek, J., III

    2011-12-01

    The National Science Foundation's Ocean Observatories Initiative (OOI) is constructing observational and computer infrastructure that will provide sustained ocean measurements to study climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics over the next ~25-30 years. To accomplish this, the Consortium for Ocean Leadership established four Implementing Organizations: (1) Regional Scale Nodes; (2) Coastal and Global Scale Nodes; (3) Cyberinfrastructure (CI); and (4) Education and Public Engagement (EPE). The EPE, which we represent, was recently established to provide a new layer of cyber-interactivity for educators to bring near real-time data, images and videos of our Earth's oceans into their learning environments. Our focus over the next four years is engaging educators of undergraduates and free-choice learners. Demonstration projects of the OOI capabilities will use an Integrated Education Toolkit to access OOI data through the Cyberinfrastructure's On Demand Measurement Processing capability. We will present our plans to develop six education infrastructure software modules: Education Web Services (middleware), Visualization Tools, Concept Map and Lab/Lesson Builders, Collaboration Tools, and an Education Resources Database. The software release of these tools is staggered to coincide with other major OOI releases. The first release will include stand-alone versions of the first four EPE modules (Fall 2012). Next, all six EPE modules will be integrated within the OOI cyber-framework (Fall 2013). The last release will include advanced capabilities for all six modules within a collaborative network that leverages the CI's Integrated Observatory Network (Fall 2014). We are looking for undergraduate and informal science educators to provide feedback and guidance on the project; please contact us if you are interested in partnering with us.

  9. A survey and assessment of the capabilities of Cubesats for Earth observation

    NASA Astrophysics Data System (ADS)

    Selva, Daniel; Krejci, David

    2012-05-01

    In less than a decade, Cubesats have evolved from purely educational tools to a standard platform for technology demonstration and scientific instrumentation. The use of COTS (Commercial-Off-The-Shelf) components and the ongoing miniaturization of several technologies have already led to scattered instances of missions with promising scientific value. Furthermore, advantages in terms of development cost and development time with respect to larger satellites, as well as the possibility of launching several dozens of Cubesats with a single rocket launch, have brought forth the potential for radically new mission architectures consisting of very large constellations or clusters of Cubesats. These architectures promise to combine the temporal resolution of GEO missions with the spatial resolution of LEO missions, thus breaking a traditional trade-off in Earth observation mission design. This paper assesses the current capabilities of Cubesats with respect to potential employment in Earth observation missions. A thorough review of Cubesat bus technology capabilities is performed, identifying potential limitations and their implications on 17 different Earth observation payload technologies. These results are matched to an exhaustive review of scientific requirements in the field of Earth observation, assessing the possibilities of Cubesats to cope with the requirements set for each one of 21 measurement categories. Based on this review, several Earth observation measurements are identified that can potentially be compatible with the current state-of-the-art of Cubesat technology although some of them have actually never been addressed by any Cubesat mission. Simultaneously, other measurements are identified which are unlikely to be performed by Cubesats in the next few years due to insuperable constraints. Ultimately, this paper is intended to supply a box of ideas for universities to design future Cubesat missions with high scientific payoff.

  10. Emergency preparedness: community-based short-term eruption forecasting at Campi Flegrei

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Marzocchi, Warner; Civetta, Lucia; Del Pezzo, Edoardo; Papale, Paolo

    2010-05-01

    A key element in emergency preparedness is to define, in advance, tools to assist decision makers and emergency management groups during crises. Such tools must be prepared ahead of time, accounting for all the expertise and scientific knowledge accumulated over time. During a pre-eruptive phase, the key to sound short-term eruption forecasting is the analysis of the monitoring signals. This involves the capability (i) to recognize anomalous signals and to relate single or combined anomalies to physical processes, assigning them probability values, and (ii) to quickly provide an answer to the observed phenomena even when unexpected. Here we present a more than 4-year-long process devoted to defining the pre-eruptive Event Tree (ET) for Campi Flegrei. A community of about 40 experts in volcanology and volcano monitoring, participating in two Italian projects on Campi Flegrei funded by the Italian Civil Protection, was constituted and trained during periodic meetings on the statistical methods and on the model BET_EF (Marzocchi et al., 2008), which forms the statistical package for ET definition. Model calibration was carried out through public elicitation sessions, preceded and followed by dedicated meetings and web forum discussions on the monitoring parameters, their accuracy and relevance, and their potential meanings. The calibrated ET allows anomalies in the monitored parameters to be recognized and interpreted, assigning probability values to each set of data. This process de-personalizes the difficult task of interpreting multi-parametric sets of data during on-going emergencies, and provides a view of the observed variations that accounts for the averaged, weighted opinion of the scientific community. 
An additional positive outcome of the described ET calibration process is that it provides a picture of the expert community's degree of confidence in the capability of the many different monitored quantities to reveal significant variations in the state of the volcano. This picture is particularly useful because it can guide future upgrades of the monitoring network, as well as research investments aimed at substantially improving the capability to forecast short-term volcanic hazard.
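The mechanics of an event tree of this kind can be illustrated in miniature: expert-assigned probabilities at each node are combined into a weighted average, and conditional node probabilities multiply along a branch. This is a toy sketch with made-up probabilities and expert weights, not the BET_EF model itself:

```python
def weighted_opinion(probabilities, weights):
    """Combine expert-assigned probabilities at one tree node into a
    weighted average: the averaged, weighted opinion of the community."""
    return sum(p * w for p, w in zip(probabilities, weights)) / sum(weights)

def branch_probability(node_probs):
    """Probability of a branch: product of its conditional node probabilities."""
    result = 1.0
    for p in node_probs:
        result *= p
    return result

# Three hypothetical experts assess P(unrest); the weights are hypothetical too.
p_unrest = weighted_opinion([0.6, 0.8, 0.7], weights=[1.0, 2.0, 1.0])
# Hypothetical conditional probabilities for the deeper nodes of one branch:
# P(magmatic origin | unrest) and P(eruption | magmatic origin).
p_branch = branch_probability([p_unrest, 0.5, 0.3])
print(p_unrest, p_branch)
```

In the real system each node probability is updated from the monitoring anomalies themselves; the elicitation described above calibrates how each parameter maps to a node probability.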

  11. Distributed Observatory Management

    NASA Astrophysics Data System (ADS)

    Godin, M. A.; Bellingham, J. G.

    2006-12-01

    A collection of tools for collaboratively managing a coastal ocean observatory have been developed and used in a multi-institutional, interdisciplinary field experiment. The Autonomous Ocean Sampling Network program created these tools to support the Adaptive Sampling and Prediction (ASAP) field experiment that occurred in Monterey Bay in the summer of 2006. ASAP involved the day-to-day participation of a large group of researchers located across North America. The goal of these investigators was to adapt an array of observational assets to optimize data collection and analysis. Achieving the goal required continual interaction, but the long duration of the observatory made sustained co-location of researchers difficult. The ASAP team needed a remote collaboration tool, the capability to add non-standard, interdisciplinary data sets to the overall data collection, and the ability to retrieve standardized data sets from the collection. Over the course of several months and "virtual experiments," the Ocean Observatory Portal (COOP) collaboration tool was created, along with tools for centralizing, cataloging, and converting data sets into common formats, and tools for generating automated plots of the common format data. Accumulating the data in a central location and converting the data to common formats allowed any team member to manipulate any data set quickly, without having to rely heavily on the expertise of data generators to read the data. The common data collection allowed for the development of a wide range of comparison plots and allowed team members to assimilate new data sources into derived outputs such as ocean models quickly. In addition to the standardized outputs, team members were able to produce their own specialized products and link to these through the collaborative portal, which made the experimental process more interdisciplinary and interactive. COOP was used to manage the ASAP vehicle program from its start in July 2006. 
New summaries were posted to the COOP tool on a daily basis and updated with announcements on schedule, system status, the previous day's voting results, ocean, atmosphere, hardware, adaptive sampling, and coordinated control and forecast. The collection of standardized data files was used to generate daily plots of observed and predicted currents, temperature, and salinity. Team members were able to participate from any internet-accessible location using common Internet browsers, and any team member could add to the day's summary, point out trends, discuss observations, and make an adaptation proposal. If a team member submitted a proposal, team-wide discussion and voting followed. All interactions were archived and left publicly accessible so that future experiments could be made more systematic with increased automation. Collaboration and data handling tools are important for future ocean observatories, which will require 24-hour-a-day, 7-day-a-week interactions over many years. As demonstrated in the ASAP experiment, the COOP tool and associated data handling tools allowed scientists to coherently and collaboratively manage an ocean observatory without being co-located at the observatory. Lessons learned from operating these collaborative tools during the ASAP experiment provide an important foundation for creating even more capable portals.

  12. Multimodal optical imaging system for in vivo investigation of cerebral oxygen delivery and energy metabolism

    PubMed Central

    Yaseen, Mohammad A.; Srinivasan, Vivek J.; Gorczynska, Iwona; Fujimoto, James G.; Boas, David A.; Sakadžić, Sava

    2015-01-01

    Improving our understanding of brain function requires novel tools to observe multiple physiological parameters with high resolution in vivo. We have developed a multimodal imaging system for investigating multiple facets of cerebral blood flow and metabolism in small animals. The system was custom designed and features multiple optical imaging capabilities, including 2-photon and confocal lifetime microscopy, optical coherence tomography, laser speckle imaging, and optical intrinsic signal imaging. Here, we provide details of the system’s design and present in vivo observations of multiple metrics of cerebral oxygen delivery and energy metabolism, including oxygen partial pressure, microvascular blood flow, and NADH autofluorescence. PMID:26713212

  13. Analyze and predict VLTI observations: the Role of 2D/3D dust continuum radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Pascucci, I.; Henning, Th; Steinacker, J.; Wolf, S.

    2003-10-01

    Radiative Transfer (RT) codes with image capability are a fundamental tool for preparing interferometric observations and for interpreting visibility data. In view of the upcoming VLTI facilities, we present the first comparison of images/visibilities coming from two 3D codes that use completely different techniques to solve the problem of self-consistent continuum RT. In addition, we focus on the astrophysical case of a disk distorted by tidal interaction with by-passing stars or internal planets and investigate for which parameters the distortion can be best detected in the mid-infrared using the mid-infrared interferometric device MIDI.

  14. 2D/3D Dust Continuum Radiative Transfer Codes to Analyze and Predict VLTI Observations

    NASA Astrophysics Data System (ADS)

    Pascucci, I.; Henning, Th.; Steinacker, J.; Wolf, S.

    Radiative Transfer (RT) codes with image capability are a fundamental tool for preparing interferometric observations and for interpreting visibility data. In view of the upcoming VLTI facilities, we present the first comparison of images/visibilities coming from two 3D codes that use completely different techniques to solve the problem of self-consistent continuum RT. In addition, we focus on the astrophysical case of a disk distorted by tidal interaction with by-passing stars or internal planets and investigate for which parameters the distortion can be best detected in the mid-infrared using the mid-infrared interferometric device MIDI.

  15. Evaluation of CMIP5 Ability to Reproduce 20th Century Regional Trends in Surface Air Temperature and Precipitation over CONUS

    NASA Astrophysics Data System (ADS)

    Lee, J.; Waliser, D. E.; Lee, H.; Loikith, P. C.; Kunkel, K.

    2017-12-01

    Monitoring temporal changes in key climate variables, such as surface air temperature and precipitation, is an integral part of the ongoing efforts of the United States National Climate Assessment (NCA). Climate models participating in CMIP5 provide future trends for four different emissions scenarios. In order to have confidence in the future projections of surface air temperature and precipitation, it is crucial to evaluate the ability of CMIP5 models to reproduce observed trends over three time periods (1895-1939, 1940-1979, and 1980-2005). Towards this goal, trends in surface air temperature and precipitation obtained from the NOAA nClimGrid 5 km gridded station observation-based product are compared over all three time periods to the 206 CMIP5 historical simulations from 48 unique GCMs and their multi-model ensemble (MME) for NCA-defined climate regions during summer (JJA) and winter (DJF). This evaluation quantitatively examines the biases of the simulated trends of spatially averaged temperature and precipitation in the NCA climate regions. The CMIP5 MME reproduces historical surface air temperature trends for JJA for all time periods and all regions, except the Northern Great Plains during 1895-1939 and the Southeast during 1980-2005. Likewise, for DJF, the MME reproduces historical surface air temperature trends across all time periods over all regions except the Southeast during 1895-1939 and the Midwest during 1940-1979. The Regional Climate Model Evaluation System (RCMES), an analysis tool that supports the NCA by providing access to data and tools for regional climate model validation, facilitates the comparisons between the models and observations. The RCMES Toolkit is designed to assist in the analysis of climate variables and in the evaluation of climate projection models to support decision-making processes. 
This tool is used in conjunction with the above analysis and results will be presented to demonstrate its capability to access observation and model datasets, calculate evaluation metrics, and visualize the results. Several other examples of the RCMES capabilities can be found at https://rcmes.jpl.nasa.gov.
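The core of such a trend evaluation is simple to state: fit a linear trend to the regionally averaged observed and simulated series over a period, then difference the slopes. A minimal numpy sketch with synthetic series (not nClimGrid or CMIP5 data; the 0.03 and 0.02 K/yr warming rates are made up for illustration):

```python
import numpy as np

def trend_per_decade(years, values):
    """Least-squares linear trend, expressed in units per decade."""
    return np.polyfit(years, values, 1)[0] * 10.0

years = np.arange(1980, 2006)  # the 1980-2005 evaluation period
# Synthetic regional-mean temperatures: obs warm at 0.03 K/yr, model at 0.02 K/yr
obs = 15.0 + 0.03 * (years - 1980)
model = 15.2 + 0.02 * (years - 1980)

# Trend bias in K per decade; negative means the model warms too slowly
trend_bias = trend_per_decade(years, model) - trend_per_decade(years, obs)
print(trend_bias)
```

Note that the bias is computed on the trends, not the raw values, so a constant offset between model and observations (0.2 K here) does not affect it.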

  16. Mapping the Delivery of Societal Benefit through the International Arctic Observations Assessment Framework

    NASA Astrophysics Data System (ADS)

    Lev, S. M.; Gallo, J.

    2017-12-01

    The international Arctic scientific community has identified the need for a sustained and integrated portfolio of pan-Arctic Earth-observing systems. In 2017, an international effort was undertaken to develop the first ever Value Tree framework for identifying common research and operational objectives that rely on Earth observation data derived from Earth-observing systems, sensors, surveys, networks, models, and databases to deliver societal benefits in the Arctic. A Value Tree Analysis is a common tool used to support decision making processes and is useful for defining concepts, identifying objectives, and creating a hierarchical framework of objectives. A multi-level societal benefit area value tree establishes the connection from societal benefits to the set of observation inputs that contribute to delivering those benefits. A Value Tree that relies on expert domain knowledge from Arctic and non-Arctic nations, international researchers, Indigenous knowledge holders, and other experts to develop a framework to serve as a logical and interdependent decision support tool will be presented. Value tree examples that map the contribution of Earth observations in the Arctic to achieving societal benefits will be presented in the context of the 2017 International Arctic Observations Assessment Framework. These case studies will highlight specific observing products and capability groups where investment is needed to contribute to the development of a sustained portfolio of Arctic observing systems.

  17. Clinical Parameters and Tools for Home-Based Assessment of Parkinson's Disease: Results from a Delphi study.

    PubMed

    Ferreira, Joaquim J; Santos, Ana T; Domingos, Josefa; Matthews, Helen; Isaacs, Tom; Duffen, Joy; Al-Jawad, Ahmed; Larsen, Frank; Artur Serrano, J; Weber, Peter; Thoms, Andrea; Sollinger, Stefan; Graessner, Holm; Maetzler, Walter

    2015-01-01

Parkinson's disease (PD) is a neurodegenerative disorder with fluctuating symptoms. To aid the development of a system to evaluate people with PD (PwP) at home (SENSE-PARK system), there was a need to define parameters and tools to be applied in the assessment of 6 domains: gait, bradykinesia/hypokinesia, tremor, sleep, balance and cognition. The aim was to identify relevant parameters and assessment tools for the 6 domains from the perspective of PwP, caregivers and movement disorders specialists. A 2-round Delphi study was conducted to select a core set of parameters and assessment tools to be applied. This process included PwP, caregivers and movement disorders specialists. Two hundred and thirty-three PwP, caregivers and physicians completed the first-round questionnaire, and 50 completed the second. The results allowed the identification of parameters and assessment tools to be added to the SENSE-PARK system. The most consensual parameters were: Falls and Near Falls; Capability to Perform Activities of Daily Living; Interference with Activities of Daily Living; Capability to Process Tasks; and Capability to Recall and Retrieve Information. The most cited assessment strategies included Walkers; the Evaluation of Performance Doing Fine Motor Movements; Capability to Eat; Assessment of Sleep Quality; Identification of Circumstances and Triggers for Loss of Balance; and Memory Assessment. An agreed set of measuring parameters, tests, tools and devices was achieved to form part of a system to evaluate PwP at home. A different pattern of perspectives was identified for each stakeholder.

  18. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  19. Ionospheric Simulation System for Satellite Observations and Global Assimilative Modeling Experiments (ISOGAME)

    NASA Technical Reports Server (NTRS)

Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Iijima, Byron A.

    2013-01-01

ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observing system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing the modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules; the International Reference Ionosphere (IRI) model developed by the international ionospheric research community; an observation simulator; visualization software; and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++), which includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.

  20. [Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].

    PubMed

    Shinohara, Hiroyuki; Hashimoto, Takeyuki

    2015-01-01

We developed a text-data-based learning tool that integrates image processing and display in Excel. The knowledge required to program this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 pixels or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those obtained using the C language, and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images from binary-data display software; this comparison indicated that the image quality on the Excel worksheets was nearly equal to the latter in visual impression. Since image processing is performed on text data, the process is visible and can be checked against the mathematical equations within the program. We concluded that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing.
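The worksheet arithmetic this abstract describes can be illustrated outside Excel. A minimal Python sketch of one of the listed operations, a 3x3 mean spatial filter, the same neighborhood arithmetic a worksheet performs with relative cell references (not the tool's actual code):

```python
# Illustrative sketch: a 3x3 mean spatial filter over a small image,
# implemented in pure Python. Border pixels are left unchanged.

def mean_filter_3x3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # copy so borders stay as-is
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            s = sum(img[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s / 9.0            # mean of the 3x3 neighborhood
    return out

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
smoothed = mean_filter_3x3(img)
print(smoothed[1][1])  # 36/9 = 4.0
```

In the Excel version, the same computation is a single formula with relative references (e.g. an average over a 3x3 block of cells) copied across the output range.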

  1. In Pursuit of Improving Airburst and Ground Damage Predictions: Recent Advances in Multi-Body Aerodynamic Testing and Computational Tools Validation

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian

    2017-01-01

    An airburst from a large asteroid during entry can cause significant ground damage. The damage depends on the energy and the altitude of airburst. Breakup of asteroids into fragments and their lateral spread have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds and the spread of fragments is needed for a true predictive capability. Current models use heuristic arguments and assumptions such as pancaking or point source explosive energy release at pre-determined altitude or an assumed fragmentation spread rate to predict airburst damage. A multi-year collaboration between German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address the above challenge.

  2. Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian

    2011-01-01

Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point-definition geometries in a conceptual analysis and design framework.

  3. WFIRST Science Operations at STScI

    NASA Astrophysics Data System (ADS)

    Gilbert, Karoline; STScI WFIRST Team

    2018-06-01

With sensitivity and resolution comparable to those of the Hubble Space Telescope, and a field of view 100 times larger, the Wide Field Instrument (WFI) on WFIRST will be a powerful survey instrument. STScI will be the Science Operations Center (SOC) for the WFIRST mission, with additional science support provided by the Infrared Processing and Analysis Center (IPAC) and foreign partners. STScI will schedule and archive all WFIRST observations; calibrate and produce pipeline-reduced data products for imaging with the Wide Field Instrument; support the High Latitude Imaging and Supernova Survey Teams; and support the astronomical community in planning WFI imaging observations and analyzing the data. STScI has developed detailed concepts for WFIRST operations, including a data management system integrating data processing and the archive, which will include a novel, cloud-based framework for high-level data processing, providing a common environment accessible to all users (STScI operations, Survey Teams, General Observers, and archival investigators). To aid the astronomical community in examining the capabilities of WFIRST, STScI has built several simulation tools. We describe the functionality of each tool and give examples of its use.

  4. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T.

    2010-12-01

The University of Wisconsin's Space Science and Engineering Center (SSEC) has been at the forefront of developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, four-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real-time data.
At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and powerful tools. This presentation will describe the McIDAS-V software and demonstrate some of the capabilities of McIDAS-V to analyze and display many types of global data. The presentation will also focus on describing how McIDAS-V can be used as an educational window to examine global geophysical data. Consecutive polar orbiting passes of NASA MODIS and CALIPSO observations

  5. A real-time intercepting beam-profile monitor for a medical cyclotron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendriks, C.; Uittenbosch, T.; Cameron, D.

    2013-11-15

    There is a lack of real-time continuous beam-diagnostic tools for medical cyclotrons due to high power deposition during proton irradiation. To overcome this limitation, we have developed a profile monitor that is capable of providing continuous feedback about beam shape and current in real time while it is inserted in the beam path. This enables users to optimize the beam profile and observe fluctuations in the beam over time with periodic insertion of the monitor.

  6. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

The NASA IDEAS system is a tool for interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  7. An extensive coronagraphic simulation applied to LBT

    NASA Astrophysics Data System (ADS)

    Vassallo, D.; Carolo, E.; Farinato, J.; Bergomi, M.; Bonavita, M.; Carlotti, A.; D'Orazi, V.; Greggio, D.; Magrin, D.; Mesa, D.; Pinna, E.; Puglisi, A.; Stangalini, M.; Verinaud, C.; Viotto, V.

    2016-08-01

In this article we report the results of a comprehensive simulation program aimed at investigating the coronagraphic capabilities of SHARK-NIR, a camera selected to proceed to the final design phase at the Large Binocular Telescope. For this purpose, we developed a dedicated simulation tool based on physical optics propagation. The code propagates wavefronts through the SHARK optical train in an end-to-end fashion and can implement any kind of coronagraph. Detection limits can then be computed, exploring a wide range of Strehl values and observing conditions.

  8. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

NASIC's Investment in Analytical Capabilities … Study Limitations … This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to … capabilities. Though this study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing …

  9. Subsonic Wing Optimization for Handling Qualities Using ACSYNT

    NASA Technical Reports Server (NTRS)

    Soban, Danielle Suzanne

    1996-01-01

    The capability to accurately and rapidly predict aircraft stability derivatives using one comprehensive analysis tool has been created. The PREDAVOR tool has the following capabilities: rapid estimation of stability derivatives using a vortex lattice method, calculation of a longitudinal handling qualities metric, and inherent methodology to optimize a given aircraft configuration for longitudinal handling qualities, including an intuitive graphical interface. The PREDAVOR tool may be applied to both subsonic and supersonic designs, as well as conventional and unconventional, symmetric and asymmetric configurations. The workstation-based tool uses as its model a three-dimensional model of the configuration generated using a computer aided design (CAD) package. The PREDAVOR tool was applied to a Lear Jet Model 23 and the North American XB-70 Valkyrie.
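Rapid stability-derivative estimation of the kind described above ultimately reduces to differentiating force and moment coefficients with respect to flight-state perturbations. A hedged sketch of that step (the toy linear `cm()` pitching-moment model and its coefficients are invented; in a tool like PREDAVOR a vortex-lattice solve would stand in its place):

```python
# Central-difference estimate of a longitudinal stability derivative,
# here Cm_alpha: the slope of pitching-moment coefficient vs. angle of attack.

def cm(alpha_rad):
    # Toy linear pitching-moment model (hypothetical numbers):
    # Cm0 = 0.02, Cm_alpha = -0.8 per radian (statically stable).
    return 0.02 - 0.8 * alpha_rad

def stability_derivative(f, x0, h=1e-4):
    """Central difference: (f(x0+h) - f(x0-h)) / (2h)."""
    return (f(x0 + h) - f(x0 - h)) / (2.0 * h)

print(round(stability_derivative(cm, 0.0), 3))  # -0.8
```

For a linear aerodynamic model the central difference is exact; for a real solver the step size h trades truncation error against solver noise.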

  10. Tools and methods for experimental in-vivo measurement and biomechanical characterization of an Octopus vulgaris arm.

    PubMed

    Margheri, Laura; Mazzolai, Barbara; Cianchetti, Matteo; Dario, Paolo; Laschi, Cecilia

    2009-01-01

This work illustrates new tools and methods for an in vivo, direct, but non-invasive measurement of the mechanical properties of an octopus arm. The active elongation (longitudinal stretch) and the pulling-force capability are measured on a specimen of Octopus vulgaris in order to quantitatively characterize the parameters describing the arm mechanics, for biomimetic design purposes. The novel approach consists of observing and measuring a living octopus with minimally invasive methods, which allow the animal to move with its complete ability. All tools are conceived to create a collaborative interaction with the animal for the acquisition of active measures. The data analysis takes into account the presence of an intrinsic error due to the mobility of the subject and the aquatic environment. Using a system of two synchronized high-speed, high-resolution cameras and purpose-made instruments, the maximum elongation of an arm and its rest length (when all muscle fibres are relaxed during the propulsion movement) are measured and compared to define the longitudinal stretch, with an impressive average result of 194%. With a similar setup integrated with a force sensor, the pulling-force capability is measured as a function of grasp-point position along the arm. The measured parameters are used as real specifications for the design of an octopus-like arm with a biomimetic approach.
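The reported 194% longitudinal stretch can be expressed as a simple ratio. A hedged sketch, assuming the conventional definition of stretch as elongated length over rest length (the lengths below are made-up illustrative values, not the paper's measurements):

```python
# Longitudinal stretch as a percentage of rest length.
# Assumption: stretch% = 100 * (maximum elongated length / rest length).

def longitudinal_stretch_percent(max_length, rest_length):
    return 100.0 * max_length / rest_length

# Hypothetical arm lengths in cm chosen to reproduce the 194% average:
print(round(longitudinal_stretch_percent(48.5, 25.0), 1))  # 194.0
```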

  11. Development of constraint-based system-level models of microbial metabolism.

    PubMed

    Navid, Ali

    2012-01-01

    Genome-scale models of metabolism are valuable tools for using genomic information to predict microbial phenotypes. System-level mathematical models of metabolic networks have been developed for a number of microbes and have been used to gain new insights into the biochemical conversions that occur within organisms and permit their survival and proliferation. Utilizing these models, computational biologists can (1) examine network structures, (2) predict metabolic capabilities and resolve unexplained experimental observations, (3) generate and test new hypotheses, (4) assess the nutritional requirements of the organism and approximate its environmental niche, (5) identify missing enzymatic functions in the annotated genome, and (6) engineer desired metabolic capabilities in model organisms. This chapter details the protocol for developing genome-scale models of metabolism in microbes as well as tips for accelerating the model building process.
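Predicting metabolic capabilities with these models is usually done by flux balance analysis: fluxes v must satisfy the steady-state constraint S·v = 0 within bounds, and an objective flux is maximized. A minimal, dependency-free sketch on a toy three-reaction network (the network and bounds are hypothetical; genome-scale models hand this to a linear-programming solver):

```python
# Toy network: R1 (uptake -> A), R2 (A -> B), R3 (B -> biomass/export).
# Stoichiometric matrix S: rows are metabolites, columns are reactions.
S = [[ 1, -1,  0],   # metabolite A: produced by R1, consumed by R2
     [ 0,  1, -1]]   # metabolite B: produced by R2, consumed by R3

def is_steady_state(S, v, tol=1e-9):
    """Check S.v = 0: no net accumulation of internal metabolites."""
    return all(abs(sum(row[j] * v[j] for j in range(len(v)))) < tol
               for row in S)

# For this linear chain, steady state forces v1 = v2 = v3, so the maximal
# objective flux v3 equals the tightest upper bound along the chain.
upper = [10.0, 1000.0, 1000.0]   # e.g., nutrient uptake capped at 10
v_opt = [min(upper)] * 3

print(is_steady_state(S, v_opt), v_opt[2])  # True 10.0
```

Real genome-scale models have thousands of reactions, so the analytic shortcut above is replaced by a general LP (maximize cᵀv subject to S·v = 0 and lb ≤ v ≤ ub).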

  12. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  13. Data Assimilation as a Tool for Developing a Mars International Reference Atmosphere

    NASA Technical Reports Server (NTRS)

    Houben, Howard

    2005-01-01

A new paradigm for a Mars International Reference Atmosphere is proposed. In general, as is certainly now the case for Mars, there are sufficient observational data to specify what the full atmospheric state was under a variety of circumstances (season, dustiness, etc.). There are also general circulation models capable of determining the evolution of these states. If these capabilities are combined, using data assimilation techniques, the resulting analyzed states can be probed to answer a wide variety of questions, whether posed by scientists, mission planners, or others. This system would fulfill all the purposes of an international reference atmosphere and would make the scientific results of exploration missions readily available to the community. Preliminary work on a website that would incorporate this functionality has begun.

  15. Organic Scintillator Detector Response Simulations with DRiFT

    DOE PAGES

    Andrews, Madison Theresa; Bates, Cameron Russell; Mckigney, Edward Allen; ...

    2016-06-11

Here, this work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.

  16. Organic scintillator detector response simulations with DRiFT

    NASA Astrophysics Data System (ADS)

    Andrews, M. T.; Bates, C. R.; McKigney, E. A.; Solomon, C. J.; Sood, A.

    2016-09-01

This work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify the scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.
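The pulse shape discrimination plots mentioned above rest on the charge-comparison idea: in organic scintillators, neutrons deposit relatively more light in the slow tail of the pulse than gammas do. A hedged toy illustration (the pulse samples and the tail split point are invented, not DRiFT output):

```python
# Charge-comparison PSD metric: tail integral over total integral of a
# digitized scintillator pulse. Larger values indicate neutron-like events.

def psd_ratio(pulse, tail_start):
    total = sum(pulse)
    tail = sum(pulse[tail_start:])
    return tail / total

gamma_like   = [0, 40, 100, 30, 8, 3, 1, 0]     # fast decay, small tail
neutron_like = [0, 40, 100, 30, 20, 14, 9, 5]   # enhanced slow tail

print(psd_ratio(gamma_like, 4) < psd_ratio(neutron_like, 4))  # True
```

A PSD plot then scatters this ratio against total pulse energy, separating the neutron and gamma bands.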

  17. Geometry and gravity influences on strength capability

    NASA Technical Reports Server (NTRS)

    Poliner, Jeffrey; Wilmington, Robert P.; Klute, Glenn K.

    1994-01-01

Strength, defined as the capability of an individual to produce an external force, is one of the most important determining characteristics of human performance. Knowledge of the strength capabilities of a group of individuals can be applied to designing equipment and workplaces, planning procedures and tasks, and training individuals. In the manned space program, with the high risk and cost associated with spaceflight, information pertaining to human performance is important to ensuring mission success and safety. Knowledge of individuals' strength capabilities in weightlessness is of interest within many areas of NASA, including workplace design, tool development, and mission planning. The weightless environment of space places the human body in a completely different context. Astronauts perform a variety of manual tasks while in orbit. Their ability to perform these tasks is partly determined by their strength capability as demanded by each particular task. Thus, an important step in task planning, development, and evaluation is to determine the ability of the humans performing it. This can be accomplished by utilizing quantitative techniques to develop a database of human strength capabilities in weightlessness. Furthermore, if strength characteristics are known, equipment and tools can be built to optimize the operators' performance. This study examined strength in performing a simple task, specifically, using a tool to apply a torque to a fixture.

  18. Space Weather Products at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Kuznetsova, M.; Pulkkinen, A.; Maddox, M.; Rastaetter, L.; Berrios, D.; MacNeice, P.

    2010-01-01

The Community Coordinated Modeling Center (CCMC) is a US inter-agency activity aimed at research in support of the generation of advanced space weather models. As one of its main functions, the CCMC provides researchers with the use of space science models, even if they are not model owners themselves. The second CCMC activity is to support space weather forecasting at national Space Weather Forecasting Centers. This second activity involves model evaluations, model transitions to operations, and the development of space weather forecasting tools. Owing to the pace of development in the science community, new model capabilities emerge frequently. Consequently, space weather products and tools offer not only increased validity but often entirely new capabilities. This presentation will review the present state of space weather tools as well as point out emerging future capabilities.

  19. Additive manufacturing: Toward holistic design

    DOE PAGES

    Jared, Bradley H.; Aguilo, Miguel A.; Beghini, Lauren L.; ...

    2017-03-18

    Here, additive manufacturing offers unprecedented opportunities to design complex structures optimized for performance envelopes inaccessible under conventional manufacturing constraints. Additive processes also promote realization of engineered materials with microstructures and properties that are impossible via traditional synthesis techniques. Enthused by these capabilities, optimization design tools have experienced a recent revival. The current capabilities of additive processes and optimization tools are summarized briefly, while an emerging opportunity is discussed to achieve a holistic design paradigm whereby computational tools are integrated with stochastic process and material awareness to enable the concurrent optimization of design topologies, material constructs and fabrication processes.

  20. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance EnCase Forensic 7 regarding performance, functionality, usability and capability. We will show how these software tools work with large forensic images and how capable they are at examining complex and big data scenarios.

  2. 49 CFR 563.12 - Data retrieval tools.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 6 2011-10-01 2011-10-01 false Data retrieval tools. 563.12 Section 563.12... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EVENT DATA RECORDERS § 563.12 Data retrieval tools. Each... tool(s) is commercially available that is capable of accessing and retrieving the data stored in the...

  3. Satellite laser ranging as a tool for the recovery of tropospheric gradients

    NASA Astrophysics Data System (ADS)

    Drożdżewski, M.; Sośnica, K.

    2018-11-01

    Space geodetic techniques, such as Global Navigation Satellite Systems (GNSS) and Very Long Baseline Interferometry (VLBI) have been extensively used for the recovery of the tropospheric parameters. Both techniques employ microwave observations, for which the troposphere is a non-dispersive medium and which are very sensitive to the water vapor content. Satellite laser ranging (SLR) is the only space geodetic technique used for the definition of the terrestrial reference frames which employs optical - laser observations. The SLR sensitivity to the hydrostatic part of the troposphere delay is similar to that of microwave observations, whereas the sensitivity of laser observations to non-hydrostatic part of the delay is about two orders of magnitude smaller than in the case of microwave observations. Troposphere is a dispersive medium for optical wavelengths, which means that the SLR tropospheric delay depends on the laser wavelength. This paper presents the sensitivity and capability of the SLR observations for the recovery of azimuthal asymmetry over the SLR stations, which can be described as horizontal gradients of the troposphere delay. For the first time, the horizontal gradients are estimated, together with other parameters typically estimated from the SLR observations to spherical LAGEOS satellites, i.e., station coordinates, earth rotation parameters, and satellite orbits. Most of the SLR stations are co-located with GNSS receivers, thus, a cross-correlation between both techniques is possible. We compare our SLR horizontal gradients to GNSS results and to the horizontal gradients derived from the numerical weather models (NWM). Due to a small number of the SLR observations, SLR is not capable of reconstructing short-period phenomena occurring in the atmosphere. However, the long-term analysis allows for the recovery of the atmosphere asymmetry using SLR. 
As a result, the mean offsets of the SLR-derived horizontal gradients agree at the level of 47%, 74%, and 54% with GNSS, the hydrostatic delay, and the total delay from NWM, respectively. SLR can thus be employed as a tool for the recovery of atmospheric parameters, with its major sensitivity to the hydrostatic part of the delay.
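To make the gradient estimation concrete, a minimal sketch of the standard two-parameter horizontal-gradient delay model follows. The tilted-atmosphere form and the Chen-Herring-style gradient mapping function are textbook conventions, not taken from this paper; the coefficient C = 0.0032 and all function and parameter names are assumptions for illustration.

```python
import math

def gradient_delay(elev_deg, azim_deg, g_north, g_east, c=0.0032):
    """Slant-delay contribution (same units as the gradient parameters,
    typically mm) of horizontal tropospheric gradients:

        dL = mf_g(e) * (G_N * cos(a) + G_E * sin(a))

    with a Chen-Herring-style gradient mapping function
        mf_g(e) = 1 / (sin(e) * tan(e) + C).
    C = 0.0032 is an assumed, commonly quoted value.
    """
    e = math.radians(elev_deg)
    a = math.radians(azim_deg)
    mf = 1.0 / (math.sin(e) * math.tan(e) + c)
    return mf * (g_north * math.cos(a) + g_east * math.sin(a))

# A 1 mm north gradient contributes most at low elevation, and nothing
# toward due east or at zenith:
print(gradient_delay(20.0, 0.0, 1.0, 0.0))             # ~7.8 mm at 20 deg elevation
print(round(gradient_delay(90.0, 0.0, 1.0, 0.0), 6))   # 0.0 at zenith
```

The two estimated parameters per station are just G_N and G_E; everything else is a deterministic function of the observation geometry, which is why long observation series can recover the asymmetry even when individual passes are sparse.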

  4. Multicolor Three-Dimensional Tracking for Single-Molecule Fluorescence Resonance Energy Transfer Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Aaron M.; DeVore, Matthew S.; Stich, Dominik G.

Single-molecule fluorescence resonance energy transfer (smFRET) remains a widely utilized and powerful tool for quantifying heterogeneous interactions and conformational dynamics of biomolecules. However, traditional smFRET experiments either are limited to short observation times (typically less than 1 ms) in the case of “burst” confocal measurements or require surface immobilization, which usually has a temporal resolution limited by the camera framing rate. We developed a smFRET 3D tracking microscope that is capable of observing single particles for extended periods of time with high temporal resolution. The confocal tracking microscope utilizes closed-loop feedback to follow the particle in solution by recentering it within two overlapping tetrahedral detection elements, corresponding to donor and acceptor channels. We demonstrated the microscope’s multicolor tracking capability via random walk simulations and experimental tracking of 200 nm fluorescent beads in water with a range of apparent smFRET efficiency values, 0.45-0.69. We also demonstrated the microscope’s capability to track and quantify double-stranded DNA undergoing intramolecular smFRET in a viscous glycerol solution. In future experiments, the smFRET 3D tracking system will be used to study protein conformational dynamics while diffusing in solution and native biological environments with high temporal resolution.
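As a rough illustration of how apparent smFRET efficiency values like the 0.45-0.69 range quoted above are typically computed from the donor and acceptor detection channels, here is the standard uncorrected proximity ratio. The function name is hypothetical, and the corrections a real analysis would apply (spectral crosstalk, detection efficiencies, gamma factor) are deliberately omitted.

```python
def apparent_fret_efficiency(donor_counts, acceptor_counts):
    """Uncorrected apparent smFRET efficiency (proximity ratio):

        E_app = I_A / (I_A + I_D)

    where I_D and I_A are donor- and acceptor-channel photon counts.
    """
    total = donor_counts + acceptor_counts
    if total == 0:
        raise ValueError("no photons detected")
    return acceptor_counts / total

# Photon-count splits spanning the 0.45-0.69 bead range reported above:
print(apparent_fret_efficiency(55, 45))  # 0.45
print(apparent_fret_efficiency(31, 69))  # 0.69
```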

  5. Multicolor Three-Dimensional Tracking for Single-Molecule Fluorescence Resonance Energy Transfer Measurements

    DOE PAGES

    Keller, Aaron M.; DeVore, Matthew S.; Stich, Dominik G.; ...

    2018-04-19

Single-molecule fluorescence resonance energy transfer (smFRET) remains a widely utilized and powerful tool for quantifying heterogeneous interactions and conformational dynamics of biomolecules. However, traditional smFRET experiments either are limited to short observation times (typically less than 1 ms) in the case of “burst” confocal measurements or require surface immobilization, which usually has a temporal resolution limited by the camera framing rate. We developed a smFRET 3D tracking microscope that is capable of observing single particles for extended periods of time with high temporal resolution. The confocal tracking microscope utilizes closed-loop feedback to follow the particle in solution by recentering it within two overlapping tetrahedral detection elements, corresponding to donor and acceptor channels. We demonstrated the microscope’s multicolor tracking capability via random walk simulations and experimental tracking of 200 nm fluorescent beads in water with a range of apparent smFRET efficiency values, 0.45-0.69. We also demonstrated the microscope’s capability to track and quantify double-stranded DNA undergoing intramolecular smFRET in a viscous glycerol solution. In future experiments, the smFRET 3D tracking system will be used to study protein conformational dynamics while diffusing in solution and native biological environments with high temporal resolution.

  6. Integrating reliability and maintainability into a concurrent engineering environment

    NASA Astrophysics Data System (ADS)

    Phillips, Clifton B.; Peterson, Robert R.

    1993-02-01

This paper describes the results of a reliability and maintainability study conducted at the University of California, San Diego and supported by private industry, which considered the study important and provided the university access to innovative tools under cooperative agreement. The current capabilities of reliability and maintainability tools, and how they fit into the design process, are investigated. The evolution of design methodologies leading up to today's capability is reviewed for ways to enhance the design process while keeping cost under control. A method for measuring the consequences of reliability and maintainability policy for design configurations in an electronic environment is provided. The interaction of selected modern computer tool sets is described for reliability, maintainability, operations, and other elements of the engineering design process. These tools provide a robust system evaluation capability that brings life-cycle performance improvement information to engineers and their managers before systems are deployed, and allows them to monitor and track performance while systems are in operation.

  7. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements, and it follows desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  8. AstroBlend: An astrophysical visualization package for Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2016-04-01

The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three-dimensional modeling software Blender. While Blender has been popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three-dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, workflow, and various example visualizations. A website, www.astroblend.com, has been developed that includes tutorials and a gallery of example images and movies, along with links to downloadable data, three-dimensional artistic models, and various other resources.

  9. Implementation of a tree algorithm in MCNP code for nuclear well logging applications.

    PubMed

    Li, Fusheng; Han, Xiaogang

    2012-07-01

The goal of this paper is to develop some modeling capabilities that are missing from the current MCNP code. These capabilities can greatly aid the design of certain nuclear tools, such as a nuclear lithology/mineralogy spectroscopy tool. The new capabilities developed in this paper include the following: zone tally, neutron interaction tally, gamma-ray index tally, and enhanced pulse-height tally. The patched MCNP code can also be used to compute the neutron slowing-down length and the thermal neutron diffusion length. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Overlay Tolerances For VLSI Using Wafer Steppers

    NASA Astrophysics Data System (ADS)

    Levinson, Harry J.; Rice, Rory

    1988-01-01

In order for VLSI circuits to function properly, the masking layers used in the fabrication of those devices must overlay each other to within the manufacturing tolerance incorporated in the circuit design. The capabilities of the alignment tools used in the masking process determine the overlay tolerances to which circuits can be designed. It is therefore of considerable importance that these capabilities be well characterized. Underestimation of the overlay accuracy leads to unnecessarily large devices, resulting in poor utilization of wafer area and possible degradation of device performance. Overestimation will result in significant yield loss because of the failure to conform to the tolerances of the design rules. The proper methodology for determining the overlay capabilities of wafer steppers, the most commonly used alignment tool for the production of VLSI circuits, is the subject of this paper. Because cost-effective manufacturing process technology has been the driving force of VLSI, the impact on productivity is a primary consideration in all discussions. Manufacturers of alignment tools advertise the capabilities of their equipment. It is notable that no manufacturer currently characterizes his aligners in a manner consistent with the requirements of producing very large integrated circuits, as will be discussed. This has created a situation in which the evaluation and comparison of the capabilities of alignment tools require the attention of a lithography specialist. Unfortunately, lithographic capabilities must be known by many other people, particularly the circuit designers and the managers responsible for the financial consequences of the high prices of modern alignment tools. All too frequently, the designer or manager is confronted with contradictory data, one set coming from his lithography specialist, and the other coming from a sales representative of an equipment manufacturer. 
Since the latter generally attempts to make his merchandise appear as attractive as possible, the lithographer is frequently placed in the position of having to explain subtle issues in order to justify his decisions. It is the purpose of this paper to provide that explanation.

  11. The Worldwide Interplanetary Scintillation (IPS) Stations (WIPSS) Network October 2016 Observing Campaign: Initial WIPSS Data Analyses

    NASA Astrophysics Data System (ADS)

    Bisi, M. M.; Fallows, R. A.; Jackson, B. V.; Tokumaru, M.; Gonzalez-Esparza, A.; Morgan, J.; Chashei, I. V.; Mejia-Ambriz, J.; Tyul'bashev, S. A.; Manoharan, P. K.; De la Luz, V.; Aguilar-Rodriguez, E.; Yu, H. S.; Barnes, D.; Chang, O.; Odstrcil, D.; Fujiki, K.; Shishov, V.

    2017-12-01

Interplanetary Scintillation (IPS) allows for the determination of velocity and a proxy for plasma density to be made throughout the corona and inner heliosphere. Where sufficient observations are undertaken, the results can be used as input to the University of California, San Diego (UCSD) three-dimensional (3-D) time-dependent tomography suite to allow for the full 3-D reconstruction of both velocity and density throughout the inner heliosphere. By combining IPS results from multiple observing locations around the planet, we can increase both the temporal and spatial coverage across the whole of the inner heliosphere and hence improve forecast capability. During October 2016, a unique opportunity arose whereby the European-based LOw Frequency ARray (LOFAR) radio telescope was used to make nearly four weeks of continuous observations of IPS as a heliospheric space-weather trial campaign. This was expanded into a global effort to include observations of IPS from the Murchison Widefield Array (MWA) in Western Australia and many more observations from various IPS-dedicated WIPSS Network systems. LOFAR is a next-generation low-frequency radio interferometer capable of observing in the radio frequency range 10-250 MHz, nominally with up to 80 MHz bandwidth at a time. The MWA is capable of observing in the 80-300 MHz frequency range, nominally using up to 32 MHz of bandwidth. IPS data from LOFAR, ISEE, the MEXican Array Radio Telescope (MEXART), and, where possible, other WIPSS Network systems (such as LPI-BSA and Ooty) will be used in this study, and we will present some initial findings for these data sets. We also make a first attempt at the 3-D reconstruction of multiple pertinent WIPSS results in the UCSD tomography. We will also try to highlight some of the potential future tools that make LOFAR a unique system for testing and validating a wide variety of IPS analysis methods with the same set of IPS data.
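The velocity determination underlying IPS analyses is classically done by cross-correlating scintillation time series from spaced stations: the drift speed is the projected baseline divided by the lag of the correlation peak. The sketch below illustrates that idea on synthetic data; it is not the UCSD tomography code, and all names and numbers are illustrative.

```python
import numpy as np

def ips_drift_speed(sig_a, sig_b, baseline_km, dt_s):
    """Estimate the scintillation-pattern drift speed (km/s) from two
    spaced-station IPS time series.

    Classic two-station method: the pattern seen at station B lags
    station A by tau = baseline / v, so the lag of the cross-correlation
    peak gives the speed. Assumes the baseline is already projected
    along the drift direction.
    """
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    xcorr = np.correlate(b, a, mode="full")
    lag_s = (np.argmax(xcorr) - (len(a) - 1)) * dt_s
    if lag_s <= 0:
        raise ValueError("no positive lag; check station ordering")
    return baseline_km / lag_s

# Synthetic pattern drifting at 400 km/s across a 200 km baseline,
# i.e. a 0.5 s lag sampled at 10 Hz (5 samples):
rng = np.random.default_rng(0)
sig_a = rng.standard_normal(2048)
sig_b = np.roll(sig_a, 5)
print(ips_drift_speed(sig_a, sig_b, baseline_km=200.0, dt_s=0.1))  # ~400.0
```

Multi-station networks like WIPSS generalize this pairwise idea, which is one reason adding observing sites improves both spatial coverage and the reliability of the recovered velocities.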

  12. 76 FR 11361 - Defense Federal Acquisition Regulation Supplement; Preservation of Tooling for Major Defense...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... tooling, but should include ``all property, i.e., special test equipment, ground support equipment, machine tools and machines and other intangibles to maintain capability.'' Response: DoD is fully...

  13. TU-A-17A-02: In Memoriam of Ben Galkin: Virtual Tools for Validation of X-Ray Breast Imaging Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, K; Bakic, P; Abbey, C

    2014-06-15

This symposium will explore simulation methods for the preclinical evaluation of novel 3D and 4D x-ray breast imaging systems – the subject of AAPM task group TG234. Given the complex design of modern imaging systems, simulations offer significant advantages over long and costly clinical studies in terms of reproducibility, reduced radiation exposures, a known reference standard, and the capability for studying patient and disease subpopulations through appropriate choice of simulation parameters. Our focus will be on testing the realism of software anthropomorphic phantoms and virtual clinical trials tools developed for the optimization and validation of breast imaging systems. The symposium will review the state-of-the-science, as well as the advantages and limitations of various approaches to testing realism of phantoms and simulated breast images. Approaches based upon the visual assessment of synthetic breast images by expert observers will be contrasted with approaches based upon comparing statistical properties between synthetic and clinical images. The role of observer models in the assessment of realism will be considered. Finally, an industry perspective will be presented, summarizing the role and importance of virtual tools and simulation methods in product development. The challenges and conditions that must be satisfied in order for computational modeling and simulation to play a significantly increased role in the design and evaluation of novel breast imaging systems will be addressed. Learning Objectives: Review the state-of-the-science in testing realism of software anthropomorphic phantoms and virtual clinical trials tools; Compare approaches based upon the visual assessment by expert observers vs. the analysis of statistical properties of synthetic images; Discuss the role of observer models in the assessment of realism; Summarize the industry perspective on virtual methods for breast imaging.

  14. Content and functional specifications for a standards-based multidisciplinary rounding tool to maintain continuity across acute and critical care.

    PubMed

    Collins, Sarah; Hurley, Ann C; Chang, Frank Y; Illa, Anisha R; Benoit, Angela; Laperle, Sarah; Dykes, Patricia C

    2014-01-01

Maintaining continuity of care (CoC) in the inpatient setting is dependent on aligning goals and tasks with the plan of care (POC) during multidisciplinary rounds (MDRs). A number of locally developed rounding tools exist, yet there is a lack of standard content and functional specifications for electronic tools to support MDRs within and across settings. The objective was to identify content and functional requirements for an MDR tool to support CoC. We collected discrete clinical data elements (CDEs) discussed during rounds for 128 acute and critical care patients. To capture CDEs, we developed and validated an iPad-based observational tool based on informatics CoC standards. We observed 19 days of rounds and conducted eight group and individual interviews. Descriptive and bivariate statistical analyses and network visualization were used to understand associations between CDEs discussed during rounds, with a particular focus on the POC. Qualitative data were thematically analyzed. All analyses were triangulated. We identified the need for universal and configurable MDR tool views across settings and users, and the provision of messaging capability. Eleven empirically derived universal CDEs were identified, including four POC CDEs: problems, plan, goals, and short-term concerns. Configurable POC CDEs were: rationale, tasks/'to-dos', pending results and procedures, discharge planning, patient preferences, need for urgent review, prognosis, and advice/guidance. Some requirements differed between settings; yet, there was overlap between POC CDEs. We recommend an initial list of 11 universal CDEs for continuity in MDRs across settings and 27 CDEs that can be configured to meet setting-specific needs.

  15. An extreme events laboratory to provide network centric collaborative situation assessment and decision making

    NASA Astrophysics Data System (ADS)

    Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.

    2009-05-01

    Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.

  16. NOAA/West Coast and Alaska Tsunami Warning Center Pacific Ocean response criteria

    USGS Publications Warehouse

    Whitmore, P.; Benz, H.; Bolton, M.; Crawford, G.; Dengler, L.; Fryer, G.; Goltz, J.; Hansen, R.; Kryzanowski, K.; Malone, S.; Oppenheimer, D.; Petty, E.; Rogers, G.; Wilson, Jim

    2008-01-01

New West Coast/Alaska Tsunami Warning Center (WCATWC) response criteria for earthquakes occurring in the Pacific basin are presented. Initial warning decisions are based on earthquake location, magnitude, depth, and, depending on magnitude, either distance from the source or precomputed threat estimates generated from tsunami models. The new criteria will help limit the geographical extent of warnings and advisories to threatened regions, and complement the new operational tsunami product suite. Changes to the previous criteria include: adding hypocentral depth dependence, reducing geographical warning extent for the lower magnitude ranges, setting special criteria for areas not well-connected to the open ocean, basing warning extent on pre-computed threat levels versus tsunami travel time for very large events, including the new advisory product, using the advisory product for far-offshore events in the lower magnitude ranges, and specifying distances from the coast for on-shore events which may be tsunamigenic. This report sets a baseline for response criteria used by the WCATWC considering its processing and observational data capabilities as well as its organizational requirements. Criteria are set for tsunamis generated by earthquakes, which are by far the main cause of tsunami generation (either directly through sea floor displacement or indirectly by triggering of slumps). As further research and development provides better tsunami source definition, observational data streams, and improved analysis tools, the criteria will continue to adjust. Future lines of research and development capable of providing operational tsunami warning centers with better tools are discussed.
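The location/magnitude/depth decision ladder described above could be sketched as a simple rule chain. Note that the numeric thresholds and product wording below are HYPOTHETICAL placeholders chosen for illustration only; the actual WCATWC criteria are set out in the report itself.

```python
def response_product(magnitude, depth_km, offshore):
    """Illustrative decision ladder in the spirit of the WCATWC criteria.

    All thresholds here are hypothetical placeholders, not the
    published values.
    """
    if depth_km > 100.0:
        # Deep-focus events rarely displace the sea floor enough
        # to be tsunamigenic.
        return "information statement"
    if not offshore:
        return "information statement"
    if magnitude >= 7.9:
        return "warning (basin-wide, model-based extent)"
    if magnitude >= 7.1:
        return "warning (regional extent)"
    if magnitude >= 6.5:
        return "advisory"
    return "information statement"

print(response_product(8.2, 25.0, offshore=True))
```

The point of the sketch is structural: depth and location act as gates before magnitude selects the product, which mirrors how the new criteria add hypocentral depth dependence and limit warning extent by magnitude range.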

  17. Re-use of Science Operations Systems around Mars: from Mars Express to ExoMars

    NASA Astrophysics Data System (ADS)

    Cardesin-Moinelo, Alejandro; Mars Express Operations Centre; ExoMars Science Operations Centre

    2017-10-01

Mars Express and ExoMars 2016 Trace Gas Orbiter are the only two ESA planetary missions currently in operations, and they happen to be around the same planet! These two missions have great potential for synergies between their science objectives, instruments, and observation capabilities, which can all be combined to improve the scientific outcome and our knowledge about Mars. In this contribution we will give a short summary of both missions, with insight into their similarities and differences regarding their scientific and operational challenges, and we will summarize the lessons learned from Mars Express and how the existing science operations systems, processes, and tools have been reused, redesigned, and adapted in order to satisfy the operational requirements of ExoMars, with limited development resources thanks to the inherited capabilities from previous missions. In particular we will focus on the preparations done by the science operations centers at ESAC and the work within the Science Ground Segments for the re-use of the SPICE and MAPPS software tools, with the necessary modifications and upgrades to perform the geometrical and operational simulations of both spacecraft, taking into account the specific instrument modelling, observation requirements, and all the payload and spacecraft operational rules and constraints for feasibility checks. All of these system upgrades are now being finalized for ExoMars, and some of them have already been rehearsed in orbit, getting ready for the nominal science operations phase starting in the first months of 2018 after the aerobraking phase.

  18. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of the normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new to the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. 
Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes as candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
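The sigma-level capability assessment described above rests on the standard Cp/Cpk indices. A minimal sketch follows, with illustrative specification limits and data (not values from the study):

```python
import statistics

def process_capability(samples, lsl, usl):
    """Standard short-term capability indices:

        Cp  = (USL - LSL) / (6 * sigma)                  potential capability
        Cpk = min(USL - mean, mean - LSL) / (3 * sigma)  actual capability

    A centered, six-sigma-capable process corresponds to Cp = Cpk = 2.0.
    """
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative, well-centered tablet-weight data against 9.0-11.0 limits:
cp, cpk = process_capability([9.8, 9.9, 10.0, 10.1, 10.2], lsl=9.0, usl=11.0)
print(round(cp, 2), round(cpk, 2))  # 2.11 2.11
```

Because Cpk penalizes off-center processes while Cp does not, comparing the two on a control-charted attribute shows at a glance whether improvement effort should go into centering or into variance reduction.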

  19. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models. 
Using the interactive tools on a high-end platform such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE) enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as the walls of a room. Stereo projectors combined with a motion-tracking system recreate the immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education, and in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.

  20. Inclusion of Linearized Moist Physics in Nasa's Goddard Earth Observing System Data Assimilation Tools

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Errico, Ronald; Gelaro, Ronaldo; Kim, Jong G.

    2013-01-01

    Inclusion of moist physics in the linearized version of a weather forecast model is beneficial in terms of variational data assimilation. Further, it improves the capability of important tools, such as adjoint-based observation impacts and sensitivity studies. A linearized version of the relaxed Arakawa-Schubert (RAS) convection scheme has been developed and tested in NASA's Goddard Earth Observing System data assimilation tools. A previous study of the RAS scheme showed it to exhibit reasonable linearity and stability. This motivates the development of a linearization of a near-exact version of the RAS scheme. Linearized large-scale condensation is included through simple conversion of supersaturation into precipitation. The linearization of moist physics is validated against the full nonlinear model for 6- and 24-h intervals, relevant to variational data assimilation and observation impacts, respectively. For a small number of profiles, sudden large growth in the perturbation trajectory is encountered. Efficient filtering of these profiles is achieved by diagnosis of steep gradients in a reduced version of the operator of the tangent linear model. With filtering turned on, the inclusion of linearized moist physics increases the correlation between the nonlinear perturbation trajectory and the linear approximation of the perturbation trajectory. A month-long observation impact experiment is performed and the effect of including moist physics on the impacts is discussed. Impacts from moist-sensitive instruments and channels are increased. The effect of including moist physics is examined for adjoint sensitivity studies. A case study examining an intensifying Northern Hemisphere Atlantic storm is presented. The results show a significant sensitivity with respect to moisture.
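Validating a linearization against the full nonlinear model, as described above, usually takes the form of a standard tangent-linear test: the finite-difference perturbation divided by the linear prediction should approach 1 as the perturbation shrinks. Below is a toy sketch with a hand-written one-variable "model"; it is purely illustrative and not the GEOS operators.

```python
import math

# A one-variable nonlinear "model" standing in for a forecast step, with
# its hand-derived tangent linear.
def nonlinear_model(x):
    return math.sin(x) + 0.1 * x * x

def tangent_linear(x, dx):
    # d/dx of the model above, applied to the perturbation dx.
    return (math.cos(x) + 0.2 * x) * dx

def tlm_test(x, dx):
    """Standard tangent-linear check: the ratio

        [M(x + s*dx) - M(x)] / (s * L_x(dx))

    should approach 1 as the perturbation scaling s shrinks.
    """
    ratios = []
    for k in range(1, 6):
        s = 10.0 ** (-k)
        finite_diff = nonlinear_model(x + s * dx) - nonlinear_model(x)
        ratios.append(finite_diff / (s * tangent_linear(x, dx)))
    return ratios

for r in tlm_test(0.7, 1.0):
    print(r)  # each value closer to 1 than the last
```

The sudden large perturbation growth reported for some moist-physics profiles shows up in exactly this kind of test as ratios that fail to converge to 1, which is what motivates filtering profiles with steep gradients in the reduced tangent-linear operator.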

  1. Developing Valid Measures of Emergency Management Capabilities within US Department of Veterans Affairs Hospitals.

    PubMed

    Dobalian, Aram; Stein, Judith A; Radcliff, Tiffany A; Riopelle, Deborah; Brewster, Pete; Hagigi, Farhad; Der-Martirosian, Claudia

    2016-10-01

Introduction: Hospitals play a critical role in providing health care in the aftermath of disasters and emergencies. Nonetheless, while multiple tools exist to assess hospital disaster preparedness, existing instruments have not been tested adequately for validity. Hypothesis/Problem: This study reports on the development of a preparedness assessment tool for hospitals that are part of the US Department of Veterans Affairs (VA; Washington, DC USA). The authors evaluated hospital preparedness in six "Mission Areas" (MAs: Program Management; Incident Management; Safety and Security; Resiliency and Continuity; Medical Surge; and Support to External Requirements), each composed of various observable hospital preparedness capabilities, among 140 VA Medical Centers (VAMCs). This paper reports on two successive assessments (Phase I and Phase II) to assess the MAs' construct validity, or the degree to which component capabilities relate to one another to represent the associated domain successfully. This report describes a two-stage confirmatory factor analysis (CFA) of candidate items for a comprehensive survey implemented to assess emergency preparedness in a hospital setting. The individual CFAs by MA received acceptable fit statistics with some exceptions. Some individual items did not have adequate factor loadings within their hypothesized factor (or MA) and were dropped from the analyses in order to obtain acceptable fit statistics. The Phase II modified tool was better able to assess the pre-determined MAs. For each MA, except for Resiliency and Continuity (MA 4), the CFA confirmed one latent variable. In Phase I, two sub-scales (seven and nine items in each respective sub-scale) and in Phase II, three sub-scales (eight, four, and eight items in each respective sub-scale) were confirmed for MA 4. The MA 4 capabilities comprise multiple sub-domains, and future assessment protocols should consider re-classifying MA 4 into three distinct MAs. 
The assessments provide a comprehensive and consistent, but flexible, approach for ascertaining health system preparedness. This approach can provide an organization with a clear understanding of areas for improvement and could be adapted into a standard for hospital readiness. Dobalian A , Stein JA , Radcliff TA , Riopelle D , Brewster P , Hagigi F , Der-Martirosian C . Developing valid measures of emergency management capabilities within US Department of Veterans Affairs hospitals. Prehosp Disaster Med. 2016;31(5):475-484.
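    The item-screening step described above (dropping items with inadequate factor loadings) can be sketched numerically. This is a minimal, hypothetical illustration that uses a first-principal-component proxy for the latent factor rather than the full CFA machinery the authors used; the data, the six-item structure, and the 0.4 loading cutoff are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical survey: 6 items driven by one latent factor plus 2 pure-noise items.
n = 500
latent = rng.normal(size=n)
true_loadings = np.array([0.8, 0.7, 0.75, 0.65, 0.7, 0.6])
items = latent[:, None] * true_loadings + rng.normal(scale=0.6, size=(n, 6))
X = np.hstack([items, rng.normal(size=(n, 2))])

# Standardize, take the first principal component as a proxy for the factor,
# and treat each item's correlation with it as an estimated loading.
Z = (X - X.mean(0)) / X.std(0)
_, _, vt = np.linalg.svd(Z, full_matrices=False)
factor = Z @ vt[0]
est_loadings = np.array([np.corrcoef(Z[:, j], factor)[0, 1] for j in range(8)])
est_loadings *= np.sign(est_loadings[0])  # PCA sign is arbitrary; fix it

keep = np.abs(est_loadings) >= 0.4        # drop weakly loading items
print(est_loadings.round(2))
print(keep)
```

    The two noise items fall below the cutoff and would be dropped, mirroring the paper's procedure of removing items that fail to load on their hypothesized MA.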

  2. The Global Modeling and Assimilation Office (GMAO) 4d-Var and its Adjoint-based Tools

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Tremolet, Yannick

    2008-01-01

    The fifth generation of the Goddard Earth Observing System (GEOS-5) Data Assimilation System (DAS) is a 3d-var system that uses the Grid-point Statistical Interpolation (GSI) system developed in collaboration with NCEP, and a general circulation model developed at Goddard, that includes the finite-volume hydrodynamics of GEOS-4 wrapped in the Earth System Modeling Framework and physical packages tuned to provide a reliable hydrological cycle for the integration of the Modern Era Retrospective-analysis for Research and Applications (MERRA). This MERRA system is essentially complete and the next generation GEOS is under intense development. A prototype next generation system is now complete and has been producing preliminary results. This prototype system replaces the GSI-based Incremental Analysis Update procedure with a GSI-based 4d-var which uses the adjoint of the finite-volume hydrodynamics of GEOS-4 together with a vertical diffusion scheme as its simplified physics. As part of this development we have kept the GEOS-5 IAU procedure as an option and have added the capability to experiment with a First Guess at the Appropriate Time (FGAT) procedure, thus allowing for at least three modes of running the data assimilation experiments. The prototype system is a large extension of GEOS-5 as it also includes various adjoint-based tools, namely, a forecast sensitivity tool, a singular vector tool, and an observation impact tool that combines the model sensitivity tool with a GSI-based adjoint tool. These features bring the global data assimilation effort at Goddard up to date with technologies used in data assimilation systems at major meteorological centers elsewhere. Various aspects of the next generation GEOS will be discussed during the presentation at the Workshop, and preliminary results will illustrate the discussion.
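    The adjoint-based machinery at the heart of a 4d-var system can be illustrated with a toy scalar model. This sketch is emphatically not the GEOS system; it only shows how the adjoint of a (here trivial) tangent-linear model propagates observation misfits back to the initial time to form the gradient of the 4d-var cost function. All numbers are invented:

```python
import numpy as np

# Toy 4d-var for a scalar linear model x_{k+1} = m * x_k.
# Cost: J(x0) = (x0 - xb)^2 / (2B) + sum_k (m^k x0 - y_k)^2 / (2R)
m, B, R = 0.9, 1.0, 0.05
x_true = 2.0
obs_times = [1, 2, 3, 4]
rng = np.random.default_rng(1)
y = {k: (m ** k) * x_true + rng.normal(scale=0.05) for k in obs_times}
xb = 1.0  # background (first guess)

def grad_J(x0):
    # Background term plus adjoint-propagated observation misfits: the adjoint
    # of the tangent-linear model m^k is just m^k in this scalar case.
    g = (x0 - xb) / B
    for k in obs_times:
        g += (m ** k) * ((m ** k) * x0 - y[k]) / R
    return g

x0 = xb
for _ in range(200):      # plain gradient descent on J
    x0 -= 0.01 * grad_J(x0)

print(round(x0, 3))       # analysis pulled from the background toward the truth
```

    The same structure, with the model adjoint replacing the factor `m ** k`, is what an adjoint-based forecast sensitivity or observation impact tool exploits.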

  3. MAPGEN: Mixed-Initiative Activity Planning for the Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Ai-Chang, Mitchell; Bresina, John; Hsu, Jennifer; Jonsson, Ari; Kanefsky, Bob; McCurdy, Michael; Morris, Paul; Rajan, Kanna; Vera, Alonso; Yglesias, Jeffrey

    2004-01-01

    This document describes the Mixed-initiative Activity Plan Generation system, MAPGEN. This system is one of the critical tools in Mars Exploration Rover mission surface operations, where it is used to build activity plans for each of the rovers, each Martian day. The MAPGEN system combines an existing tool for activity plan editing and resource modeling with an advanced constraint-based reasoning and planning framework. The constraint-based planning component provides active constraint and rule enforcement, automated planning capabilities, and a variety of tools and functions that are useful for building activity plans in an interactive fashion. In this demonstration, we will show the capabilities of the system and demonstrate how the system has been used in actual Mars rover operations. In contrast to the demonstration given at ICAPS 03, significant improvements have been made to the system. These include various additional capabilities that are based on automated reasoning and planning techniques, as well as a new Constraint Editor (CE) support tool, used as part of the process for generating command loads. The MAPGEN tool provides engineers and scientists with an intelligent activity planning capability that allows them to more effectively generate complex plans that maximize the science return each day. The key to the effectiveness of the MAPGEN tool is its underlying constraint-based planning and reasoning engine.
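    Constraint-based planners of this kind typically maintain temporal constraints between activity events. As a hedged illustration (not MAPGEN's actual engine), a simple temporal network over hypothetical rover activities can be checked for consistency by running Floyd-Warshall on the distance graph and looking for a negative cycle:

```python
import itertools

INF = float("inf")

def stn_consistent(n, edges):
    """Simple temporal network: edge (u, v, w) encodes t_v - t_u <= w.
    The network is consistent iff the distance graph has no negative cycle."""
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v, w in edges:
        d[u][v] = min(d[u][v], w)
    for k, i, j in itertools.product(range(n), repeat=3):  # Floyd-Warshall
        if d[i][k] + d[k][j] < d[i][j]:
            d[i][j] = d[i][k] + d[k][j]
    return all(d[i][i] >= 0 for i in range(n))

# Events: 0 = drive start, 1 = drive end, 2 = imaging start (hypothetical).
edges = [
    (0, 1, 30), (1, 0, -10),   # drive takes between 10 and 30 minutes
    (1, 2, 5),  (2, 1, 0),     # imaging starts 0-5 minutes after the drive
]
print(stn_consistent(3, edges))                   # feasible plan
print(stn_consistent(3, edges + [(2, 0, -60)]))   # "imaging >= 60 min after start" conflicts
```

    Active constraint enforcement amounts to rerunning such a check (incrementally, in practice) every time the user edits the plan.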

  4. Ultimate intra-wafer critical dimension uniformity control by using lithography and etch tool corrections

    NASA Astrophysics Data System (ADS)

    Kubis, Michael; Wise, Rich; Reijnen, Liesbeth; Viatkina, Katja; Jaenen, Patrick; Luca, Melisa; Mernier, Guillaume; Chahine, Charlotte; Hellin, David; Kam, Benjamin; Sobieski, Daniel; Vertommen, Johan; Mulkens, Jan; Dusa, Mircea; Dixit, Girish; Shamma, Nader; Leray, Philippe

    2016-03-01

    With shrinking design rules, the overall patterning requirements are getting aggressively tighter. For the 7-nm node and below, allowable CD uniformity variations are entering the Angstrom region (ref [1]). Optimizing inter- and intra-field CD uniformity of the final pattern requires a holistic tuning of all process steps. In previous work, CD control with either litho cluster or etch tool corrections has been discussed. Today, we present a holistic CD control approach, combining the correction capability of the etch tool with the correction capability of the exposure tool. The study is done on 10-nm logic node wafers, processed with a test vehicle stack patterning sequence. We include wafer-to-wafer and lot-to-lot variation and apply optical scatterometry to characterize the fingerprints. Making use of all available correction capabilities (lithography and etch), we investigated single application of exposure tool corrections and of etch tool corrections as well as combinations of both to reach the lowest CD uniformity. Results of the final pattern uniformity based on single and combined corrections are shown. We conclude on the application of this holistic lithography and etch optimization to 7nm High-Volume manufacturing, paving the way to ultimate within-wafer CD uniformity control.
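    Combining the correction capabilities of two tools can be viewed as one joint least-squares fit of the measured CD fingerprint onto both tools' actuator bases. The sketch below uses an invented radial fingerprint and invented correction terms (an offset-plus-linear "litho" knob and a quadratic "etch" knob), not the actual scanner or etch-chamber actuators:

```python
import numpy as np

# Radial positions across a wafer and a measured CD fingerprint (hypothetical).
r = np.linspace(0, 1, 50)
fingerprint = 0.6 * r**2 - 0.3 * r + 0.2 + 0.02 * np.sin(9 * r)

# Correction "knobs": suppose the exposure tool can apply offset and linear-in-r
# corrections, and the etch tool a quadratic radial term.
litho_basis = np.column_stack([np.ones_like(r), r])
etch_basis = np.column_stack([r**2])
A = np.hstack([litho_basis, etch_basis])

# One joint fit splits the fingerprint across both tools' capabilities.
coef, *_ = np.linalg.lstsq(A, fingerprint, rcond=None)
residual = fingerprint - A @ coef

print("3-sigma before:", round(3 * fingerprint.std(), 3))
print("3-sigma after :", round(3 * residual.std(), 3))
```

    The residual, what neither tool can actuate, is the "final pattern uniformity based on combined corrections" in the language of the abstract.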

  5. The Heliophysics Integrated Observatory

    NASA Astrophysics Data System (ADS)

    Csillaghy, A.; Bentley, R. D.

    2009-12-01

    HELIO is a new Europe-wide, FP7-funded distributed network of services that will address the needs of a broad community of researchers in heliophysics. This new research field explores the “Sun-Solar System Connection” and requires the joint exploitation of solar, heliospheric, magnetospheric and ionospheric observations. HELIO will provide the most comprehensive integrated information system in this domain; it will coordinate access to the distributed resources needed by the community, and will provide access to services to mine and analyse the data. HELIO will be designed as a Service-oriented Architecture. The initial infrastructure will include services based on metadata and data servers deployed by the European Grid of Solar Observations (EGSO). We will extend these to address observations from all the disciplines of heliophysics; differences in the way the domains describe and handle the data will be resolved using semantic mapping techniques. Processing and storage services will allow the user to explore the data and create the products that meet stringent standards of interoperability. These capabilities will be orchestrated with the data and metadata services using the Taverna workflow tool. HELIO will address the challenges along the FP7 I3 activities model: (1) Networking: we will cooperate closely with the community to define new standards for heliophysics and the required capabilities of the HELIO system. (2) Services: we will integrate the services developed by the project and other groups to produce an infrastructure that can easily be extended to satisfy the growing and changing needs of the community. (3) Joint Research: we will develop search tools that span disciplinary boundaries and explore new types of user-friendly interfaces. HELIO will be a key component of a worldwide effort to integrate heliophysics data and will coordinate closely with international organizations to exploit synergies with complementary domains.

  6. Using Petri Net Tools to Study Properties and Dynamics of Biological Systems

    PubMed Central

    Peleg, Mor; Rubin, Daniel; Altman, Russ B.

    2005-01-01

    Petri Nets (PNs) and their extensions are promising methods for modeling and simulating biological systems. We surveyed PN formalisms and tools and compared them based on their mathematical capabilities as well as by their appropriateness to represent typical biological processes. We measured the ability of these tools to model specific features of biological systems and answer a set of biological questions that we defined. We found that different tools are required to provide all capabilities that we assessed. We created software to translate a generic PN model into most of the formalisms and tools discussed. We have also made available three models and suggest that a library of such models would catalyze progress in qualitative modeling via PNs. Development and wide adoption of common formats would enable researchers to share models and use different tools to analyze them without the need to convert to proprietary formats. PMID:15561791
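    The core firing semantics that the surveyed PN tools share can be stated in a few lines: a transition is enabled when its input places hold enough tokens, and firing moves tokens from input to output places. A minimal sketch, using an illustrative enzymatic-reaction net (E + S → ES → E + P):

```python
# Minimal Petri net: places carry token counts; transitions have input/output arcs.
def enabled(marking, transition):
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    assert enabled(marking, transition)
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Toy enzymatic reaction as a Petri net (illustrative, not from the survey).
bind    = {"in": {"E": 1, "S": 1}, "out": {"ES": 1}}
convert = {"in": {"ES": 1},        "out": {"E": 1, "P": 1}}

m = {"E": 1, "S": 2, "ES": 0, "P": 0}
while enabled(m, bind) or enabled(m, convert):
    m = fire(m, bind) if enabled(m, bind) else fire(m, convert)

print(m)   # all substrate converted to product; the enzyme is conserved
```

    Qualitative questions of the kind the authors assessed (reachability, invariants, deadlock) are all posed over exactly this marking-and-firing structure.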

  7. NEO follow-up, recovery and precovery campaigns at the ESA NEO Coordination Centre

    NASA Astrophysics Data System (ADS)

    Micheli, Marco; Koschny, Detlef; Drolshagen, Gerhard; Perozzi, Ettore; Borgia, Barbara

    2016-01-01

    The NEO Coordination Centre (NEOCC) has been established within the framework of the ESA Space Situational Awareness (SSA) Programme. Among its tasks are the coordination of observational activities and the distribution of up-to-date information on NEOs through its web portal. The Centre is directly involved in observational campaigns with various telescopes, including ESO's VLT and ESA's OGS telescope. We are also developing a network of collaborating observatories, with a variety of capabilities, which are alerted when an important observational opportunity arises. From a service perspective, the system hosted at the NEOCC collects information on NEOs produced by European services and makes it available to users, with a focus on objects with possible collisions with the Earth. Among the tools provided via our portal are the Risk List of all known NEOs with impact solutions, and the Priority List, which allows observers to identify NEOs in most urgent need of observations.

  8. Design and Characterization of an Exoskeleton for Perturbing the Knee During Gait.

    PubMed

    Tucker, Michael R; Shirota, Camila; Lambercy, Olivier; Sulzer, James S; Gassert, Roger

    2017-10-01

    An improved understanding of mechanical impedance modulation in human joints would provide insights about the neuromechanics underlying functional movements. Experimental estimation of impedance requires specialized tools with highly reproducible perturbation dynamics and reliable measurement capabilities. This paper presents the design and mechanical characterization of the ETH Knee Perturbator: an actuated exoskeleton for perturbing the knee during gait. A novel wearable perturbation device was developed based on specific experimental objectives. Bench-top tests validated the device's torque limiting capability and characterized the time delays of the on-board clutch. Further tests demonstrated the device's ability to perform system identification on passive loads with static initial conditions. Finally, the ability of the device to consistently perturb human gait was evaluated through a pilot study on three unimpaired subjects. The ETH Knee Perturbator is capable of identifying mass-spring systems within 15% accuracy, accounting for over 95% of the variance in the observed torque in 10 out of 16 cases. Five-degree extension and flexion perturbations were executed on human subjects with an onset timing precision of 2.52% of swing phase duration and a rise time of 36.5 ms. The ETH Knee Perturbator can deliver safe, precisely timed, and controlled perturbations, which is a prerequisite for the estimation of knee joint impedance during gait. Tools such as this can enhance models of neuromuscular control, which may improve rehabilitative outcomes following impairments affecting gait and advance the design and control of assistive devices.
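    The mass-spring identification reported above can be sketched as a least-squares fit of I·θ̈ + k·θ = τ to sampled torque and angle data. All parameter values below are invented for illustration and are not the ETH Knee Perturbator's; the point is only the structure of the estimation:

```python
import numpy as np

# Simulate a torque-driven inertia-spring system: I*theta'' + k*theta = tau
I_true, k_true, dt = 0.05, 8.0, 0.001
t = np.arange(0, 2, dt)
tau = 0.5 * np.sin(2 * np.pi * t)          # applied perturbation torque

theta = np.zeros_like(t); omega = np.zeros_like(t)
for i in range(len(t) - 1):                 # semi-implicit Euler integration
    alpha = (tau[i] - k_true * theta[i]) / I_true
    omega[i + 1] = omega[i] + alpha * dt
    theta[i + 1] = theta[i] + omega[i + 1] * dt

# Identify I and k by least squares on tau = I*theta'' + k*theta
acc = np.gradient(np.gradient(theta, dt), dt)
A = np.column_stack([acc, theta])
(I_est, k_est), *_ = np.linalg.lstsq(A, tau, rcond=None)

print(round(I_est, 3), round(k_est, 2))
```

    On real hardware the same fit is degraded by sensor noise and unmodeled dynamics, which is why the reported 15% accuracy bound is a meaningful benchmark rather than a trivial one.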

  9. Developing the Systems Engineering Experience Accelerator (SEEA) Prototype and Roadmap

    DTIC Science & Technology

    2013-12-31

    information to be automatically presented without comment. … 2.2.2 NEW FEATURES AND CAPABILITIES: A number of new multiplayer capabilities were … 2.4.1 OVERVIEW: The EA game engine has two components: the runtime engine and the tools suite. The tools suite includes the Experience Development … the Learner. Figure 6: Experience Accelerator Logical Block Diagram. The EARTE is a multiuser architecture for internet gaming. It has light…

  10. Developing an operational capabilities index of the emergency services sector.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, M.J.; Eaton, L.K.; Shoemaker, Z.M.

    2012-02-20

    In order to enhance the resilience of the Nation and its ability to protect itself in the face of natural and human-caused hazards, the ability of the critical infrastructure (CI) system to withstand specific threats and return to normal operations after degradation must be determined. To fully analyze the resilience of a region and the CI that resides within it, both the actual resilience of the individual CI and the capability of the Emergency Services Sector (ESS) to protect against and respond to potential hazards need to be considered. Thus, a regional resilience approach requires the comprehensive consideration of all parts of the CI system as well as the characterization of emergency services. This characterization must generate reproducible results that can support decision making with regard to risk management, disaster response, business continuity, and community planning and management. To address these issues, Argonne National Laboratory, in collaboration with the U.S. Department of Homeland Security (DHS) Sector Specific Agency - Executive Management Office, developed a comprehensive methodology to create an Emergency Services Sector Capabilities Index (ESSCI). The ESSCI is a performance metric that ranges from 0 (low level of capabilities) to 100 (high). A high ESSCI, however, does not mean that a specific event could not affect a region or cause severe consequences, and a low ESSCI does not mean that a disruptive event would automatically lead to serious consequences. Moreover, a score of 100 on the ESSCI is not the level of capability expected of emergency services programs; rather, it represents an optimal program that would rarely be observed. The ESSCI characterizes the state of preparedness of a jurisdiction in terms of emergency and risk management.
    Perhaps the index's primary benefit is that it can systematically capture, at a given point in time, the capabilities of a jurisdiction to protect itself from, mitigate, respond to, and recover from a potential incident. On the basis of this metric, an interactive tool - the ESSCI Dashboard - can identify scenarios for enhancement that can be implemented, and it can identify the repercussions of these scenarios on the jurisdiction. It can assess the capabilities of law enforcement, fire fighting, search and rescue, emergency medical services, hazardous materials response, dispatch/911, and emergency management services in a given jurisdiction, and it can help guide those who need to prioritize what limited resources should be used to improve these capabilities. Furthermore, this tool can be used to compare the level of capabilities of various jurisdictions that have similar socioeconomic characteristics. It can thus help DHS define how it can support risk reduction and community preparedness at a national level. This tool aligns directly with Presidential Policy Directive 8 by giving a jurisdiction a metric of its ESS's capabilities and by promoting an interactive approach for defining options to improve preparedness and to effectively respond to a disruptive event. It can be used in combination with other CI performance metrics developed at Argonne National Laboratory, such as the vulnerability index and the resilience index for assessing regional resilience.
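    An index of this form is, at heart, a weighted aggregation of component capability scores onto a 0-100 scale. The weights and scores below are hypothetical and are not the actual ESSCI weighting scheme; they only illustrate the mechanics:

```python
# Illustrative capability index: weighted mean of component scores, scaled 0-100.
# Weights and scores are hypothetical, not the actual ESSCI methodology.
components = {
    "law_enforcement":    (0.20, 0.80),   # (weight, assessed score in [0, 1])
    "fire_fighting":      (0.20, 0.90),
    "search_and_rescue":  (0.10, 0.60),
    "emergency_medical":  (0.20, 0.70),
    "hazmat_response":    (0.10, 0.50),
    "dispatch_911":       (0.10, 0.95),
    "emergency_mgmt":     (0.10, 0.75),
}

total_w = sum(w for w, _ in components.values())
essci = 100 * sum(w * s for w, s in components.values()) / total_w
print(round(essci, 1))
```

    A dashboard built on such a metric can then answer "what-if" questions by perturbing one component score and recomputing the index.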

  11. Advanced imaging microscope tools applied to microgravity research investigations

    NASA Astrophysics Data System (ADS)

    Peterson, L.; Samson, J.; Conrad, D.; Clark, K.

    1998-01-01

    The inability to observe and interact with experiments on orbit has been an impediment for both basic research and commercial ventures using the shuttle. In order to open the frontiers of space, the Center for Microgravity Automation Technology has developed a unique and innovative system for conducting experiments at a distance, the ``Remote Scientist.'' The Remote Scientist extends laboratory automation capability to the microgravity environment. While the Remote Scientist conceptually encompasses a broad spectrum of elements and functionalities, the development approach taken is to:
    • establish a baseline capability that is both flexible and versatile
    • incrementally augment the baseline with additional functions over time.
    Since last year, the application of the Remote Scientist has changed from protein crystal growth to tissue culture, specifically, the development of skeletal muscle under varying levels of tension. This system includes a series of bioreactor chambers that allow for three-dimensional growth of muscle tissue on a membrane suspended between the two ends of a programmable force transducer that can provide automated or investigator-initiated tension on the developing tissue. A microscope objective mounted on a translation carriage allows for high-resolution microscopy along a large area of the tissue. These images will be mosaiced on orbit to detect features and structures that span multiple images. The use of fluorescence and pseudo-confocal microscopy will maximize the observational capabilities of this system. A series of ground-based experiments have been performed to validate the bioreactor, the force transducer, the translation carriage and the image acquisition capabilities of the Remote Scientist.
    • The bioreactor is capable of sustaining three-dimensional tissue culture growth over time.
    • The force transducer can be programmed to provide static tension on cells or to simulate either slow or fast growth of underlying tissues in vivo, ranging from 0.2 mm per day to 32 mm per day.
    • The two-axis translation carriage is capable of scanning the camera along the bioreactor and adjusting the focus with 25 μm resolution.
    • Time-lapse sequences of images have been acquired, stored and transmitted to a remote computer system.
    Although the current application of the Remote Scientist technology is the observation and manipulation of a tissue culture growth system, the hardware has been designed to be easily reconfigured to accommodate a multitude of experiments, including animal observation, combustion studies, protein crystal growth, plant growth and aquatic research.

  12. Jovian System as a Demonstration of JWST’s Capabilities for Solar System Science: Status Update

    NASA Astrophysics Data System (ADS)

    Conrad, Al; Fouchet, Thierry

    2018-06-01

    Our science goals are to: characterize Jupiter’s cloud layers, winds, composition, auroral activity, and temperature structure; produce maps of the atmosphere and surface of volcanically-active Io and the icy satellite Ganymede to constrain their thermal and atmospheric structure, and search for plumes; and characterize the ring structure, and its sources, sinks and evolution. We will present our progress to date in planning these observations and provide an update on our expectations. Our program will utilize all JWST instruments in different observing modes to demonstrate the capabilities of JWST’s instruments on one of the largest and brightest sources in the Solar System and on very faint targets next to it. We will also observe weak emission/absorption bands on strong continua, and with NIRISS/AMI we will maximize the Strehl ratio on unresolved features, such as Io’s volcanoes. We will deliver a number of science enabling products that will facilitate community science, including, e.g.: i) characterizing Jupiter’s scattered light in the context of scientific observations, ii) resolving point sources with AMI in a crowded field (Io’s volcanoes), and comparing this to classical observations, iii) developing tools to mosaic/visualize spectral datacubes using MIRI and NIRSpec on Jupiter. Finally, our program will also set a first temporal benchmark to study time variations in the jovian system and any interconnectivity (e.g., through its magnetic field) during JWST’s lifetime.

  13. Physics Mining of Multi-Source Data Sets

    NASA Technical Reports Server (NTRS)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as Artificial Neural Nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool are extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
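    The contrast drawn above, an analytic parametric model rather than a black box, can be illustrated with a least-squares fit over a small library of candidate terms. This is a simplified stand-in for MineTool's actual algorithms, run on invented data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical observational data generated by y = 2*x1 - 0.5*x2^2 + noise.
x1 = rng.uniform(-1, 1, 300)
x2 = rng.uniform(-1, 1, 300)
y = 2.0 * x1 - 0.5 * x2**2 + rng.normal(scale=0.01, size=300)

# Fit an analytic model over a small library of candidate terms.
terms = {"1": np.ones_like(x1), "x1": x1, "x2": x2, "x1^2": x1**2, "x2^2": x2**2}
A = np.column_stack(list(terms.values()))
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Keep only terms with non-negligible coefficients: an explicit equation,
# not a black box, comes out the other end.
model = " + ".join(f"{c:.2f}*{name}" for name, c in zip(terms, coef) if abs(c) > 0.05)
print("y =", model)
```

    The recovered equation exposes which inputs matter and how, which is exactly the interpretability argument the abstract makes against black-box learners.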

  14. A Satellite Data Analysis and CubeSat Instrument Simulator Tool for Simultaneous Multi-spacecraft Measurements of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Vannitsen, Jordan; Rizzitelli, Federico; Wang, Kaiti; Segret, Boris; Juang, Jyh-Ching; Miau, Jiun-Jih

    2017-12-01

    This paper presents a Multi-satellite Data Analysis and Simulator Tool (MDAST), developed originally to support the science requirements of a Martian 3-Unit CubeSat mission profile named Bleeping Interplanetary Radiation Determination Yo-yo (BIRDY). MDAST was first designed and tested by taking into account the positions, attitudes, instrument fields of view, and energetic particle flux measurements from four spacecraft (ACE, MSL, STEREO A, and STEREO B). Second, the simulated positions, attitudes, and instrument field of view of the BIRDY CubeSat were adapted for input. Finally, the tool can be used for data analysis of the measurements from the four spacecraft mentioned above so as to simulate the instrument trajectory and observation capabilities of the BIRDY CubeSat. The onset, peak, and end time of a solar particle event can be specifically defined and identified with this tool. It is useful not only for the BIRDY mission but also for analyzing data from the four satellites aforementioned, and it can be utilized for other space weather missions with further customization.
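    Onset/peak/end identification of a particle event can be sketched as a simple threshold rule on a flux time series. This is an illustrative scheme with an invented flux profile, not MDAST's actual event definition:

```python
# Threshold-based onset/peak/end detection for a particle-flux time series.
def detect_event(flux, background, factor=3.0):
    """Onset: first sample above factor*background; end: first later sample
    back at or below background; peak: maximum flux in between."""
    threshold = factor * background
    onset = next((i for i, f in enumerate(flux) if f > threshold), None)
    if onset is None:
        return None
    end = next((i for i in range(onset, len(flux)) if flux[i] <= background),
               len(flux) - 1)
    peak = max(range(onset, end + 1), key=lambda i: flux[i])
    return onset, peak, end

flux = [1, 1, 2, 8, 30, 55, 40, 12, 4, 1, 1]   # hypothetical proton flux samples
print(detect_event(flux, background=1.0))       # (onset, peak, end) indices
```

    Running the same rule against flux seen by several spacecraft at different heliocentric positions is what allows event timing to be compared across a multi-spacecraft data set.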

  15. Development of an Irrigation Scheduling Tool for the High Plains Region

    NASA Astrophysics Data System (ADS)

    Shulski, M.; Hubbard, K. G.; You, J.

    2009-12-01

    The High Plains Regional Climate Center (HPRCC) at the University of Nebraska is one of NOAA’s six regional climate centers in the U.S. Primary objectives of the HPRCC are to conduct applied climate research, engage in climate education and outreach, and increase the use and availability of climate information by developing value-added products. Scientists at the center are engaged in utilizing regional weather data to develop tools that can be used directly by area stakeholders, particularly for agricultural sectors. A new study is proposed that will combine NOAA products (short-term forecasts and seasonal outlooks of temperature and precipitation) with existing capabilities to construct an irrigation scheduling tool that can be used by producers in the region. This tool will make use of weather observations from the regional mesonet (specifically the AWDN, Automated Weather Data Network) and the nation-wide relational database and web portal (ACIS, Applied Climate Information System). The primary benefit to stakeholders will be a more efficient use of water and energy resources owing to the reduction of uncertainty in the timing of irrigation.
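    An irrigation scheduler of this kind typically rests on a daily soil-water balance: deplete the root zone by crop evapotranspiration, refill it with rain (or forecast rain), and irrigate when depletion crosses a management threshold. A sketch with illustrative parameter values, not the HPRCC tool's:

```python
# Daily soil-water-balance irrigation scheduler (illustrative parameter values).
def schedule(et_crop, rain, taw=120.0, mad=0.45, irrigation_amount=25.0):
    """Track root-zone depletion (mm); irrigate when depletion exceeds the
    management-allowed fraction (mad) of total available water (taw)."""
    depletion, events = 0.0, []
    for day, (et, p) in enumerate(zip(et_crop, rain)):
        depletion = min(max(depletion + et - p, 0.0), taw)
        if depletion > mad * taw:
            events.append(day)
            depletion -= irrigation_amount
    return events

et = [6.0] * 20          # mm/day crop evapotranspiration (hypothetical)
rain = [0.0] * 20
rain[8] = 15.0           # a single rain event delays the first irrigation
print(schedule(et, rain))
```

    Feeding short-term precipitation forecasts into `rain` instead of observations is where the NOAA forecast products mentioned above would enter such a scheme, reducing unnecessary irrigations ahead of rain.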

  16. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We present and make available to the community SOAP (Spot Oscillation And Planet), a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests against previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing for future extensions of its capabilities to address the next challenges in the exoplanetary field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap
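    The basic effect such a tool models can be caricatured in a few lines: a dark spot removes flux from a rotating surface element, biasing the line centroid away from that element's projected rotational velocity. This is a heavily simplified single-spot model with no limb darkening, not SOAP's actual computation, and all parameter values are illustrative:

```python
import numpy as np

def spot_rv(phase, f=0.01, vsini=5.0, lat=0.3, inc=np.pi / 2):
    """Crude RV perturbation (km/s) from a single dark spot of filling
    factor f: the spot removes flux moving at the local rotational
    velocity, shifting the disk-averaged line centroid the other way."""
    lon = 2 * np.pi * phase
    # mu: cosine of the angle between the spot's surface normal and the line
    # of sight (observer at inclination inc to the rotation axis).
    mu = np.cos(lat) * np.cos(lon) * np.sin(inc) + np.sin(lat) * np.cos(inc)
    v_los = -vsini * np.cos(lat) * np.sin(lon)   # element's radial velocity
    return np.where(mu > 0, -f * mu * v_los, 0.0)  # zero when spot is hidden

phases = np.linspace(0, 1, 400)
rv = spot_rv(phases)
print(round(float(np.ptp(rv)), 4))   # peak-to-peak perturbation in km/s
```

    Even this caricature reproduces the qualitative signature SOAP quantifies: an antisymmetric RV excursion over the rotation phase whose amplitude scales with the filling factor and v sin i.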

  17. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    EPA Science Inventory

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  18. Stand-Alone Measurements and Characterization | Photovoltaic Research |

    Science.gov Websites

    The Science and Technology Facility cluster tools offer powerful capabilities for measuring and characterizing samples. The Characterization tool suite is supplemented by the Integrated Measurements and Characterization (M&C) cluster tool; samples can be moved to the Integrated M&C cluster tool using a mobile transport pod, which can keep samples under vacuum.

  19. Pendulum Phenomena and the Assessment of Scientific Inquiry Capabilities

    ERIC Educational Resources Information Center

    Zachos, Paul

    2004-01-01

    Phenomena associated with the "pendulum" present numerous opportunities for assessing higher order human capabilities related to "scientific inquiry" and the "discovery" of natural law. This paper illustrates how systematic "assessment of scientific inquiry capabilities", using "pendulum" phenomena, can provide a useful tool for classroom teachers…

  20. CDPP supporting tools to Solar Orbiter and Parker Solar Probe data exploitation

    NASA Astrophysics Data System (ADS)

    Genot, V. N.; Cecconi, B.; Dufourg, N.; Gangloff, M.; André, N.; Bouchemit, M.; Jacquey, C.; Pitout, F.; Rouillard, A.; Nathanaël, J.; Lavraud, B.; Durand, J.; Tao, C.; Buchlin, E.; Witasse, O. G.

    2017-12-01

    In recent years the French Centre de Données de la Physique des Plasmas (CDPP) has extended its data analysis capability by designing a number of new tools. In the solar and heliospheric contexts, and in direct support to the forthcoming solar ESA and NASA missions in these fields, these tools include the Propagation Tool, which helps link solar perturbations observed both in remote and in-situ data; this is achieved through direct connection to the companion solar database MEDOC and the CDPP AMDA database. More recently, in the frame of Europlanet 2020 RI, a 1D MHD solar wind propagation code (Tao et al., 2005) has been interfaced to provide real time solar wind monitors at cruising probes and planetary environments using ACE real time data as inputs (Heliopropa service). Finally, simulations, models and data may be combined and visualized in a 3D context with 3DView. This presentation will overview the various functionalities of these tools and provide examples, in particular a 'CME tracking' case recently published (Witasse et al., 2017). Europlanet 2020 RI has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 654208.
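    The simplest estimate underlying tools that link solar events to in-situ signatures is ballistic propagation at a constant solar-wind speed; the Propagation Tool and the 1D MHD Heliopropa service use more complete physics than this sketch:

```python
# Ballistic arrival-time estimate for a solar-wind structure (simplified;
# real propagation tools account for acceleration, stream interaction, etc.).
AU_KM = 1.496e8   # one astronomical unit in kilometres

def arrival_delay_hours(distance_au, speed_km_s):
    """Travel time from the Sun for plasma at constant radial speed."""
    return distance_au * AU_KM / speed_km_s / 3600.0

# A 450 km/s stream reaching Earth (1 AU) and, illustratively, Mars (1.52 AU).
print(round(arrival_delay_hours(1.0, 450.0), 1))    # roughly 4 days to 1 AU
print(round(arrival_delay_hours(1.52, 450.0), 1))
```

    Comparing such back-of-the-envelope delays with the full MHD propagation is a quick sanity check when connecting a remote solar observation to an in-situ signature at a cruising probe.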

  1. Optimization of the MINERVA Exoplanet Search Strategy via Simulations

    NASA Astrophysics Data System (ADS)

    Nava, Chantell; Johnson, Samson; McCrady, Nate; Minerva

    2015-01-01

    Detection of low-mass exoplanets requires high spectroscopic precision and high observational cadence. MINERVA is a dedicated observatory capable of sub-meter-per-second radial velocity precision. As a dedicated observatory, MINERVA can observe with the every-clear-night cadence that is essential for low-mass exoplanet detection. However, this cadence complicates the determination of an optimal observing strategy. We simulate MINERVA observations to optimize our observing strategy and maximize exoplanet detections. A dispatch scheduling algorithm provides observations of MINERVA targets every day over a three-year observing campaign. An exoplanet population with a distribution informed by Kepler statistics is assigned to the targets, and radial velocity curves induced by the planets are constructed. We apply a correlated noise model that realistically simulates stellar astrophysical noise sources. The simulated radial velocity data are fed to the MINERVA planet detection code and the expected exoplanet yield is calculated. The full simulation provides a tool to test different strategies for scheduling observations of our targets and optimizing the MINERVA exoplanet search strategy.
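    The injection-and-recovery loop described above can be sketched at toy scale: inject a circular-orbit signal into noisy nightly RVs with weather gaps, then recover the semi-amplitude by least squares at the known period. All values are illustrative (white noise instead of the correlated stellar noise model, and far simpler than the MINERVA detection code):

```python
import numpy as np

rng = np.random.default_rng(3)

# Inject a circular-orbit RV signal into noisy nightly observations.
P, K = 12.0, 3.0                      # period (days), semi-amplitude (m/s)
t = np.arange(0, 365.0)               # one observation per night for a year
t = t[rng.random(t.size) < 0.7]       # ~30% of nights lost to weather
rv = K * np.sin(2 * np.pi * t / P) + rng.normal(scale=2.0, size=t.size)

# Recover: model rv = a*sin(wt) + b*cos(wt), so K_est = sqrt(a^2 + b^2).
w = 2 * np.pi / P
A = np.column_stack([np.sin(w * t), np.cos(w * t)])
(a, b), *_ = np.linalg.lstsq(A, rv, rcond=None)
K_est = np.hypot(a, b)
sigma_K = 2.0 * np.sqrt(2.0 / t.size)  # rough amplitude uncertainty

print(round(K_est, 2), "detected:", bool(K_est / sigma_K > 5))
```

    Repeating this over a synthetic planet population and candidate schedules, and counting how many injected signals clear the detection threshold, is the yield calculation the simulation uses to compare observing strategies.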

  2. Integration of the SSPM and STAGE with the MPACT Virtual Facility Distributed Test Bed.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipiti, Benjamin B.; Shoman, Nathan

    The Material Protection Accounting and Control Technologies (MPACT) program within DOE NE is working toward a 2020 milestone to demonstrate a Virtual Facility Distributed Test Bed. The goal of the Virtual Test Bed is to link all MPACT modeling tools, technology development, and experimental work to create a Safeguards and Security by Design capability for fuel cycle facilities. The Separation and Safeguards Performance Model (SSPM) forms the core safeguards analysis tool, and the Scenario Toolkit and Generation Environment (STAGE) code forms the core physical security tool. These models are used to design and analyze safeguards and security systems and generate performance metrics. Work over the past year has focused on how these models will integrate with the other capabilities in the MPACT program and specific model changes to enable more streamlined integration in the future. This report describes the model changes and plans for how the models will be used more collaboratively. The Virtual Facility is not designed to integrate all capabilities into one master code, but rather to maintain stand-alone capabilities that communicate results between codes more effectively.

  3. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining

    PubMed Central

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-01-01

    Multi-component cutting force sensing systems applied to cutting tools are gradually becoming the most significant monitoring indicator in manufacturing processes. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with the capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322

  4. Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining.

    PubMed

    Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin

    2016-11-16

    Multi-component cutting force sensing systems applied to cutting tools are gradually becoming the most significant monitoring indicator in manufacturing processes. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with the capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of the cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing.
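
    The real-time monitoring uses surveyed here (breakage prediction, wear estimation, chatter control) typically reduce to detecting excursions in the force signal. A crude sketch of one such detector, a rolling-RMS threshold alarm, is shown below; it is an illustrative toy, not a method from the review, and the window size, threshold factor, and synthetic force trace are all assumptions.

```python
import numpy as np

def breakage_alarm(force, window=50, k=3.0):
    """Flag samples whose rolling RMS cutting force exceeds k times the
    median RMS -- a simple stand-in for the breakage detectors the review surveys."""
    rms = np.sqrt(np.convolve(force**2, np.ones(window) / window, mode="same"))
    return rms > k * np.median(rms)

# Steady cutting at ~1 N with a simulated breakage transient near sample 500.
rng = np.random.default_rng(1)
force = 1.0 + 0.05 * rng.standard_normal(1000)
force[500:520] += 9.0
alarm = breakage_alarm(force)
```

    A production system would add band-limited filtering and per-axis thresholds, but the structure (streaming statistic, adaptive baseline, threshold) is the same.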

  5. NOAA's operational path forward: Developing the Coyote UASonde

    NASA Astrophysics Data System (ADS)

    Cione, J.; Twining, K.; Silah, M.; Brescia, T.; Kalina, E.; Farber, A.; Troudt, C.; Ghanooni, A.; Baker, B.; Dumas, E. J.; Hock, T. F.; Smith, J.; French, J.; Fairall, C. W.; deBoer, G.; Bland, G.

    2016-12-01

    Since 2009, NOAA has shown an interest in using the air-deployed Coyote Unmanned Aircraft System (UAS) for low-altitude hurricane reconnaissance. In September of 2014, NOAA conducted two successful missions into Hurricane Edouard using this innovative observing tool. Since then, NOAA has continued to invest time and resources into the Coyote platform. These efforts include plans to release up to 7 additional Coyote UAS into tropical cyclones using NOAA's P-3 Hurricane Hunter manned aircraft in 2016. A longer-term goal for this multi-institutional partnership will be to modify the existing UAS design such that the next generation platform will be capable of conducting routine observations in direct support of a wide array of NOAA operations that extend beyond hurricane surveillance. The vision for this potentially transformative platform, dubbed the Coyote UASonde, will be to heavily leverage NOAA's existing capabilities, incorporate significant upgrades to the existing payload and employ an expert navigation and data communication system that utilizes artificial intelligence. A brief summary of Coyote successes to date as well as a future roadmap that leads NOAA towards an operationally-viable Coyote UASonde will be presented.

  6. Assessing the Effects of Multi-Node Sensor Network Configurations on the Operational Tempo

    DTIC Science & Technology

    2014-09-01

    The LPISimNet software tool provides the capability to quantify the performance of sensor network configurations and their effect on the operational tempo. In its receiver model, nP is the noise power of the receiver and iL is the implementation loss of the receiver due to hardware manufacturing.

  7. Data reductions and data quality for the high resolution spectrograph on the Southern African Large Telescope

    NASA Astrophysics Data System (ADS)

    Crawford, S. M.; Crause, Lisa; Depagne, Éric; Ilkiewicz, Krystian; Schroeder, Anja; Kuhn, Rudolph; Hettlage, Christian; Romero Colmenaro, Encarni; Kniazev, Alexei; Väisänen, Petri

    2016-08-01

    The High Resolution Spectrograph (HRS) on the Southern African Large Telescope (SALT) is a dual beam, fiber-fed echelle spectrograph providing high resolution capabilities to the SALT observing community. We describe the available data reduction tools and the procedures put in place for regular monitoring of the data quality from the spectrograph. Data reductions are carried out through the pyhrs package. The data characteristics and instrument stability are reported as part of the SALT Dashboard to help monitor the performance of the instrument.

  8. Integrated modeling of advanced optical systems

    NASA Astrophysics Data System (ADS)

    Briggs, Hugh C.; Needels, Laura; Levine, B. Martin

    1993-02-01

    This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.

  9. Transport, Acceleration and Spatial Access of Solar Energetic Particles

    NASA Astrophysics Data System (ADS)

    Borovikov, D.; Sokolov, I.; Effenberger, F.; Jin, M.; Gombosi, T. I.

    2017-12-01

    Solar Energetic Particles (SEPs) are a major component of space weather. Often driven by Coronal Mass Ejections (CMEs), SEPs have a very high destructive potential, which includes but is not limited to disrupting communication systems on Earth and inflicting harmful, potentially fatal radiation doses on crew members onboard spacecraft and, in extreme cases, on people aboard high-altitude flights. However, the research community currently lacks efficient tools to predict such hazardous SEP events. Such a tool would serve as the first step towards improving humanity's preparedness for SEP events and ultimately its ability to mitigate their effects. The main goal of the presented research is to develop a computational tool that provides these capabilities and meets the community's demand. Our model has forecasting capability and can be the basis for an operational system that will provide live information on the current potential threats posed by SEPs, based on observations of the Sun. The tool comprises several numerical models designed to simulate different physical aspects of SEPs. The background conditions in the interplanetary medium, in particular the Coronal Mass Ejection driving the particle acceleration, play a defining role and are simulated with the state-of-the-art MHD solver, the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme (BATS-R-US). The newly developed particle code, the Multiple-Field-Line-Advection Model for Particle Acceleration (M-FLAMPA), simulates the actual transport and acceleration of SEPs and is coupled to the MHD code. The special property of SEPs, their tendency to follow magnetic lines of force, is fully exploited in the computational model, which substitutes a multitude of 1-D models for a single complicated 3-D model. This approach significantly simplifies the computations and improves the time performance of the overall model. It also plays the important role of mapping the affected region by connecting it with the origin of SEPs at the solar surface. Our model incorporates the effects of near-Sun field line meandering, which affects the perpendicular transport of SEPs and can explain the large longitudinal spread observed even in the early phases of such events.
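
    The key computational idea, replacing one 3-D transport problem with a multitude of independent 1-D problems along magnetic field lines, can be illustrated with a toy 1-D advection-diffusion solver. This sketch is not M-FLAMPA; the grid, periodic boundaries, and transport coefficients are assumptions chosen only to make the decomposition concrete.

```python
import numpy as np

def step_along_field_line(f, u, kappa, dx, dt):
    """One explicit step of 1-D upwind advection (u > 0) plus central diffusion,
    with periodic boundaries, for the particle density on a single field line."""
    f = f - u * dt / dx * (f - np.roll(f, 1))
    f = f + kappa * dt / dx**2 * (np.roll(f, -1) - 2 * f + np.roll(f, 1))
    return f

nx, dx, dt = 200, 1.0, 0.2
f0 = np.zeros(nx)
f0[0:5] = 1.0                       # particles injected near the solar end of the line

# A multitude of independent 1-D models, one per field line (trivially parallel).
field_lines = [f0.copy() for _ in range(16)]
for _ in range(100):
    field_lines = [step_along_field_line(f, u=1.0, kappa=0.1, dx=dx, dt=dt)
                   for f in field_lines]
```

    Because the field lines never exchange information inside the solver, each one can be advanced on a separate processor, which is the source of the time-performance gain the abstract describes.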

  10. Ganalyzer: A tool for automatic galaxy image analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-05-01

    Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
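
    The first stages of the pipeline described above (separate galaxy pixels from background, find the center, build the radial intensity plot) can be sketched with a few lines of NumPy. This is an illustration of those stages on a synthetic symmetric blob, not the published Ganalyzer algorithm, and the threshold and image parameters are invented; the spirality measurement from slopes of the profile peaks is omitted.

```python
import numpy as np

def radial_intensity(img, threshold):
    """Separate galaxy pixels from background, find the intensity-weighted
    center, and bin pixel intensity by integer radius from that center."""
    ys, xs = np.nonzero(img > threshold)
    w = img[ys, xs]
    cy, cx = np.average(ys, weights=w), np.average(xs, weights=w)
    r = np.hypot(ys - cy, xs - cx).astype(int)
    counts = np.bincount(r)
    profile = np.bincount(r, weights=w) / np.maximum(counts, 1)
    return (cy, cx), profile

# Synthetic circularly symmetric "galaxy" centered at (row 32, column 40).
yy, xx = np.mgrid[0:64, 0:96]
img = np.exp(-((yy - 32.0)**2 + (xx - 40.0)**2) / (2 * 5.0**2))
(cy, cx), profile = radial_intensity(img, threshold=0.01)
```

    Every step is a vectorized array operation, which is consistent with the speed the abstract claims for processing millions of images.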

  11. CEOS visualization environment (COVE) tool for intercalibration of satellite instruments

    USGS Publications Warehouse

    Kessler, P.D.; Killough, B.D.; Gowda, S.; Williams, B.R.; Chander, G.; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of international and domestic space agencies and organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration planning efforts, whether those efforts require past, present, or future predictions. This paper provides a brief overview of the COVE tool, its validation, accuracies, and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.
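
    At its core, finding coincident calibration opportunities amounts to intersecting the coverage windows of two instruments. The toy sketch below illustrates that idea only; it is not COVE's actual method, and the pass times and gap tolerance are invented.

```python
def coincidences(passes_a, passes_b, max_gap=0.0):
    """Return pairs of (start, end) coverage windows from two instruments that
    overlap, optionally allowing a time gap of up to max_gap between them."""
    hits = []
    for a0, a1 in passes_a:
        for b0, b1 in passes_b:
            if a0 - max_gap <= b1 and b0 - max_gap <= a1:
                hits.append(((a0, a1), (b0, b1)))
    return hits

# Hypothetical pass windows (seconds) over the same calibration site.
sat_a = [(0.0, 10.0), (100.0, 110.0)]
sat_b = [(8.0, 12.0), (50.0, 60.0)]
pairs = coincidences(sat_a, sat_b)
```

    A real system would first compute the pass windows from orbital ephemerides and ground-site geometry; the interval intersection is the final, simple step.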

  12. Streamlining Collaborative Planning in Spacecraft Mission Architectures

    NASA Technical Reports Server (NTRS)

    Misra, Dhariti; Bopf, Michel; Fishman, Mark; Jones, Jeremy; Kerbel, Uri; Pell, Vince

    2000-01-01

    During the past two decades, the planning and scheduling community has substantially increased the capability and efficiency of individual planning and scheduling systems. Relatively recently, research work to streamline collaboration between planning systems has been gaining attention. Spacecraft missions stand to benefit substantially from this work, as they require the coordination of multiple planning organizations and planning systems. Up to the present time, this coordination has demanded a great deal of human intervention and/or extensive custom software development efforts. This problem will become acute with increased requirements for cross-mission plan coordination and multi-spacecraft mission planning. The Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center is taking innovative steps to define collaborative planning architectures and to identify coordinated planning tools for Cross-Mission Campaigns. Prototypes are being developed to validate these architectures and assess the usefulness of the coordination tools for the planning community. This presentation will focus on one such planning coordination tool, named the Visual Observation Layout Tool (VOLT), which is currently being developed to streamline the coordination between astronomical missions.

  13. CEOS Visualization Environment (COVE) Tool for Intercalibration of Satellite Instruments

    NASA Technical Reports Server (NTRS)

    Kessler, Paul D.; Killough, Brian D.; Gowda, Sanjay; Williams, Brian R.; Chander, Gyanesh; Qu, Min

    2013-01-01

    Increasingly, data from multiple instruments are used to gain a more complete understanding of land surface processes at a variety of scales. Intercalibration, comparison, and coordination of satellite instrument coverage areas is a critical effort of space agencies and of international and domestic organizations. The Committee on Earth Observation Satellites Visualization Environment (COVE) is a suite of browser-based applications that leverage Google Earth to display past, present, and future satellite instrument coverage areas and coincident calibration opportunities. This forecasting and ground coverage analysis and visualization capability greatly benefits the remote sensing calibration community in preparation for multisatellite ground calibration campaigns or individual satellite calibration studies. COVE has been developed for use by a broad international community to improve the efficiency and efficacy of such calibration efforts. This paper provides a brief overview of the COVE tool, its validation, accuracies and limitations with emphasis on the applicability of this visualization tool for supporting ground field campaigns and intercalibration of satellite instruments.

  14. The utility of satellite observations for constraining fine-scale and transient methane sources

    NASA Astrophysics Data System (ADS)

    Turner, A. J.; Jacob, D.; Benmergui, J. S.; Brandman, J.; White, L.; Randles, C. A.

    2017-12-01

    Resolving differences between top-down and bottom-up emissions of methane from the oil and gas industry is difficult due, in part, to their fine-scale and often transient nature. There is considerable interest in using atmospheric observations to detect these sources. Satellite-based instruments are an attractive tool for this purpose and, more generally, for quantifying methane emissions on fine scales. A number of instruments are planned for launch in the coming years from both low Earth and geostationary orbit, but the extent to which they can provide fine-scale information on sources has yet to be explored. Here we present an observing system simulation experiment (OSSE) exploring the tradeoffs between pixel resolution, measurement frequency, and instrument precision in the fine-scale information content of a space-borne instrument measuring methane. We use the WRF-STILT Lagrangian transport model to generate more than 200,000 column footprints at 1.3 × 1.3 km² spatial resolution and hourly temporal resolution over the Barnett Shale in Texas. We sub-sample these footprints to match the observing characteristics of the planned TROPOMI and GeoCARB instruments as well as different hypothetical observing configurations. The information content of the various observing systems is evaluated using the Fisher information matrix and its singular values. We draw conclusions on the capabilities of the planned satellite instruments and how these capabilities could be improved for fine-scale source detection.
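
    The information-content comparison sketched below shows the structure of the Fisher-matrix evaluation for two hypothetical instrument precisions. The random "footprints" are stand-ins for the WRF-STILT sensitivities, and the error values are invented; only the form F = HᵀR⁻¹H (uncorrelated observation errors) and the use of its singular values follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_sources = 400, 40
H = rng.exponential(1.0, size=(n_obs, n_sources))   # stand-in column footprints

def fisher_singular_values(H, sigma):
    """Singular values of the Fisher information F = H^T R^-1 H,
    assuming uncorrelated observation errors R = sigma^2 I."""
    F = H.T @ H / sigma**2
    return np.linalg.svd(F, compute_uv=False)

s_precise = fisher_singular_values(H, sigma=5.0)    # a more precise instrument
s_coarse = fisher_singular_values(H, sigma=20.0)    # a noisier instrument
```

    Because F scales as 1/σ², a 4× precision improvement raises every singular value by 16×; the interesting tradeoffs arise when pixel size and revisit frequency change H itself, which is what the OSSE varies.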

  15. The In-Space Propulsion Technology Project Low-Thrust Trajectory Tool Suite

    NASA Technical Reports Server (NTRS)

    Dankanich, John W.

    2008-01-01

    The ISPT project released its low-thrust trajectory tool suite in March of 2006. The LTTT suite tools range in capabilities, but represent the state-of-the art in NASA low-thrust trajectory optimization tools. The tools have all received considerable updates following the initial release, and they are available through their respective development centers or the ISPT project website.

  16. Comparison of Satellite Observations of Aerosol Optical Depth to Surface Monitor Fine Particle Concentration

    NASA Technical Reports Server (NTRS)

    Kleb, Mary M.; AlSaadi, Jassim A.; Neil, Doreen O.; Pierce, Robert B.; Pippin, Margartet R.; Roell, Marilee M.; Kittaka, Chieko; Szykman, James J.

    2004-01-01

    Under NASA's Earth Science Applications Program, the Infusing satellite Data into Environmental Applications (IDEA) project examined the relationship between satellite observations and surface monitors of air pollutants to facilitate a more capable and integrated observing network. This report provides a comparison of satellite aerosol optical depth to surface monitor fine particle concentration observations for the month of September 2003 at more than 300 individual locations in the continental US. During September 2003, IDEA provided prototype, near real-time data-fusion products to the Environmental Protection Agency (EPA) directed toward improving the accuracy of EPA's next-day Air Quality Index (AQI) forecasts. Researchers from NASA Langley Research Center and EPA used data from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument combined with EPA ground network data to create a NASA-data-enhanced Forecast Tool. Air quality forecasters used this tool to prepare their forecasts of particle pollution, or particulate matter less than 2.5 microns in diameter (PM2.5), for the next-day AQI. The archived data provide a rich resource for further studies and analysis. The IDEA project uses data sets and models developed for tropospheric chemistry research to assist federal, state, and local agencies in making decisions concerning air quality management to protect public health.
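
    The core of such a comparison is a matchup statistic between collocated AOD and PM2.5 values. The sketch below uses entirely synthetic data with an assumed linear AOD-PM2.5 relation; it illustrates the computation only, not the September 2003 results.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites = 300
aod = rng.uniform(0.05, 0.6, size=n_sites)              # synthetic satellite AOD
pm25 = 5.0 + 40.0 * aod + rng.normal(0.0, 3.0, n_sites) # synthetic PM2.5, ug/m3

# Matchup statistics: correlation and a simple linear fit.
r = np.corrcoef(aod, pm25)[0, 1]
slope, intercept = np.polyfit(aod, pm25, 1)
```

    In practice the matchup step (pairing each monitor with the nearest valid satellite retrieval within a time window) dominates the effort; the statistics themselves are one-liners.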

  17. Supporting ITM Missions by Observing System Simulation Experiments: Initial Design, Challenges and Perspectives

    NASA Astrophysics Data System (ADS)

    Yudin, V. A.; England, S.; Matsuo, T.; Wang, H.; Immel, T. J.; Eastes, R.; Akmaev, R. A.; Goncharenko, L. P.; Fuller-Rowell, T. J.; Liu, H.; Solomon, S. C.; Wu, Q.

    2014-12-01

    We review and discuss the capability of novel configurations of global community (WACCM-X and TIME-GCM) and planned-operational (WAM) models to support current and forthcoming space-borne missions to monitor the dynamics and composition of the Ionosphere-Thermosphere-Mesosphere (ITM) system. In the specified-meteorology configuration of WACCM-X, the lower atmosphere is constrained by operational analyses and/or short-term forecasts provided by the Goddard Earth Observing System (GEOS-5) of GMAO/NASA/GSFC. With the terrestrial weather of GEOS-5 and updated model physics, WACCM-X simulations are capable of reproducing the observed signatures of the perturbed wave dynamics and ion-neutral coupling during recent (2006-2013) stratospheric warming events, as well as the short-term, annual, and year-to-year variability of prevailing flows, planetary waves, tides, and composition. With assimilation of the NWP data in the troposphere and stratosphere, the planned-operational configuration of WAM can also recreate the observed features of the ITM day-to-day variability. These "terrestrial-weather"-driven whole atmosphere simulations, with day-to-day variable solar and geomagnetic inputs, can provide the specification of the background state (first guess) and errors for the inverse algorithms of forthcoming NASA ITM missions, such as ICON and GOLD. With two different viewing geometries (sun-synchronous for ICON and geostationary for GOLD), these missions promise to perform complementary global observations of temperature, winds, and constituents to constrain first-principles space weather forecast models. The paper will discuss initial designs of Observing System Simulation Experiments (OSSEs) in the coupled simulations of TIME-GCM/WACCM-X/GEOS5 and WAM/GIP. As is widely recognized, OSSEs represent an excellent learning tool for designing and evaluating the observing capabilities of novel sensors. 
The choice of assimilation schemes, forecast and observational errors will be discussed along with challenges and perspectives to constrain fast-varying dynamics of tides and planetary waves by observations made from sun-synchronous and geostationary space-borne platforms. We will also discuss how correlative space-borne and ground-based observations can evaluate OSSE results.
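
    The essential OSSE loop (sample a "nature run" with a simulated instrument, apply an analysis step, measure the error reduction) can be reduced to a scalar toy. The optimal-interpolation gain below is standard, but the error magnitudes are invented and nothing here comes from the WAM or WACCM-X experiments themselves.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
truth = rng.normal(0.0, 1.0, n)                 # "nature run" states
sigma_b, sigma_o = 1.0, 0.5                     # assumed background / obs errors

background = truth + rng.normal(0.0, sigma_b, n)
obs = truth + rng.normal(0.0, sigma_o, n)       # the simulated instrument

K = sigma_b**2 / (sigma_b**2 + sigma_o**2)      # scalar optimal-interpolation gain
analysis = background + K * (obs - background)

err_b = np.std(background - truth)              # background error (~1.0)
err_a = np.std(analysis - truth)                # analysis error (~0.45 in theory)
```

    In a full OSSE the gain becomes a matrix built from forecast and observation error covariances, and the experiment is repeated across viewing geometries, which is exactly the sun-synchronous versus geostationary question posed above.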

  18. Geospatial Information System Capability Maturity Models

    DOT National Transportation Integrated Search

    2017-06-01

    To explore how State departments of transportation (DOTs) evaluate geospatial tool applications and services within their own agencies, particularly their experiences using capability maturity models (CMMs) such as the Urban and Regional Information ...

  19. An Observation Capability Semantic-Associated Approach to the Selection of Remote Sensing Satellite Sensors: A Case Study of Flood Observations in the Jinsha River Basin

    PubMed Central

    Hu, Chuli; Li, Jie; Lin, Xin

    2018-01-01

    Observation schedules depend upon an accurate understanding of a single sensor’s observation capability and of the interrelated observation capability information on multiple sensors. General ontologies for sensors and observations are abundant. However, few observation capability ontologies for satellite sensors are available, and no study has described the dynamic associations among the observation capabilities of multiple sensors used for integrated observational planning. This limitation results in a failure to realize effective sensor selection. This paper develops a sensor observation capability association (SOCA) ontology model that revolves around the task-sensor-observation capability (TSOC) ontology pattern. The pattern is developed with consideration of the stimulus-sensor-observation (SSO) ontology design pattern, which focuses on facilitating sensor selection for one observation task. The core aim of the SOCA ontology model is to achieve an observation capability semantic association. A prototype system called SemOCAssociation was developed, and an experiment was conducted for flood observations in the Jinsha River basin in China. The results of this experiment verified that the SOCA-ontology-based association method can help sensor planners intuitively and accurately make evidence-based sensor selection decisions for a given flood observation task, which facilitates efficient and effective observational planning for flood satellite sensors. PMID:29883425

  20. An Observation Capability Semantic-Associated Approach to the Selection of Remote Sensing Satellite Sensors: A Case Study of Flood Observations in the Jinsha River Basin.

    PubMed

    Hu, Chuli; Li, Jie; Lin, Xin; Chen, Nengcheng; Yang, Chao

    2018-05-21

    Observation schedules depend upon an accurate understanding of a single sensor’s observation capability and of the interrelated observation capability information on multiple sensors. General ontologies for sensors and observations are abundant. However, few observation capability ontologies for satellite sensors are available, and no study has described the dynamic associations among the observation capabilities of multiple sensors used for integrated observational planning. This limitation results in a failure to realize effective sensor selection. This paper develops a sensor observation capability association (SOCA) ontology model that revolves around the task-sensor-observation capability (TSOC) ontology pattern. The pattern is developed with consideration of the stimulus-sensor-observation (SSO) ontology design pattern, which focuses on facilitating sensor selection for one observation task. The core aim of the SOCA ontology model is to achieve an observation capability semantic association. A prototype system called SemOCAssociation was developed, and an experiment was conducted for flood observations in the Jinsha River basin in China. The results of this experiment verified that the SOCA-ontology-based association method can help sensor planners intuitively and accurately make evidence-based sensor selection decisions for a given flood observation task, which facilitates efficient and effective observational planning for flood satellite sensors.
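
    The underlying idea, matching sensors to an observation task through their declared capabilities, can be shown with toy capability triples. The sensor names, properties, and thresholds below are hypothetical, and a real implementation would use OWL/RDF reasoning rather than Python tuples; this only illustrates the task-driven selection the SOCA model formalizes.

```python
# Toy capability triples: (sensor, property, value). All values are invented.
capabilities = [
    ("SensorA", "spatial_resolution_m", 16),
    ("SensorA", "revisit_days", 4),
    ("SensorB", "spatial_resolution_m", 250),
    ("SensorB", "revisit_days", 1),
    ("SensorC", "spatial_resolution_m", 30),
    ("SensorC", "revisit_days", 2),
]

def select_sensors(capabilities, max_resolution_m, max_revisit_days):
    """Pick sensors whose declared capabilities satisfy the observation task."""
    by_sensor = {}
    for sensor, prop, value in capabilities:
        by_sensor.setdefault(sensor, {})[prop] = value
    return sorted(
        s for s, c in by_sensor.items()
        if c["spatial_resolution_m"] <= max_resolution_m
        and c["revisit_days"] <= max_revisit_days
    )

# A hypothetical flood task: better than 50 m resolution, revisit within 3 days.
chosen = select_sensors(capabilities, max_resolution_m=50, max_revisit_days=3)
```

    The ontology's contribution is that these properties and their associations are machine-interpretable and shared across planning systems, rather than hard-coded filters as in this sketch.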

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cashion, Avery Ted; Cieslewski, Grzegorz

    New generations of high-temperature (HT) sensors and electronics are enabling increased measurement speed and accuracy, allowing collection of more accurate and relevant data by downhole tools. Unfortunately, this increased capability is often not realized due to the bottleneck in uplink data transmission rates caused by the poor signal characteristics of HT wireline. The objective of this project is to enable high-rate transmission of raw data from downhole tools such as acoustic logging tools and seismic measurement devices, minimizing the need for downhole signal processing. To achieve this objective, Sandia has undertaken the effort to develop an asymmetric high-temperature (HT), high-speed data link system for downhole tools capable of operating at temperatures of 210°C while taking advantage of existing wireline transmission channels. Current data rates over HT single-conductor wireline are limited to approximately 200 kbps. The goal system will be capable of transmitting data from the tool to the surface (uplink) at rates of > 1 Mbps over 5,000 feet of single-conductor wireline, and of automatically adapting the data rate to longer wirelines by adapting modern telecommunications techniques to operate on high-temperature electronics. The data rate from the surface to the tool (downlink) will be significantly smaller but sufficient for command and control functions. While 5,000 feet of cable is the benchmark for this effort, the improvements apply to all lengths of cable.
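
    Automatic rate adaptation of the kind described can be sketched as picking the fastest supported rate that fits under the channel's Shannon capacity with a margin. The bandwidth, SNR values, rate set, and margin below are illustrative assumptions, not the project's design parameters.

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Upper bound on achievable data rate over a noisy channel (Shannon)."""
    return bandwidth_hz * math.log2(1.0 + 10.0 ** (snr_db / 10.0))

def pick_rate(bandwidth_hz, snr_db, rates_bps, margin=0.5):
    """Choose the fastest supported rate that fits under capacity with margin;
    fall back to the slowest rate if none fits."""
    cap = shannon_capacity_bps(bandwidth_hz, snr_db)
    feasible = [r for r in rates_bps if r <= margin * cap]
    return max(feasible) if feasible else min(rates_bps)

rates = [200_000, 500_000, 1_000_000, 2_000_000]
rate_short = pick_rate(1.0e6, 20.0, rates)   # short wireline, good SNR
rate_long = pick_rate(1.0e6, 3.0, rates)     # long wireline, degraded SNR
```

    The asymmetry in the abstract follows naturally: the surface receiver can afford sophisticated demodulation, so the uplink runs near capacity while the downlink stays slow and robust.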

  2. Description of the LASSO Alpha 1 Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, William I.; Vogelmann, Andrew M.; Cheng, Xiaoping

    The Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility began a pilot project in May 2015 to design a routine, high-resolution modeling capability to complement ARM’s extensive suite of measurements. This modeling capability has been named the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) project. The availability of LES simulations with concurrent observations will serve many purposes. LES helps bridge the scale gap between DOE ARM observations and models, and the use of routine LES adds value to observations. It provides a self-consistent representation of the atmosphere and a dynamical context for the observations. Further, it elucidates unobservable processes and properties. LASSO will generate a simulation library for researchers that enables statistical approaches beyond a single-case mentality. It will also provide the tools necessary for modelers to reproduce the LES and conduct their own sensitivity experiments. Many different uses are envisioned for the combined LASSO LES and observational library. For an observationalist, LASSO can help inform instrument remote-sensing retrievals, conduct Observation System Simulation Experiments (OSSEs), and test the implications of radar scan strategies or flight paths. For a theoretician, LASSO will help calculate estimates of fluxes and the co-variability of values, and test relationships without having to run the model oneself. For a modeler, LASSO will help one know ahead of time which days have good forcing, have co-registered observations at high-resolution scales, and have simulation inputs and corresponding outputs for testing parameterizations. Further details on the overall LASSO project are available at http://www.arm.gov/science/themes/lasso.

  3. OM300 Direction Drilling Module

    DOE Data Explorer

    MacGugan, Doug

    2013-08-22

    OM300 – Geothermal Directional Drilling Navigation Tool: design and produce a prototype directional drilling navigation tool capable of high-temperature operation in geothermal drilling. Target accuracies are 0.1° in inclination and tool face and 0.5° in azimuth, with environmental ruggedness typical of existing oil/gas drilling. The tool offers multiple selectable sensor ranges (high accuracy at low bandwidth for navigation; high g-range and bandwidth for stick-slip and chirp detection) and selectable serial data communications, with the goal of reducing the cost of drilling in high-temperature geothermal reservoirs. Innovative aspects of the project include Honeywell MEMS Vibrating Beam Accelerometers (VBA), APS flux-gate magnetometers, Honeywell Silicon-On-Insulator (SOI) high-temperature electronics, and a rugged high-temperature-capable package and assembly process.
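
    The inclination, tool face, and azimuth such a tool reports are computed from its tri-axial accelerometer and magnetometer readings. The sketch below uses one common textbook MWD convention (tool axis along z); it is not the OM300 firmware, and sign conventions vary between tools.

```python
import math

def inclination_deg(gx, gy, gz):
    """Borehole inclination from tri-axial gravity readings, tool axis = z.
    0 deg is vertical, 90 deg is horizontal."""
    return math.degrees(math.atan2(math.hypot(gx, gy), gz))

def azimuth_deg(gx, gy, gz, bx, by, bz):
    """Magnetic azimuth from combined gravity and flux-gate magnetometer
    readings (one common directional-survey convention; conventions vary)."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    num = (gx * by - gy * bx) * g
    den = bz * (gx * gx + gy * gy) - gz * (gx * bx + gy * by)
    return math.degrees(math.atan2(num, den)) % 360.0
```

    For example, with gravity entirely along the tool axis the hole is vertical (0° inclination), while a horizontal tool with the field along its axis reads magnetic north.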

  4. Field Information Support Tool

    DTIC Science & Technology

    2010-09-01

    The Field Information Support Tool relies on commercially available software packages and capabilities to function effectively. The first of these tools are ODK Collect and ODK Aggregate.

  5. Content and functional specifications for a standards-based multidisciplinary rounding tool to maintain continuity across acute and critical care

    PubMed Central

    Collins, Sarah; Hurley, Ann C; Chang, Frank Y; Illa, Anisha R; Benoit, Angela; Laperle, Sarah; Dykes, Patricia C

    2014-01-01

    Background Maintaining continuity of care (CoC) in the inpatient setting is dependent on aligning goals and tasks with the plan of care (POC) during multidisciplinary rounds (MDRs). A number of locally developed rounding tools exist, yet there is a lack of standard content and functional specifications for electronic tools to support MDRs within and across settings. Objective To identify content and functional requirements for an MDR tool to support CoC. Materials and methods We collected discrete clinical data elements (CDEs) discussed during rounds for 128 acute and critical care patients. To capture CDEs, we developed and validated an iPad-based observational tool based on informatics CoC standards. We observed 19 days of rounds and conducted eight group and individual interviews. Descriptive and bivariate statistics and network visualization were conducted to understand associations between CDEs discussed during rounds with a particular focus on the POC. Qualitative data were thematically analyzed. All analyses were triangulated. Results We identified the need for universal and configurable MDR tool views across settings and users and the provision of messaging capability. Eleven empirically derived universal CDEs were identified, including four POC CDEs: problems, plan, goals, and short-term concerns. Configurable POC CDEs were: rationale, tasks/‘to dos’, pending results and procedures, discharge planning, patient preferences, need for urgent review, prognosis, and advice/guidance. Discussion Some requirements differed between settings; yet, there was overlap between POC CDEs. Conclusions We recommend an initial list of 11 universal CDEs for continuity in MDRs across settings and 27 CDEs that can be configured to meet setting-specific needs. PMID:24081019

  6. Energy Modeling Capabilities in ORD's Air, Climate and ...

    EPA Pesticide Factsheets

Presentation to ACE Centers Kick-Off Meeting highlighting energy modeling work, capabilities and tools that are under development in ORD/NRMRL under the ACE Program.

  7. Blood Irradiator Interactive Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howington, John; Potter, Charles; DeGroff, Tavias

The “Blood Irradiator Interactive Tool” compares a typical Cs-137 blood irradiator with the capabilities of an average X-ray irradiator. It is designed to inform users about the potential capabilities an average X-ray irradiator could offer them. Specifically, the tool compares the number of blood bags that can be irradiated by the user's machine with the average X-ray capability. It also forecasts the amount of blood that can be irradiated on a yearly basis for both the user's machine and an average X-ray device. The average X-ray capabilities are taken from the three X-ray devices currently on the market: the RS 3400 Rad Source X-ray Blood Irradiator and both the 2.0 L and 3.5 L versions of the Best Theratronics Raycell MK2.
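The yearly-forecast comparison the tool performs amounts to simple throughput arithmetic. A minimal sketch, with entirely hypothetical bag counts and run rates rather than figures from the tool or any vendor:

```python
def yearly_throughput(bags_per_run, runs_per_day, days_per_year=250):
    """Estimate annual irradiated blood bags for a device (illustrative only)."""
    return bags_per_run * runs_per_day * days_per_year

# Hypothetical example values, NOT vendor specifications:
cs137_yearly = yearly_throughput(bags_per_run=4, runs_per_day=10)
xray_yearly = yearly_throughput(bags_per_run=6, runs_per_day=12)
print(cs137_yearly, xray_yearly)  # relative comparison only
```

The real tool would substitute the user's measured machine throughput and published X-ray device capacities for these placeholder numbers.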

  8. Social Justice Intents in Policy: An Analysis of Capability "for" and "through" Education

    ERIC Educational Resources Information Center

    Gale, Trevor; Molla, Tebeje

    2015-01-01

    Primarily developed as an alternative to narrow measures of well-being such as utility and resources, Amartya Sen's capability approach places strong emphasis on people's substantive opportunities. As a broad normative framework, the capability approach has become a valuable tool for understanding and evaluating social arrangements (e.g. education…

  9. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. The tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems, to develop system requirements, and to determine a navigation system's effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and the possible state measurements and observations that feed into the simulation implementation structure. These models also provide a common environment for capturing an increasingly complex operational architecture involving multiple spacecraft, ground stations, and communication networks. To address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions among spacecraft. This paper describes the development of this framework and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation of a concept for network-based navigation utilizing digitally transmitted data packets is described in detail. The developed package demonstrates the capability of the modeling framework, including its modularity, its analysis capabilities, and its traceability back to the overall system requirements and definition.

  10. Laser-induced Fluorescence Spectroscopy (LIFS) for Discrimination of Genetically Close Sweet Orange Accessions ( Citrus sinensis L. Osbeck).

    PubMed

    Massaiti Kuboyama Kubota, Thiago; Bebeachibuli Magalhães, Aida; Nery da Silva, Marina; Ribeiro Villas Boas, Paulino; Novelli, Valdenice M; Bastianel, Marinês; Sagawa, Cíntia H D; Cristofani-Yaly, Mariângela; Marcondes Bastos Pereira Milori, Débora

    2017-02-01

Although there is substantial diversity among cultivated sweet orange genotypes with respect to morphological, physiological, and agronomic traits, very little variation at the DNA level has been observed. This low molecular variability is likely due to the narrow genetic basis commonly observed in this citrus group. The most distinct morphological characters observed originated through mutations, which are maintained by vegetative propagation. Despite all the molecular tools available for discriminating between these accessions, low polymorphism has generally been observed in all groups of sweet oranges, and they may not be identifiable by molecular markers. In this context, this paper describes the results obtained by using laser-induced fluorescence spectroscopy (LIFS) as a tool to discriminate sweet orange accessions (Citrus sinensis L. Osbeck), including common, low-acidity, pigmented, and navel orange groups with very little variation at the DNA level. The findings showed that LIFS combined with statistical methods is capable of discriminating different accessions. The basic idea is that citrus leaves contain multiple fluorophores whose concentrations depend on the plant's genetics and metabolism; thus, the optical properties of citrus leaves may differ by variety. The results show that the developed method, at its best classification rate, reaches an average sensitivity of 95% and specificity of 97.5%. An interesting application of this study is the development of an economically viable tool for early identification in seedling certification, citrus breeding programs, cultivar protection, and germplasm core collections.
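Sensitivity and specificity figures like those reported above are derived from confusion-matrix counts. A generic sketch of that calculation (not the authors' actual classifier; the example counts are made up to illustrate the arithmetic):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Compute sensitivity (true-positive rate) and specificity
    (true-negative rate) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts only: 40 positive and 40 negative test leaves.
sens, spec = sensitivity_specificity(tp=38, fn=2, tn=39, fp=1)
print(sens, spec)  # 0.95 0.975
```

With these hypothetical counts the formulas reproduce the 95% / 97.5% figures quoted in the abstract, which is how such rates are conventionally computed.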

  11. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. The tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process, embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to more detailed information. This paper explains each of these components.

  12. BASINS Climate Assessment Tool Tutorials

    EPA Pesticide Factsheets

    The BASINS Climate Assessment Tool (CAT) provides a flexible set of capabilities for exploring the potential effects of climate change on streamflow and water quality using different watershed models in BASINS.

  13. USGS Science Data Life Cycle Tools - Lessons Learned in moving to the Cloud

    NASA Astrophysics Data System (ADS)

    Frame, M. T.; Mancuso, T.; Hutchison, V.; Zolly, L.; Wheeler, B.; Urbanowski, S.; Devarakonda, R.; Palanisamy, G.

    2016-12-01

    The U.S Geological Survey (USGS) Core Science Systems has been working for the past year to design, re-architect, and implement several key tools and systems within the USGS Cloud Hosting Service supported by Amazon Web Services (AWS). As a result of emerging USGS data management policies that align with federal Open Data mandates, and as part of a concerted effort to respond to potential increasing user demand due to these policies, the USGS strategically began migrating its core data management tools and services to the AWS environment in hopes of leveraging cloud capabilities (i.e. auto-scaling, replication, etc.). The specific tools included: USGS Online Metadata Editor (OME); USGS Digital Object Identifier (DOI) generation tool; USGS Science Data Catalog (SDC); USGS ScienceBase system; and an integrative tool, the USGS Data Release Workbench, which steps bureau personnel through the process of releasing data. All of these tools existed long before the Cloud was available and presented significant challenges in migrating, re-architecting, securing, and moving to a Cloud based environment. Initially, a `lift and shift' approach, essentially moving as is, was attempted and various lessons learned about that approach will be discussed, along with recommendations that resulted from the development and eventual operational implementation of these tools. The session will discuss lessons learned related to management of these tools in an AWS environment; re-architecture strategies utilized for the tools; time investments through sprint allocations; initial benefits observed from operating within a Cloud based environment; and initial costs to support these data management tools.

  14. Prospects for using existing resists for evaluating 157-nm imaging systems

    NASA Astrophysics Data System (ADS)

    Fedynyshyn, Theodore H.; Kunz, Roderick R.; Doran, Scott P.; Goodman, Russell B.; Lind, Michele L.; Curtin, Jane E.

    2000-06-01

Lithography at 157 nm represents the next evolutionary step in the great optical continuum and is currently under investigation as a possible successor to 193-nm lithography. If successful, the photoresists used for this technology must initially be capable of 100-nm resolution and be extendable to less than 70 nm. Unfortunately, as with past transitions to shorter wavelengths, the photoresist materials developed for longer wavelengths appear to be too absorbent for practical use as traditional high-resolution single-layer resists imageable with 157-nm radiation. Until new photoresist materials are developed that are sufficiently transparent to be used as single-layer resists, the need for a resist to evaluate 157-nm imaging systems, such as the prototype steppers now under development, will have to be met by employing existing resists. We surveyed the commercial resist market with the dual purpose of identifying the general categories of commercial resists that have potential for use as tool-evaluation resists and of baselining these resists for comparison against future 157-nm resist candidates. Little difference in 157-nm absorbance was observed between different classes of resists, with most resists having an absorbance between 6 and 8 per micron. Due to the high absorbance at 157 nm of polyhydroxystyrene-, polyacrylate-, and polycyclic-copolymer-based resists, the coated resist thickness will need to be under 100 nm. All four commercial resists evaluated for imaging at 157 nm showed that they are capable of acting as tool-testing resists to identify issues attributed to focus, illumination, and vibration. Finally, an improved tool-testing resist can be developed within the existing resist material base, one capable of 100-nm imaging with a binary mask and 70-nm imaging with a phase-shift mask. Minor formulation modifications can greatly improve resist performance, including improved resolution and reduced line-edge roughness.

  15. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code and the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphical processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
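The core idea described above (sorting a large Monte Carlo data set to find which dispersed inputs are associated with a given failure type) can be sketched generically. The field names, the ranking heuristic, and the data below are hypothetical illustrations, not the tool's actual interface or algorithm:

```python
# Rank dispersed inputs by how strongly they separate failed runs from
# passing runs, using difference of group means in units of overall spread.
from statistics import mean, pstdev

def rank_inputs_by_separation(runs, input_names):
    """runs: list of dicts holding input values and a boolean 'failed' flag."""
    scores = {}
    for name in input_names:
        failed = [r[name] for r in runs if r["failed"]]
        passed = [r[name] for r in runs if not r["failed"]]
        spread = pstdev([r[name] for r in runs]) or 1.0
        scores[name] = abs(mean(failed) - mean(passed)) / spread
    return sorted(scores, key=scores.get, reverse=True)

# Tiny made-up data set: 'mass' drives the failures, 'drag' does not.
runs = [
    {"mass": 1100, "drag": 0.30, "failed": True},
    {"mass": 1080, "drag": 0.31, "failed": True},
    {"mass": 900,  "drag": 0.29, "failed": False},
    {"mass": 920,  "drag": 0.30, "failed": False},
]
print(rank_inputs_by_separation(runs, ["mass", "drag"]))  # 'mass' ranks first
```

A production tool would replace this serial scan with the paper's parallel/GPU implementation and a richer failure taxonomy, but the sorting principle is the same.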

  16. Management aspects of Gemini's base facility operations project

    NASA Astrophysics Data System (ADS)

    Arriagada, Gustavo; Nitta, Atsuko; Adamson, A. J.; Nunez, Arturo; Serio, Andrew; Cordova, Martin

    2016-08-01

Gemini's Base Facilities Operations (BFO) Project provided the capabilities to perform routine nighttime operations without anyone on the summit. The expected benefits were cost savings and enabling the future development of remote operations. The project was executed using a tailored version of the PRINCE2 project management methodology. It was schedule-driven, and managing it demanded flexibility and creativity to produce what was needed within the constraints present at the time: only two years were available to implement BFO at Gemini North (GN); the project had to be done in a matrix-resource environment, with only three resources assigned exclusively to BFO; the new capabilities had to be implemented without disrupting operations; and we needed to succeed in introducing a new operational model in which Telescope and Instrumentation Operators (Science Operations Specialists, SOSs) rely on technology to assess summit conditions. To meet the schedule we created a large number of concurrent smaller projects called Work Packages (WPs). To be reassured that we would successfully implement BFO, we initially spent a good portion of time and effort collecting and learning about users' needs, through close interaction with SOSs, observers, engineers, and technicians. Once we had a clear understanding of the requirements, we took the approach of implementing the "bare minimum" of technology that would meet them and be maintainable in the long term. Another key element was the introduction of the "gradual descent" concept: we incrementally provided tools that allowed the SOSs and observers to avoid leaving the control room during nighttime operations, giving them the opportunity to familiarize themselves with the new tools over a span of several months. By using these tools at an early stage, engineers and technicians also gained more time for debugging, problem fixing, and training in systems usage and servicing.

  17. A tool for exploring space-time patterns: an animation user research.

    PubMed

    Ogao, Patrick J

    2006-08-29

Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors demands that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping, all critical aspects of a country's socio-economic development. In this paper, a user test was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban, and census mapping. Three types of animation were used: passive, interactive, and inference-based, with the key differences between them being the level of interactivity and the complementary domain knowledge each offers the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media-player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends, or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation, and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting, and explaining observed geospatial phenomena. Exploring geospatial data structures using animation is best achieved with provocative interactive tools such as the inference-based animation. The visual methods employed across the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge-discovery environment.

  18. A tool for exploring space-time patterns : an animation user research

    PubMed Central

    Ogao, Patrick J

    2006-01-01

Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors demands that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping, all critical aspects of a country's socio-economic development. In this paper, a user test was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban, and census mapping. Results Three types of animation were used: passive, interactive, and inference-based, with the key differences between them being the level of interactivity and the complementary domain knowledge each offers the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media-player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends, or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics background (map reading, interpretation, and analysis abilities). Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool for identifying, interpreting, and explaining observed geospatial phenomena. Exploring geospatial data structures using animation is best achieved with provocative interactive tools such as the inference-based animation. The visual methods employed across the three types of animation are all related, and together these patterns confirm the exploratory cognitive structure and processes for visualization tools. Conclusion The generic types of animation as defined in this paper play a crucial role in facilitating the visualization of geospatial data. These animations can be created and their contents defined based on the user's presentational and exploratory needs. For highly explorative tasks, maintaining a link between the data sets and the animation is crucial to enabling a rich and effective knowledge-discovery environment. PMID:16938138

  19. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  20. Building a new space weather facility at the National Observatory of Athens

    NASA Astrophysics Data System (ADS)

    Kontogiannis, Ioannis; Belehaki, Anna; Tsiropoula, Georgia; Tsagouri, Ioanna; Anastasiadis, Anastasios; Papaioannou, Athanasios

    2016-01-01

    The PROTEAS project has been initiated at the Institute of Astronomy, Astrophysics, Space Applications and Remote Sensing (IAASARS) of the National Observatory of Athens (NOA). One of its main objectives is to provide observations, processed data and space weather nowcasting and forecasting products, designed to support the space weather research community and operators of commercial and industrial systems. The space weather products to be released by this facility, will be the result of the exploitation of ground-based, as well as space-borne observations and of model results and tools already available or under development by IAASARS researchers. The objective will be achieved through: (a) the operation of a small full-disk solar telescope to conduct regular observations of the Sun in the H-alpha line; (b) the construction of a database with near real-time solar observations which will be available to the community through a web-based facility (HELIOSERVER); (c) the development of a tool for forecasting Solar Energetic Particle (SEP) events in relation to observed solar eruptive events; (d) the upgrade of the Athens Digisonde with digital transceivers and the capability of operating in bi-static link mode and (e) the sustainable operation of the European Digital Upper Atmosphere Server (DIAS) upgraded with additional data sets integrated in an interface with the HELIOSERVER and with improved models for the real-time quantification of the effects of solar eruptive events in the ionosphere.

  1. Analysis of Sea Level Rise in Action

    NASA Astrophysics Data System (ADS)

    Gill, K. M.; Huang, T.; Quach, N. T.; Boening, C.

    2016-12-01

NASA's Sea Level Change Portal provides scientists and the general public with a "one-stop" source for current sea level change information and data. Sea level rise research is multidisciplinary, and to understand its causes scientists must be able to access different measurements and compare them. The portal includes an interactive tool, called the Data Analysis Tool (DAT), for accessing, visualizing, and analyzing observations and models relevant to the study of sea level rise. Using NEXUS, an open-source big-data analytic technology developed at the Jet Propulsion Laboratory, the DAT is able to provide users with on-the-fly analysis of all relevant parameters. The DAT is composed of three major components: a dedicated instance of OnEarth (a WMTS service), the NEXUS deep data analytic platform, and the JPL Common Mapping Client (CMC) for a web-browser-based user interface (UI). Utilizing the global imagery, a user can browse the data visually and isolate areas of interest for further study. The interface's "Analysis" tool provides tools for area or point selection, single and/or comparative dataset selection, and a range of options, algorithms, and plotting. This analysis component utilizes the NEXUS cloud computing platform to provide on-demand processing of the data within the user-selected parameters and immediate display of the results. A RESTful web API is exposed for users comfortable with other interfaces who may want to take advantage of the cloud computing capabilities. This talk discusses how the DAT enables on-the-fly sea level research. The talk will introduce the DAT with an end-to-end tour of the tool, with exploration and animation of available imagery, a demonstration of comparative analysis and plotting, and guidance on sharing and exporting data and images for use in publications and presentations. The session will cover what kinds of data are available, what kinds of analysis are possible, and what the outputs are.
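An on-the-fly analysis of the kind the DAT performs, such as an area average over a user-selected region, reduces to operations like the following sketch. This is plain Python standing in for the NEXUS platform's distributed implementation; the function name, grid layout, and data values are illustrative assumptions, not part of the DAT's actual API:

```python
def area_average(grid, lat_min, lat_max, lon_min, lon_max):
    """Average gridded values whose (lat, lon) fall inside the selected box.
    grid: list of (lat, lon, value) tuples; returns None if the box is empty."""
    selected = [v for lat, lon, v in grid
                if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max]
    return sum(selected) / len(selected) if selected else None

# Made-up sea-surface-height anomalies (cm):
grid = [(10, 100, 2.0), (12, 102, 4.0), (40, 100, 9.0)]
print(area_average(grid, 5, 15, 95, 105))  # averages the two tropical points: 3.0
```

In the real system this reduction runs as a tiled, parallel job over satellite and model datasets, with the result returned to the browser or the REST client for plotting.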

  2. Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2

    NASA Technical Reports Server (NTRS)

    Debrunner, Linda S.

    1994-01-01

The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased computing capability over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (CFS), a Common Output System (COS), as well as an Image Processing Station, mini supercomputers, and intelligent workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray Y-MP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tools to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization: the user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on system performance were observed. In this paper, the PerfStat tool is described, and its use with EADS II is outlined briefly. Next, the evaluation of the VPCS and the modifications made to the system are described. Finally, conclusions are drawn and recommendations for future work are outlined.

  3. VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration

    NASA Technical Reports Server (NTRS)

    Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David

    2017-01-01

    The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.

  4. Tools of the Courseware Trade: A Comparison of ToolBook 1.0 and HyperCard 2.0.

    ERIC Educational Resources Information Center

    Brader, Lorinda L.

    1990-01-01

    Compares two authoring tools that were developed to enable users without programing experience to create and modify software. HyperCard, designed for Macintosh microcomputers, and ToolBook, for microcomputers that run on MS-DOS, are compared in the areas of programing languages, graphics and printing capabilities, user interface, system…

  5. Rethinking the Development of Weapons and Their Impact

    ERIC Educational Resources Information Center

    Katsioloudis, Petros J.; Jones, Mildred V.

    2011-01-01

    As one reads about the history of humans, he/she sees very early on that humans are naturally "tool users." More specifically, humans used tools as a means of subsistence and survival. Even today humans use tools to extend their capabilities beyond imagination. These tools are even used as weapons. However primitive, these early weapons would soon…

  6. New tools: potential medical applications of data from new and old environmental satellites.

    PubMed

    Huh, O K; Malone, J B

    2001-04-27

The last 40 years, beginning with the first TIROS (television infrared observational satellite) launched on 1 April 1960, have seen an explosion of earth environmental satellite systems and their capabilities. They can provide measurements in globe-encircling arrays or over small select areas, with increasing resolutions and new capabilities. Concurrently, there are expanding numbers of existing and emerging infectious diseases, many distributed according to areal patterns of physical conditions at the earth's surface. For these reasons, the medical and remote sensing communities can beneficially collaborate, with the objective of making needed progress in public health activities by exploiting the advances of the national and international space programs. Major improvements in the applicability of remotely sensed data are becoming possible with increases in the four kinds of resolution (spatial, temporal, radiometric, and spectral) scheduled over the next few years. Much collaborative research will be necessary before data from these systems are fully exploited by the medical community.

  7. MEMS actuators and sensors: observations on their performance and selection for purpose

    NASA Astrophysics Data System (ADS)

    Bell, D. J.; Lu, T. J.; Fleck, N. A.; Spearing, S. M.

    2005-07-01

    This paper presents an exercise in comparing the performance of microelectromechanical systems (MEMS) actuators and sensors as a function of operating principle. Data have been obtained from the literature for the mechanical performance characteristics of actuators, force sensors and displacement sensors. On-chip and off-chip actuators and sensors are each sub-grouped into families, classes and members according to their principle of operation. The performance of MEMS sharing common operating principles is compared with each other and with equivalent macroscopic devices. The data are used to construct performance maps showing the capability of existing actuators and sensors in terms of maximum force and displacement capability, resolution and frequency. These can also be used as a preliminary design tool, as shown in a case study on the design of an on-chip tensile test machine for materials in thin-film form.

  8. Advances in Light Microscopy for Neuroscience

    PubMed Central

    Wilt, Brian A.; Burns, Laurie D.; Ho, Eric Tatt Wei; Ghosh, Kunal K.; Mukamel, Eran A.

    2010-01-01

    Since the work of Golgi and Cajal, light microscopy has remained a key tool for neuroscientists to observe cellular properties. Ongoing advances have enabled new experimental capabilities using light to inspect the nervous system across multiple spatial scales, including ultrastructural scales finer than the optical diffraction limit. Other progress permits functional imaging at faster speeds, at greater depths in brain tissue, and over larger tissue volumes than previously possible. Portable, miniaturized fluorescence microscopes now allow brain imaging in freely behaving mice. Complementary progress on animal preparations has enabled imaging in head-restrained behaving animals, as well as time-lapse microscopy studies in the brains of live subjects. Mouse genetic approaches permit mosaic and inducible fluorescence-labeling strategies, whereas intrinsic contrast mechanisms allow in vivo imaging of animals and humans without use of exogenous markers. This review surveys such advances and highlights emerging capabilities of particular interest to neuroscientists. PMID:19555292

  9. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

    The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  10. A smarter way to search, share and utilize open-spatial online data for energy R&D - Custom machine learning and GIS tools in U.S. DOE's virtual data library & laboratory, EDX

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J.; Baker, D.; Barkhurst, A.; Bean, A.; DiGiulio, J.; Jones, K.; Jones, T.; Justman, D.; Miller, R., III; Romeo, L.; Sabbatino, M.; Tong, A.

    2017-12-01

    As spatial datasets are increasingly accessible through open, online systems, the opportunity to use these resources to address a range of Earth system questions grows. Simultaneously, there is a need for better infrastructure and tools to find and utilize these resources. We will present examples of advanced online computing capabilities, hosted in the U.S. DOE's Energy Data eXchange (EDX), that address these needs for earth-energy research and development. In one study the computing team developed a custom, machine learning, big data computing tool designed to parse the web and return priority datasets to appropriate servers to develop an open-source global oil and gas infrastructure database. The results of this spatial smart search approach were validated against expert-driven, manual search results which required a team of seven spatial scientists three months to produce. The custom machine learning tool parsed online, open systems, including zip files, ftp sites and other web-hosted resources, in a matter of days. The resulting resources were integrated into a geodatabase now hosted for open access via EDX. Beyond identifying and accessing authoritative, open spatial data resources, there is also a need for more efficient tools to ingest, perform, and visualize multi-variate, spatial data analyses. Within the EDX framework, there is a growing suite of processing, analytical and visualization capabilities that allow multi-user teams to work more efficiently in private, virtual workspaces. An example of these capabilities is a set of five custom spatio-temporal models and data tools that form NETL's Offshore Risk Modeling suite, which can be used to quantify oil spill risks and impacts. Coupling the data and advanced functions from EDX with these advanced spatio-temporal models has culminated in an integrated web-based decision-support tool.
This platform has capabilities to identify and combine data across scales and disciplines, evaluate potential environmental, social, and economic impacts, highlight knowledge or technology gaps, and reduce uncertainty for a range of `what if' scenarios relevant to oil spill prevention efforts. These examples illustrate EDX's growing capabilities for advanced spatial data search and analysis to support geo-data science needs.

  11. NEON's Mobile Deployment Platform: A Resource for Community Research

    NASA Astrophysics Data System (ADS)

    Sanclements, M.

    2017-12-01

    Here we provide an update on construction of the five NEON Mobile Deployment Platforms (MDPs) as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e., timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g., fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for time periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measures of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux, and soil temperature and moisture. An aquatics module is also available with the MDP to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.

  12. Predicted Arabidopsis Interactome Resource and Gene Set Linkage Analysis: A Transcriptomic Analysis Resource.

    PubMed

    Yao, Heng; Wang, Xiaoxuan; Chen, Pengcheng; Hai, Ling; Jin, Kang; Yao, Lixia; Mao, Chuanzao; Chen, Xin

    2018-05-01

    An advanced functional understanding of omics data is important for elucidating the design logic of physiological processes in plants and effectively controlling desired traits in plants. We present the latest versions of the Predicted Arabidopsis Interactome Resource (PAIR) and of the gene set linkage analysis (GSLA) tool, which enable the interpretation of an observed transcriptomic change (differentially expressed genes [DEGs]) in Arabidopsis (Arabidopsis thaliana) with respect to its functional impact for biological processes. PAIR version 5.0 integrates functional association data between genes in multiple forms and infers 335,301 putative functional interactions. GSLA relies on this high-confidence inferred functional association network to expand our perception of the functional impacts of an observed transcriptomic change. GSLA then interprets the biological significance of the observed DEGs using established biological concepts (annotation terms), describing not only the DEGs themselves but also their potential functional impacts. This unique analytical capability can help researchers gain deeper insights into their experimental results and highlight prospective directions for further investigation. We demonstrate the utility of GSLA with two case studies in which GSLA uncovered how molecular events may have caused physiological changes through their collective functional influence on biological processes. Furthermore, we showed that typical annotation-enrichment tools were unable to produce insights similar to those of PAIR/GSLA. The PAIR version 5.0-inferred interactome and GSLA Web tool both can be accessed at http://public.synergylab.cn/pair/. © 2018 American Society of Plant Biologists. All Rights Reserved.
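For contrast with GSLA, the "typical annotation-enrichment" baseline the abstract mentions is usually a hypergeometric over-representation test. A minimal sketch, with made-up gene counts (not values from the paper):

```python
from math import comb

# Hypergeometric tail probability P(X >= k): the chance that k or more of
# n DEGs fall in an annotation term containing K of the genome's N genes,
# under random sampling without replacement.

def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for X ~ Hypergeom(N, K, n)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Illustrative numbers: 5 of 50 DEGs hit a 200-gene term in a 20,000-gene
# genome; only 0.5 hits would be expected by chance.
p = enrichment_pvalue(N=20000, K=200, n=50, k=5)
print(p < 0.01)  # True: this term is strongly over-represented
```

GSLA's point is that such a test only annotates the DEGs themselves, whereas GSLA propagates them through the inferred interaction network before asking which terms are affected.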

  13. National Energy Audit Tool for Multifamily Buildings Development Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malhotra, Mini; MacDonald, Michael; Accawi, Gina K

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional development in the future is expected to be needed if more capabilities are to be added. A rough schedule for development of the version 1 tool is presented. The components and capabilities described in this plan will serve as the starting point for development of the proposed new multifamily energy audit tool for WAP.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vang, Leng; Prescott, Steven R; Smith, Curtis

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to shared information, documents, and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided that the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews development of a Cloud-based Architecture Capabilities (CAC) as a web portal for PRA tools.

  15. Data Presentation and Visualization (DPV) Interface Control Document

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    Data Presentation and Visualization (DPV) is a subset of the modeling and simulation (M&S) capabilities at Kennedy Space Center (KSC) that endeavors to address the challenges of how to present and share simulation output for analysts, stakeholders, decision makers, and other interested parties. DPV activities focus on the development and provision of visualization tools to meet the objectives identified above, as well as providing supporting tools and capabilities required to make its visualization products available and accessible across NASA.

  16. Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code

    NASA Technical Reports Server (NTRS)

    Freeh, Josh

    2003-01-01

    Integration of the entire system includes: fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) Optimization tools; 2) Gas turbine models for hybrid systems; 3) Increased interplay between subsystems; 4) Off-design modeling capabilities; 5) Altitude effects; and 6) Existing transient modeling architecture. Other factors include: 1) Easier transfer between users and groups of users; 2) General aerospace industry acceptance and familiarity; and 3) A flexible analysis tool that can also be used for ground power applications.

  17. A variable-temperature nanostencil compatible with a low-temperature scanning tunneling microscope/atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steurer, Wolfram, E-mail: wst@zurich.ibm.com; Gross, Leo; Schlittler, Reto R.

    2014-02-15

    We describe a nanostencil lithography tool capable of operating at variable temperatures down to 30 K. The setup is compatible with a combined low-temperature scanning tunneling microscope/atomic force microscope located within the same ultra-high-vacuum apparatus. The lateral movement capability of the mask allows the patterning of complex structures. To demonstrate operational functionality of the tool and estimate temperature drift and blurring, we fabricated LiF and NaCl nanostructures on Cu(111) at 77 K.

  18. A variable-temperature nanostencil compatible with a low-temperature scanning tunneling microscope/atomic force microscope.

    PubMed

    Steurer, Wolfram; Gross, Leo; Schlittler, Reto R; Meyer, Gerhard

    2014-02-01

    We describe a nanostencil lithography tool capable of operating at variable temperatures down to 30 K. The setup is compatible with a combined low-temperature scanning tunneling microscope/atomic force microscope located within the same ultra-high-vacuum apparatus. The lateral movement capability of the mask allows the patterning of complex structures. To demonstrate operational functionality of the tool and estimate temperature drift and blurring, we fabricated LiF and NaCl nanostructures on Cu(111) at 77 K.

  19. Implementation and benefits of advanced process control for lithography CD and overlay

    NASA Astrophysics Data System (ADS)

    Zavyalova, Lena; Fu, Chong-Cheng; Seligman, Gary S.; Tapp, Perry A.; Pol, Victor

    2003-05-01

    Due to rapidly shrinking imaging process windows and increasingly stringent device overlay requirements, sub-130 nm lithography processes are more severely impacted than ever by systematic faults. Limits on critical dimension (CD) and overlay capability further challenge the operational effectiveness of a mix-and-match environment using multiple lithography tools, as such a mode additionally consumes the available error budgets. Therefore, a focus on advanced process control (APC) methodologies is key to gaining control in the lithographic modules for critical device levels, which in turn translates to accelerated yield learning, achieving time-to-market lead, and ultimately a higher return on investment. This paper describes the implementation and unique challenges of a closed-loop CD and overlay control solution in high-volume manufacturing of leading-edge devices. A particular emphasis has been placed on developing a flexible APC application capable of managing a wide range of control aspects such as process and tool drifts, single and multiple lot excursions, referential overlay control, 'special lot' handling, advanced model hierarchy, and automatic model seeding. Specific integration cases, including the multiple-reticle complementary phase shift lithography process, are discussed. A continuous improvement in overlay and CD Cpk performance as well as in the rework rate has been observed through the implementation of this system, and the results are studied.
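Two of the ideas the abstract relies on can be sketched in a few lines: the Cpk capability index used to track CD and overlay performance, and an EWMA run-to-run controller of the kind commonly used in lithography APC to compensate slow tool drifts. This is a generic textbook sketch with invented numbers, not the paper's production system.

```python
def cpk(mean, sigma, lsl, usl):
    """Process capability index: distance to the nearest spec limit in 3-sigma units."""
    return min(usl - mean, mean - lsl) / (3.0 * sigma)

class EwmaController:
    """Run-to-run control for y_k = u_k + disturbance: estimate the slowly
    drifting disturbance with an EWMA and pick the next recipe offset u
    so the output is driven back to target."""
    def __init__(self, target, weight=0.3):
        self.target = target
        self.weight = weight
        self.disturbance = 0.0

    def next_recipe(self, last_recipe, measured):
        observed = measured - last_recipe          # disturbance seen this run
        self.disturbance = (self.weight * observed
                            + (1 - self.weight) * self.disturbance)
        return self.target - self.disturbance

ctl = EwmaController(target=100.0)   # e.g. a 100 nm CD target
u = ctl.target
for _ in range(50):                  # constant +3 nm tool drift
    y = u + 3.0
    u = ctl.next_recipe(u, y)
print(round(u, 3))  # recipe settles near 97.0, cancelling the drift
```

With the drift cancelled, the CD distribution re-centres on target, which is exactly what lifts the Cpk figures the abstract reports.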

  20. Microvascular anastomosis in rodent model evaluated by Fourier domain Doppler optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Tong, Dedi; Zhu, Shan; Wu, Lehao; Ibrahim, Zuhaib; Lee, WP Andrew; Brandacher, Gerald; Kang, Jin U.

    2014-03-01

    Vascular and microvascular anastomoses are critical components of reconstructive microsurgery, vascular surgery and transplant surgery. An imaging modality that provides an immediate, real-time in-depth view and 3D structure and flow information of the surgical site can be a valuable tool for the surgeon to evaluate surgical outcome following both conventional and innovative anastomosis techniques, and thus potentially increase the surgical success rate. Microvascular anastomosis for vessels with outer diameter smaller than 1.0 mm is extremely challenging, and effective evaluation of the outcome is very difficult if not impossible using computed tomography (CT) angiograms, magnetic resonance (MR) angiograms and ultrasound Doppler. Optical coherence tomography (OCT) is a non-invasive, high-resolution (micron level), high-speed, 3D imaging modality that has been adopted widely in biomedical and clinical applications. Phase-resolved Doppler OCT, which explores the phase information of OCT signals, has been shown to be capable of characterizing dynamic blood flow clinically. In this work, we explore the capability of Fourier domain Doppler OCT as an evaluation tool to detect commonly encountered post-operative complications that cause surgical failure and to confirm positive results against the surgeon's observations. Both suture and cuff based techniques were evaluated on the femoral artery and vein in the rodent model.
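The phase-resolved Doppler principle named above (the general technique, not the authors' specific processing chain) converts the phase shift between successive A-scans into an axial flow velocity. A minimal sketch, with illustrative parameter values:

```python
import math

# Standard phase-resolved Doppler OCT relation:
#     v_axial = lambda0 * dphi / (4 * pi * n * tau)
# lambda0: centre wavelength, n: tissue refractive index,
# tau: time between successive A-scans.

def axial_velocity(dphi_rad, lambda0_m, n, tau_s):
    return lambda0_m * dphi_rad / (4.0 * math.pi * n * tau_s)

v = axial_velocity(dphi_rad=math.pi / 2,   # measured inter-A-scan phase shift
                   lambda0_m=1310e-9,      # a typical swept-source wavelength
                   n=1.35,                 # soft-tissue refractive index
                   tau_s=1.0 / 100e3)      # 100 kHz A-scan rate
print(f"{v * 1e3:.3f} mm/s")
```

Absent or reversed flow at the anastomosis (dphi near zero or negative) is the kind of post-operative complication signal such a measurement exposes.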

  1. ARM Data File Standards Version: 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kehoe, Kenneth; Beus, Sherman; Cialella, Alice

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and facilitate development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.
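The required/recommended hierarchy the document proposes lends itself to automated compliance checking, which is what enables the monitoring tools mentioned above. A minimal sketch, modelling a file header as a plain dict; the attribute names are plausible examples, not the official ARM list:

```python
# Split the standard into required attributes (compliance fails without them)
# and recommended attributes (reported but non-fatal), then audit a header.

REQUIRED = {"site_id", "platform_id", "data_level", "datastream"}
RECOMMENDED = {"doi", "process_version"}

def check_standards(attrs):
    missing_required = sorted(REQUIRED - attrs.keys())
    missing_recommended = sorted(RECOMMENDED - attrs.keys())
    return {"compliant": not missing_required,
            "missing_required": missing_required,
            "missing_recommended": missing_recommended}

header = {"site_id": "sgp", "platform_id": "met", "data_level": "b1"}
report = check_standards(header)
print(report["compliant"])          # False: 'datastream' is missing
print(report["missing_required"])   # ['datastream']
```

Run over 150,000 new files a month, this kind of check is what turns a written standard into the automated data health status tooling the abstract describes.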

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palanisamy, Giri

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing data volumes. They will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities for delivering data on demand, tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  3. Demonstration of New OLAF Capabilities and Technologies

    NASA Astrophysics Data System (ADS)

    Kingston, C.; Palmer, E.; Stone, J.; Neese, C.; Mueller, B.

    2017-06-01

    Upgrades to the On-Line Archiving Facility (OLAF) PDS tool are leading to improved usability and additional functionality by integration of JavaScript web app frameworks. Also included is the capability to upload tabular data as CSV files.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mattsson, Ann E.

    Density Functional Theory (DFT) based Equation of State (EOS) construction is a prominent part of Sandia’s capabilities to support engineering sciences. This capability is based on augmenting experimental data with information gained from computational investigations, especially in those parts of the phase space where experimental data is hard, dangerous, or expensive to obtain. A key part of the success of the Sandia approach is the fundamental science work supporting the computational capability. Not only does this work enhance the capability to perform highly accurate calculations but it also provides crucial insight into the limitations of the computational tools, providing high confidence in the results even where results cannot be, or have not yet been, validated by experimental data. This report concerns the key ingredient of projector augmented-wave (PAW) potentials for use in pseudo-potential computational codes. Using the tools discussed in SAND2012-7389 we assess the standard Vienna Ab-initio Simulation Package (VASP) PAWs for Molybdenum.

  5. The environment workbench: A design tool for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Jongeward, Gary A.; Kuharski, Robert A.; Rankin, Thomas V.; Wilcox, Katherine G.; Roche, James C.

    1991-01-01

    The environment workbench (EWB) is being developed for NASA by S-CUBED to provide a standard tool that can be used by the Space Station Freedom (SSF) design and user community for requirements verification. The desktop tool will predict and analyze the interactions of SSF with its natural and self-generated environments. A brief review of the EWB design and capabilities is presented. Calculations of the on-orbit floating potentials and contaminant environment of SSF, made using a prototype EWB, are also presented. Both the positive and negative grounding configurations for the solar arrays are examined to demonstrate the capability of the EWB to provide quick estimates of environments, interactions, and system effects.

  6. Automatic Data Traffic Control on DSM Architecture

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)

    2000-01-01

    We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods that users can employ to improve data traffic in their codes. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.

  7. Modern CACSD using the Robust-Control Toolbox

    NASA Technical Reports Server (NTRS)

    Chiang, Richard Y.; Safonov, Michael G.

    1989-01-01

    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools like singular values and structured singular values, robust synthesis tools like continuous/discrete H(exp 2)/H infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H infinity loop-shaping and large space structure model reduction.
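The singular-value analysis named above can be sketched outside MATLAB as well. A minimal Python/NumPy version: compute the singular values of a multivariable frequency response G(jw) = C(jwI - A)^{-1}B + D on a frequency grid; the 2x2 state-space matrices below are an arbitrary example, not from the toolbox.

```python
import numpy as np

# Diagonal example system: two decoupled first-order lags with poles at
# -1 and -10, so the singular values at each frequency are simply the
# magnitudes 1/|jw+1| and 1/|jw+10|.
A = np.array([[-1.0, 0.0], [0.0, -10.0]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))

def sigma(A, B, C, D, w):
    """Singular values of the frequency response at angular frequency w."""
    G = C @ np.linalg.inv(1j * w * np.eye(A.shape[0]) - A) @ B + D
    return np.linalg.svd(G, compute_uv=False)

for w in (0.1, 1.0, 10.0):
    s = sigma(A, B, C, D, w)
    print(w, s.max(), s.min())  # sigma-bar and sigma-underbar
```

Plotting the max/min singular values over frequency gives the sigma-plot the toolbox uses to assess multivariable gain and robustness margins.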

  8. BASINS and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario-based assessments of the potential future effects of climate change on water resources.

  9. Fusing corn nitrogen recommendation tools for an improved canopy reflectance sensor performance

    USDA-ARS?s Scientific Manuscript database

    Nitrogen (N) rate recommendation tools are utilized to help producers maximize corn grain yield production. Many of these tools provide recommendations at field scales but often fail when corn N requirements are variable across the field. Canopy reflectance sensors are capable of capturing within-fi...

  10. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  11. Science opportunity analyzer - a multi-mission tool for planning

    NASA Technical Reports Server (NTRS)

    Streiffert, B. A.; Polanskey, C. A.; O'Reilly, T.; Colwell, J.

    2002-01-01

    For many years the diverse scientific community that supports JPL's wide variety of interplanetary space missions has needed a tool in order to plan and develop their experiments. The tool needs to be easily adapted to various mission types and portable to the user community. The Science Opportunity Analyzer, SOA, now in its third year of development, is intended to meet this need. SOA is a Java-based application that is designed to enable scientists to identify and analyze opportunities for science observations from spacecraft. It differs from other planning tools in that it does not require an in-depth knowledge of the spacecraft command system or operation modes to begin high level planning. Users can, however, develop increasingly detailed levels of design. SOA consists of six major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, Data Output and Communications. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. The user is given extensive flexibility to customize what is displayed in the view. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules.
This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output generates information based on the spacecraft's trajectory, opportunity search results, or a created observation. The data can be viewed either in tabular format or as a graph. Finally, SOA is unique in that it is designed to communicate with a variety of existing planning and sequencing tools. From the very beginning SOA was designed with the user in mind. Extensive surveys of the potential user community were conducted in order to develop the software requirements. Throughout the development period, close ties have been maintained with the science community to ensure that the tool maintains its user focus. Although development is still in its early stages, SOA is already developing a user community on the Cassini project, which depends on this tool for its science planning. There are other tools at JPL that do various pieces of what SOA can do; however, no other tool combines all these functions and presents them to the user in such a convenient, cohesive, and easy-to-use fashion.

  12. Strategy for NEO follow-up observations

    NASA Astrophysics Data System (ADS)

    Tichy, Milos; Honkova, Michaela; Ticha, Jana; Kocer, Michal

    2015-03-01

    The Near-Earth Objects (NEOs) are among the most important small bodies in the solar system, having the capability of close approaches to the Earth and even the possibility of colliding with the Earth. In fact, it is impossible to calculate a reliable orbit for an object from a single night's observations. Therefore it is necessary to extend the astrometry dataset by early follow-up astrometry. Follow-up observations of a newly discovered NEO candidate should be made over an arc of several hours after the discovery and should be repeated over several following nights. The basic service used for planning follow-up observations is the NEO Confirmation Page (NEOCP) maintained by the Minor Planet Center of the IAU. This service provides an on-line tool for calculating geocentric and topocentric ephemerides and sky-plane uncertainty maps of these objects at a specific date and time. The uncertainty map is one of the most important pieces of information used for planning the follow-up observation strategy for a given time, indicating also the estimated distance of the newly discovered object and including the possibility of impact. Moreover, observatories regularly engaged in NEO follow-up have prepared special tools and systems for follow-up work. The system and strategy for NEO follow-up observation used at the Klet Observatory are described here. Methods and techniques used in the Klet NEO follow-up CCD astrometric programme, using 1.06-m and 0.57-m telescopes, are also discussed.

  13. A PDA study management tool (SMT) utilizing wireless broadband and full DICOM viewing capability

    NASA Astrophysics Data System (ADS)

    Documet, Jorge; Liu, Brent; Zhou, Zheng; Huang, H. K.; Documet, Luis

    2007-03-01

During the last 4 years, the IPI (Image Processing and Informatics) Laboratory has been developing a web-based Study Management Tool (SMT) application that allows radiologists, film librarians, and PACS-related (Picture Archiving and Communication System) users to dynamically and remotely perform Query/Retrieve operations in a PACS network. Using a regular PDA (Personal Digital Assistant), users can remotely query a PACS archive to distribute any study to an existing DICOM (Digital Imaging and Communications in Medicine) node. This application, which has proven convenient for managing the study workflow [1, 2], has been extended to include a DICOM viewing capability on the PDA. With this new feature, users can take a quick look at DICOM images, gaining both mobility and convenience. In addition, we are extending this application to metropolitan-area wireless broadband networks. This feature requires smart phones that can work as a PDA and have access to broadband wireless services. With the extension to wireless broadband technology and the preview of DICOM images, the Study Management Tool becomes an even more powerful tool for clinical workflow management.

  14. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  15. McIDAS-V: A Data Analysis and Visualization Tool for Global Satellite Data

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T. D.

    2011-12-01

The Man-computer Interactive Data Access System (McIDAS-V) is a Java-based, open-source, freely available system for scientists, researchers, and algorithm developers working with atmospheric data. The McIDAS-V software tools provide powerful new data manipulation and visualization capabilities, including 4-dimensional displays, an abstract data model with integrated metadata, user-defined computation, and a powerful scripting capability. As such, McIDAS-V is a valuable tool for scientists and researchers within the GEO and GEOSS domains. The advancing polar and geostationary orbit environmental satellite missions conducted by several countries will carry advanced instrumentation and systems that will collect and distribute land, ocean, and atmosphere data. These systems provide atmospheric and sea surface temperatures, humidity soundings, cloud and aerosol properties, and numerous other environmental products. This presentation will display and demonstrate some of the capabilities of McIDAS-V to analyze and display high temporal and spectral resolution data, using examples from international environmental satellites.

  16. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  17. PHLUX: Photographic Flux Tools for Solar Glare and Flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-12-02

A web-based tool to a) analytically and empirically quantify glare from reflected light and determine the potential impact (e.g., temporary flash blindness, retinal burn), and b) produce flux maps for central receivers. The tool accepts RAW digital photographs of the glare source (for hazard assessment) or the receiver (for flux mapping), as well as a photograph of the sun for intensity and size scaling. For glare hazard assessment, the tool determines the retinal irradiance (W/cm2) and subtended source angle for an observer and plots the glare source on a hazard spectrum (i.e., low potential for flash blindness impact, potential for flash blindness impact, retinal burn). For flux mapping, the tool provides a colored map of the receiver scaled by incident solar flux (W/m2) and unwraps the physical dimensions of the receiver while accounting for the perspective of the photographer (e.g., for a flux map of a cylindrical receiver, the horizontal axis denotes receiver angle in degrees and the vertical axis denotes vertical position in meters; for a flat panel receiver, the horizontal axis denotes horizontal position in meters and the vertical axis denotes vertical position in meters). The flux mapping capability also allows the user to specify transects along which the program plots incident solar flux on the receiver.
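The subtended source angle that the tool plots against retinal irradiance follows from simple geometry. A minimal sketch, assuming a circular glare source of known diameter and distance (the helper name and values are hypothetical; PHLUX itself derives these quantities photographically):

```python
import math

def subtended_angle_mrad(source_diameter_m, distance_m):
    """Angle subtended by a circular glare source at the observer, in mrad."""
    return 1000.0 * 2.0 * math.atan(source_diameter_m / (2.0 * distance_m))

# A 1 m diameter glare spot viewed from 1 km subtends about 1 mrad.
print(round(subtended_angle_mrad(1.0, 1000.0), 3))
```

Together with the measured retinal irradiance, this angle locates the source on the hazard spectrum.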

  18. Optimization of turning process through the analytic flank wear modelling

    NASA Astrophysics Data System (ADS)

    Del Prete, A.; Franchi, R.; De Lorenzis, D.

    2018-05-01

In the present work, the approach used for the optimization of process capabilities for the machining of Oil&Gas components will be described. These components are machined by turning of stainless steel castings. For this purpose, a proper Design Of Experiments (DOE) plan has been designed and executed; as output of the experimentation, data about tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut has been held constant. Wear data have been obtained by means of observation of the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, subject to a constraint on a defined flank wear level. The technique used to solve the optimization problem is Multi Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
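The tool life modelling step described above can be sketched with a Taylor-type tool life equation fitted by log-linear regression. The cutting speeds, tool lives, and resulting coefficients below are hypothetical illustrations, not the paper's DOE data:

```python
import numpy as np

# Hypothetical DOE output: cutting speed V (m/min) vs. tool life T (min),
# where T is the machining time to reach the defined flank wear limit.
V = np.array([150.0, 180.0, 210.0, 240.0])
T = np.array([42.0, 25.0, 16.0, 11.0])

# Taylor tool life equation V * T**n = C  =>  log V = log C - n * log T.
slope, intercept = np.polyfit(np.log(T), np.log(V), 1)
n, C = -slope, np.exp(intercept)

def tool_life(v):
    """Predicted tool life (min) at cutting speed v (m/min)."""
    return (C / v) ** (1.0 / n)
```

A model like this feeds the multi-objective trade-off directly: raising the cutting speed lowers production time per part but shortens `tool_life`, increasing the number of cutting tools consumed.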

  19. Development and Overview of CPAS Sasquatch Airdrop Landing Location Predictor Software

    NASA Technical Reports Server (NTRS)

    Bledsoe, Kristin J.; Bernatovich, Michael A.

    2015-01-01

    The Capsule Parachute Assembly System (CPAS) is the parachute system for NASA's Orion spacecraft. CPAS is currently in the Engineering Development Unit (EDU) phase of testing. The test program consists of numerous drop tests, wherein a test article rigged with parachutes is extracted from an aircraft. During such tests, range safety is paramount, as is the recoverability of the parachutes and test article. It is crucial to establish a release point from the aircraft that will ensure that the article and all items released from it during flight will land in a designated safe area. The Sasquatch footprint tool was developed to determine this safe release point and to predict the probable landing locations (footprints) of the payload and all released objects. In 2012, a new version of Sasquatch, called Sasquatch Polygons, was developed that significantly upgraded the capabilities of the footprint tool. Key improvements were an increase in the accuracy of the predictions, and the addition of an interface with the Debris Tool (DT), an in-flight debris avoidance tool for use on the test observation helicopter. Additional enhancements include improved data presentation for communication with test personnel and a streamlined code structure. This paper discusses the development, validation, and performance of Sasquatch Polygons, as well as its differences from the original Sasquatch footprint tool.

  20. Satellite Contamination and Materials Outgassing Knowledge base

    NASA Technical Reports Server (NTRS)

    Minor, Jody L.; Kauffman, William J. (Technical Monitor)

    2001-01-01

Satellite contamination continues to be a design problem that engineers must take into account when developing new satellites. To help with this issue, NASA's Space Environments and Effects (SEE) Program funded the development of the Satellite Contamination and Materials Outgassing Knowledge base. This engineering tool brings together in one location information about the outgassing properties of aerospace materials based upon ground-testing data, the effects of outgassing observed during flight, and measurements of the contamination environment by on-orbit instruments. The knowledge base contains information based on ASTM Standard E-1559 and also consolidates data from missions using quartz-crystal microbalances (QCMs). The data contained in the knowledge base were shared with NASA by government agencies and industry in the US, as well as by international space agencies. The term 'knowledge base' was used because so much information and capability was brought together in one comprehensive engineering design tool. It is the SEE Program's intent to continually add material contamination data as they become available, creating a dynamic tool whose value to the user is ever increasing. The SEE Program firmly believes that NASA, and ultimately the entire contamination user community, will greatly benefit from this new engineering tool, and it highly encourages the community not only to use the tool but also to add data to it.

  1. Top-attack modeling and automatic target detection using synthetic FLIR scenery

    NASA Astrophysics Data System (ADS)

    Weber, Bruce A.; Penn, Joseph A.

    2004-09-01

A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets; an ARL target detection algorithm to detect embedded synthetic targets in scenes; and an ARL scoring algorithm, using Receiver Operating Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that with these tools a detailed, physically meaningful target detection analysis is possible, and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require extending these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competing detection algorithms, by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
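ROC scoring of a detector can be sketched in a few lines: sort the detection scores, sweep a threshold, and integrate true-positive rate against false-positive rate. This is a generic illustration of the technique, not the ARL scoring algorithm itself:

```python
import numpy as np

def roc_points(scores, labels):
    """FPR/TPR pairs obtained by sweeping a threshold over detector scores.

    labels: 1 for frames containing a target, 0 for clutter-only frames."""
    order = np.argsort(-np.asarray(scores))      # highest score first
    labels = np.asarray(labels)[order]
    tpr = np.cumsum(labels) / labels.sum()
    fpr = np.cumsum(1 - labels) / (len(labels) - labels.sum())
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

def auc(fpr, tpr):
    """Area under the ROC curve (trapezoidal rule)."""
    return float(np.sum((fpr[1:] - fpr[:-1]) * (tpr[1:] + tpr[:-1]) / 2.0))

# A detector that ranks every target above every clutter frame scores AUC = 1.
fpr, tpr = roc_points([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
print(auc(fpr, tpr))
```

Comparing AUC (or partial ROC areas) across emissivity and clutter conditions is one way to quantify the detection variability the experiments examine.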

  2. Micro Slot Generation by μ-ED Milling

    NASA Astrophysics Data System (ADS)

    Dave, H. K.; Mayanak, M. K.; Rajpurohit, S. R.; Mathai, V. J.

    2016-08-01

Micro electro discharge machining is one of the most widely used advanced micro machining techniques, owing to its capability to fabricate micro features on any electrically conductive material irrespective of its material properties. Despite its wide acceptability, the process is adversely affected by issues such as tool electrode wear, which results in the generation of inaccurate features. Micro ED milling, a process variant in which the tool electrode is simultaneously rotated and scanned during machining, is reported to have high process efficiency for the generation of complicated 3D shapes and features with relatively low electrode wear intensity. In the present study, an attempt has been made to study the effect of two process parameters, viz. capacitance and scanning speed of the tool electrode, on the end wear of the tool electrode and the overcut of micro slots generated by micro ED milling. The experiment has been conducted on Al 1100 alloy with a tungsten electrode of 300 μm diameter. Results suggest that the wear on the tool electrode and the overcut of the generated micro features are highly influenced by the level of capacitance employed during machining. For the parameter range employed in the present study, however, no significant effect of scanning speed variation was observed on either response.

  3. Integrating Wind Profiling Radars and Radiosonde Observations with Model Point Data to Develop a Decision Support Tool to Assess Upper-Level Winds for Space Launch

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Flinn, Clay

    2013-01-01

On the day of launch, the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers, including NASA's Launch Services Program and NASA's Ground Systems Development and Operations Program. They currently do not have the capability to display and overlay profiles of upper-level observations and numerical weather prediction model forecasts. The LWOs requested that the Applied Meteorology Unit (AMU) develop a tool in the form of a graphical user interface (GUI) that will allow them to plot upper-level wind speed and direction observations from the Kennedy Space Center (KSC) 50 MHz tropospheric wind profiling radar, the KSC Shuttle Landing Facility 915 MHz boundary layer wind profiling radar, and Cape Canaveral Air Force Station (CCAFS) Automated Meteorological Processing System (AMPS) radiosondes, and then overlay forecast wind profiles from model point data, including the North American Mesoscale (NAM), Rapid Refresh (RAP), and Global Forecast System (GFS) models, to assess the performance of these models. The AMU developed an Excel-based tool that provides an objective method for the LWOs to compare the model-forecast upper-level winds to the KSC wind profiling radar and CCAFS AMPS observations, to assess the models' potential to accurately forecast changes in the upper-level profile through the launch count. The AMU wrote Excel Visual Basic for Applications (VBA) scripts to automatically retrieve model point data for CCAFS (XMR) from the Iowa State University Archive Data Server (http://mtarchive.qeol.iastate.edu) and the 50 MHz, 915 MHz, and AMPS observations from the NASA/KSC Spaceport Weather Data Archive web site (http://trmm.ksc.nasa.gov). The AMU then developed code in Excel VBA to automatically ingest and format the observations and model point data in Excel, readying the data for generating Excel charts for the LWOs.
The resulting charts allow the LWOs to independently compare the three models' 0-hour forecasts against the observations to determine the best-performing model, and then overlay the model forecasts on time-matched observations during the launch countdown to further assess model performance. This paper will demonstrate the integration of observed and predicted atmospheric conditions into a decision support tool and show how the GUI is implemented in operations.
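At its core, the model-versus-observation assessment the tool charts reduces to comparing time-matched profiles. A minimal numeric sketch (the wind values below are invented, and the AMU tool does this in Excel VBA rather than Python):

```python
import numpy as np

# Hypothetical wind speeds (m/s) at matched altitudes during a launch count.
obs = np.array([12.0, 18.0, 25.0, 33.0, 41.0])    # e.g. 50 MHz profiler
fcst = np.array([11.0, 19.5, 24.0, 35.0, 40.0])   # e.g. a model 0-hour forecast

bias = float(np.mean(fcst - obs))                  # mean forecast error
rmse = float(np.sqrt(np.mean((fcst - obs) ** 2)))  # root-mean-square error
print(round(bias, 2), round(rmse, 2))
```

Computing such summary statistics per model against each observing system gives an objective basis for picking the best-performing model for that count.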

  4. Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT

    NASA Technical Reports Server (NTRS)

    Maxwell, Thomas

    2012-01-01

Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.

  5. CEREBRA: a 3-D visualization tool for brain network extracted from fMRI data.

    PubMed

    Nasir, Baris; Yarman Vural, Fatos T

    2016-08-01

In this paper, we introduce a new tool, CEREBRA, to visualize the 3D network of the human brain extracted from fMRI data. The tool aims to analyze brain connectivity by representing the selected voxels as the nodes of a network. The edge weights among the voxels are estimated by considering the relationships among the voxel time series. The tool enables researchers to observe the active brain regions and the interactions among them by using graph theoretic measures, such as the edge weight and node degree distributions. CEREBRA provides an interactive interface with basic display and editing options for researchers to study their hypotheses about the connectivity of the brain network. CEREBRA interactively simplifies the network by selecting the active voxels and the most correlated edge weights. Researchers may remove voxels and edges by using local and global thresholds selected on the window. The built-in graph reduction algorithms then eliminate the irrelevant regions, voxels, and edges and display various properties of the network. The toolbox is capable of space-time representation of the voxel time series and estimated arc weights by using animated heat maps.
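The voxel-correlation network and global thresholding described above can be sketched as follows (synthetic time series; CEREBRA's actual edge-weight estimators and reduction algorithms are more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)
ts = rng.standard_normal((5, 100))                 # 5 voxels x 100 time points
ts[1] = ts[0] + 0.1 * rng.standard_normal(100)     # voxels 0 and 1 co-activate

weights = np.corrcoef(ts)                          # edge weights from time series
np.fill_diagonal(weights, 0.0)                     # drop self-loops

adjacency = np.abs(weights) >= 0.5                 # global threshold on |weight|
degree = adjacency.sum(axis=0)                     # node degree distribution
print(bool(adjacency[0, 1]), degree.tolist())
```

Raising the threshold prunes weak edges, which is the simplification step the interface exposes interactively.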

  6. Atmospheric Delay Reduction Using KARAT for GPS Analysis and Implications for VLBI

    NASA Technical Reports Server (NTRS)

    Ichikawa, Ryuichi; Hobiger, Thomas; Koyama, Yasuhiro; Kondo, Tetsuro

    2010-01-01

We have been developing a state-of-the-art tool to estimate atmospheric path delays by raytracing through mesoscale analysis (MANAL) data, which is operationally used for numerical weather prediction by the Japan Meteorological Agency (JMA). The tools, which we have named the KAshima RAytracing Tools (KARAT), are capable of calculating total slant delays and ray-bending angles considering real atmospheric phenomena. KARAT can estimate atmospheric slant delays using an analytical 2-D ray-propagation model by Thayer and a 3-D Eikonal solver. We compared PPP solutions using KARAT with those using the Global Mapping Function (GMF) and Vienna Mapping Function 1 (VMF1) for GPS sites of GEONET (GPS Earth Observation Network System), operated by the Geographical Survey Institute (GSI). In our comparison, 57 GEONET stations during the year 2008 were processed. The KARAT solutions are slightly better than the solutions using VMF1 and GMF with a linear gradient model for horizontal and height positions. Our results imply that KARAT is a useful tool for efficient reduction of atmospheric path delays in radio-based space geodetic techniques such as GNSS and VLBI.
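For context, the mapping functions KARAT is compared against convert a zenith delay into a slant delay at a given elevation angle. A textbook sketch, using the crude 1/sin(elevation) flat-atmosphere mapping rather than GMF/VMF1's continued-fraction forms or KARAT's raytracing:

```python
import math

def slant_delay(zenith_delay_m, elevation_deg):
    """First-order slant delay from a zenith delay via 1/sin(elevation).

    Real mapping functions (GMF, VMF1) refine this with continued fractions;
    KARAT replaces the mapping entirely with raytracing through MANAL data."""
    return zenith_delay_m / math.sin(math.radians(elevation_deg))

# A typical ~2.4 m zenith tropospheric delay roughly doubles at 30 degrees.
print(slant_delay(2.4, 30.0))
```

The steep growth of the delay at low elevations is why mapping-function accuracy dominates the height-coordinate error budget.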

  7. Early development of Science Opportunity Analysis tools for the Jupiter Icy Moons Explorer (JUICE) mission

    NASA Astrophysics Data System (ADS)

    Cardesin Moinelo, Alejandro; Vallat, Claire; Altobelli, Nicolas; Frew, David; Llorente, Rosario; Costa, Marc; Almeida, Miguel; Witasse, Olivier

    2016-10-01

JUICE is the first large mission in the framework of ESA's Cosmic Vision 2015-2025 program. JUICE will survey the Jovian system with a special focus on three of the Galilean moons: Europa, Ganymede, and Callisto. The mission has recently been adopted, and significant effort is being made by the Science Operations Center (SOC) at the European Space and Astronomy Centre (ESAC) in Madrid to develop tools that provide the necessary support to the Science Working Team (SWT) for science opportunity analysis and early assessment of science operation scenarios. This contribution will outline some of the tools being developed within ESA and in collaboration with the Navigation and Ancillary Information Facility (NAIF) at JPL. The Mission Analysis and Payload Planning Support (MAPPS) tool is developed by ESA and has been used by most of ESA's planetary missions to generate and validate science observation timelines for the simulation of payload and spacecraft operations. MAPPS has the capability to compute and display all the necessary geometrical information, such as distances, illumination angles, and the projected field-of-view of an imaging instrument on the surface of a given body, and a preliminary setup is already in place for the early assessment of JUICE science operations. NAIF provides valuable SPICE support to the JUICE mission, and several tools are being developed to compute and visualize science opportunities. In particular, the WebGeoCalc and Cosmographia systems are provided by NAIF to compute time windows and create animations of the observation geometry available via traditional SPICE data files, such as planet orbits, spacecraft trajectory, spacecraft orientation, instrument field-of-view "cones", and instrument footprints.
Other software tools are being developed by ESA and other collaborating partners to support the science opportunity analysis for all missions, like the SOLab (Science Operations Laboratory) or new interfaces for observation definitions and opportunity window databases.

  8. Advanced Power System Analysis Capabilities

    NASA Technical Reports Server (NTRS)

    1997-01-01

    As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.

  9. Performance, results, and prospects of the visible spectrograph VEGA on CHARA

    NASA Astrophysics Data System (ADS)

    Mourard, Denis; Challouf, Mounir; Ligi, Roxanne; Bério, Philippe; Clausse, Jean-Michel; Gerakis, Jérôme; Bourges, Laurent; Nardetto, Nicolas; Perraut, Karine; Tallon-Bosc, Isabelle; McAlister, H.; ten Brummelaar, T.; Ridgway, S.; Sturmann, J.; Sturmann, L.; Turner, N.; Farrington, C.; Goldfinger, P. J.

    2012-07-01

In this paper, we review the current performance of the VEGA/CHARA visible spectrograph and survey the most recent astrophysical results. The science programs benefit from the exceptional angular resolution, the unique spectral resolution, and one of the main features of CHARA: parallel infrared and visible operation. We also discuss recent developments concerning the tools for the preparation of observations and important features of the data reduction software. A short discussion of future developments completes the presentation, directed towards new detectors and possible new beam combination schemes for improved sensitivity and imaging capabilities.

  10. Esophageal cancer detection based on tissue surface-enhanced Raman spectroscopy and multivariate analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Chen, Weisheng; Wang, Yue; Chen, Rong; Zeng, Haishan

    2013-01-01

The capability of using silver-nanoparticle-based near-infrared surface-enhanced Raman scattering (SERS) spectroscopy combined with principal component analysis (PCA) and linear discriminant analysis (LDA) to differentiate esophageal cancer tissue from normal tissue is presented. Significant differences in the Raman intensities of prominent SERS bands were observed between normal and cancer tissues. PCA-LDA multivariate analysis of the measured tissue SERS spectra achieved a diagnostic sensitivity of 90.9% and a specificity of 97.8%. This exploratory study demonstrated great potential for developing label-free tissue SERS analysis into a clinical tool for esophageal cancer detection.
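The reported diagnostic sensitivity and specificity are standard confusion-matrix quantities. A minimal sketch of their computation (the labels below are invented; the paper derives its figures from PCA-LDA classification of measured spectra):

```python
import numpy as np

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP). Label 1 = cancer."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    return tp / (tp + fn), tn / (tn + fp)

# One cancer sample missed, all normals correct: sensitivity 0.5, specificity 1.0.
print(sensitivity_specificity([1, 1, 0, 0], [1, 0, 0, 0]))
```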

  11. Objectively Optimized Observation Direction System Providing Situational Awareness for a Sensor Web

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Lary, D. J.

    2010-12-01

There is great utility in having a flexible and automated objective observation direction system for the decadal survey missions and beyond. Such a system allows us to optimize the observations made by a suite of sensors to address specific goals, from long-term monitoring to rapid response. We have developed such a prototype using a network of communicating software elements to control a heterogeneous network of sensor systems, which can have multiple modes and flexible viewing geometries. Our system makes sensor systems intelligent and situationally aware. Together they form a sensor web of multiple sensors working together, capable of automated target selection; i.e., the sensors “know” where they are, what they are able to observe, and what targets, with what priorities, they should observe. This system is implemented in three components. The first component is a Sensor Web simulator, which describes the capabilities and locations of each sensor as a function of time, whether orbital, sub-orbital, or ground based. The simulator has been implemented using AGI's Satellite Tool Kit (STK). STK makes it easy to analyze and visualize optimal solutions for complex space scenarios, perform complex analysis of land, sea, air, and space assets, and share results in one integrated solution. The second component is a target scheduler implemented with STK Scheduler. STK Scheduler is powered by a scheduling engine that finds better solutions in a shorter amount of time than traditional heuristic algorithms. The global search algorithm within this engine is based on neural network technology capable of finding solutions to larger and more complex problems and maximizing the value of limited resources. The third component is a modeling and data assimilation system.
It provides situational awareness by supplying the time evolution of uncertainty and information content metrics that are used to tell us what we need to observe and the priority we should give to the observations. A prototype of this component was implemented with AutoChem. AutoChem is NASA-released software constituting an automatic code generation, symbolic differentiation, analysis, documentation, and web-site creation tool for atmospheric chemical modeling and data assimilation. Its model is explicit and uses an adaptive time-step, error-monitoring time integration scheme for stiff systems of equations. AutoChem was the first such model with the facility to perform 4D-Var data assimilation and Kalman filtering. The project developed a control system with three main accomplishments. First, fully multivariate observational and theoretical information with associated uncertainties was combined using a full Kalman filter data assimilation system. Second, an optimal distribution of computations and data queries was achieved by utilizing high performance computers, load balancing, and a set of automatically mirrored databases. Third, inter-instrument bias correction was performed using machine learning. The PI for this project was Dr. David Lary of the UMBC Joint Center for Earth Systems Technology at NASA/Goddard Space Flight Center.
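The Kalman filter assimilation at the heart of the third component blends a model forecast with an observation, weighted by their respective uncertainties; the resulting variance reduction is exactly the "information content" signal that drives observation priorities. A scalar sketch of the analysis step (illustrative only, not the AutoChem implementation):

```python
def kalman_update(x_prior, p_prior, obs, obs_var):
    """Scalar Kalman filter analysis step.

    x_prior, p_prior: model forecast and its error variance.
    obs, obs_var: observation and its error variance."""
    gain = p_prior / (p_prior + obs_var)       # Kalman gain in [0, 1]
    x_post = x_prior + gain * (obs - x_prior)  # analysis: forecast nudged to obs
    p_post = (1.0 - gain) * p_prior            # analysis uncertainty shrinks
    return x_post, p_post

# Equal confidence in forecast (10 +/- var 4) and observation (12 +/- var 4):
# the analysis lands halfway, with halved variance.
print(kalman_update(10.0, 4.0, 12.0, 4.0))
```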

  12. Coastal Thematic Exploitation Platform (C-TEP): An innovative and collaborative platform to facilitate Big Data coastal research

    NASA Astrophysics Data System (ADS)

    Tuohy, Eimear; Clerc, Sebastien; Politi, Eirini; Mangin, Antoine; Datcu, Mihai; Vignudelli, Stefano; Illuzzi, Diomede; Craciunescu, Vasile; Aspetsberger, Michael

    2017-04-01

The Coastal Thematic Exploitation Platform (C-TEP) is an on-going European Space Agency (ESA) funded project to develop a web service dedicated to the observation of the coastal environment and to support coastal management and monitoring. For over 20 years, ESA satellites have provided a wealth of environmental data. The availability of an ever-increasing volume of environmental data from satellite remote sensing provides a unique opportunity for exploratory science and the development of coastal applications. However, the diversity and complexity of the Earth Observation (EO) data available, and the need for efficient data access, information extraction, data management, and high-specification processing tools, pose major challenges to achieving its full potential in terms of Big Data exploitation. C-TEP will provide a new means to handle the technical challenges of the observation of coastal areas and contribute to improved understanding and decision-making with respect to coastal resources and environments. C-TEP will unlock coastal knowledge and innovation as a collaborative, virtual work environment providing access to a comprehensive database of coastal EO data, in-situ data, model data, and the tools and processors necessary to fully exploit these vast and heterogeneous datasets. The cloud processing capabilities provided allow users to perform heavy processing tasks through a user-friendly Graphical User Interface (GUI). A connection to the PEPS (Plateforme pour l'Exploitation des Produits Sentinel) archive will provide data from Sentinel missions 1, 2, and 3. Automatic comparison tools will be provided to exploit the in-situ datasets in synergy with EO data. In addition, users may develop, test, and share their own advanced algorithms for the extraction of coastal information. Algorithm validation will be facilitated by the capability to compute statistics over long time series.
Finally, C-TEP subscription services will allow users to perform automatic monitoring of key indicators (water quality, water level, vegetation stress) from Near Real Time data. To demonstrate the benefits of C-TEP, three pilot cases have been implemented, each addressing specific, and highly topical, coastal research needs. These applications include change detection in land and seabed cover, water quality monitoring and reporting, and a coastal altimetry processor. The pilot cases demonstrate the wide scope of C-TEP and how it may contribute to European projects and international coastal networks. In conclusion, C-TEP aims to provide new services and tools which will revolutionise access to EO datasets, support multi-disciplinary research collaboration, and provide long-term data series and innovative services for the monitoring of coastal regions.

  13. Data and Tools | Concentrating Solar Power | NREL

    Science.gov Websites

    Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT™): SolarPILOT combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and...

  14. ECO-DRIVING MODELING ENVIRONMENT

    DOT National Transportation Integrated Search

    2015-11-01

    This research project aims to examine the eco-driving modeling capabilities of different traffic modeling tools available and to develop a driver-simulator-based eco-driving modeling tool to evaluate driver behavior and to reliably estimate or measur...

  15. Mechanics and energetics in tool manufacture and use: a synthetic approach.

    PubMed

    Wang, Liyu; Brodbeck, Luzius; Iida, Fumiya

    2014-11-06

    Tool manufacture and use are observed not only in humans but also in other animals such as mammals, birds and insects. Manufactured tools are used for biomechanical functions such as effective control of fluids and small solid objects and extension of reaching. These tools are passive and used with gravity and the animal users' own energy. From the perspective of evolutionary biology, manufactured tools are extended phenotypes of the genes of the animal and exhibit phenotypic plasticity. This incurs an energetic cost of manufacture as compared to the case with a fixed tool. This paper studies the mechanics and energetics of tool manufacture and use by non-human animals. Firstly, it investigates possible mechanical mechanisms of the use of passive manufactured tools. Secondly, it formulates the energetic cost of manufacture and analyses when phenotypic plasticity benefits an animal tool maker and user. We take a synthetic approach and use a controlled physical model, i.e. a robot arm. The robot is capable of additively manufacturing scoop and gripper structures from thermoplastic adhesives to pick and place fluid and solid objects, mimicking primates and birds manufacturing tools for a similar function. We evaluate the effectiveness of tool use in pick-and-place and explain the mechanism for gripper tools picking up solid objects with a solid-mechanics model. We propose a way to formulate the energetic cost of tool manufacture that includes modes of addition and reshaping, and use it to analyse the case of scoop tools. Experimental results show that with a single motor trajectory, the robot was able to effectively pick and place water, rice grains, a pebble and a plastic box with a scoop tool or gripper tools that were manufactured by itself. They also show that by changing the dimensions of scoop tools, the energetic cost of tool manufacture and use could be reduced. The work should also interest engineers designing adaptive machines. 
© 2014 The Author(s) Published by the Royal Society. All rights reserved.
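The abstract above mentions an energetic-cost formulation with addition and reshaping modes but does not give it in closed form. As a rough illustration only, the sketch below models the cost as a deposition term plus a reshaping term; the function name and all coefficients are made-up placeholders, not values from the paper.

```python
def manufacture_cost(added_mass_g, reshaped_mass_g,
                     e_add_j_per_g=3000.0, e_reshape_j_per_g=1200.0):
    """Toy energetic cost (J) of making a tool: energy to deposit new
    thermoplastic (addition mode) plus energy to rework existing material
    (reshaping mode). Coefficients are placeholders, not measured values."""
    return added_mass_g * e_add_j_per_g + reshaped_mass_g * e_reshape_j_per_g

# A smaller scoop needs less deposited material, so it costs less energy,
# consistent with the reported effect of changing scoop dimensions.
big = manufacture_cost(10.0, 2.0)    # 32400.0 J
small = manufacture_cost(6.0, 2.0)   # 20400.0 J
print(big > small)  # True
```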

  17. Improvements to Integrated Tradespace Analysis of Communications Architectures (ITACA) Network Loading Analysis Tool

    NASA Technical Reports Server (NTRS)

    Lee, Nathaniel; Welch, Bryan W.

    2018-01-01

    NASA's SCENIC project aims to simplify and reduce the cost of space mission planning by replicating the analysis capabilities of commercially licensed software, integrated with analysis parameters specific to SCaN assets and SCaN-supported user missions. SCENIC differs from current tools that perform similar analyses in that it (1) does not require any licensing fees and (2) provides an all-in-one package for analysis capabilities that would normally require add-ons or multiple tools to complete. As part of SCENIC's capabilities, the ITACA network loading analysis tool will be responsible for assessing the loading on a given network architecture and generating a network service schedule. ITACA will allow users to evaluate the quality of service of a given network architecture and determine whether or not the architecture will satisfy the mission's requirements. ITACA is currently under development, and the following improvements were made during the fall of 2017: optimization of runtime, augmentation of network asset pre-service configuration time, augmentation of Brent's method of root finding, augmentation of network asset FOV restrictions, augmentation of mission lifetimes, and the integration of a SCaN link budget calculation tool. The improvements resulted in (a) a 25% reduction in runtime, (b) more accurate contact window predictions when compared to STK(Registered Trademark) contact window predictions, and (c) increased fidelity through the use of specific SCaN asset parameters.
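The contact-window computation mentioned above reduces to root finding: a rise or set time is where an asset's elevation relative to its field-of-view limit crosses zero. The toy sketch below (not SCENIC/ITACA code; the sinusoidal "elevation" is invented for illustration) brackets and locates such crossings. ITACA reportedly uses Brent's method, which adds inverse-quadratic interpolation for faster convergence; plain bisection keeps this sketch short.

```python
import math

def elevation_deg(t):
    # Toy stand-in for elevation (degrees) above an asset's horizon mask as a
    # function of time t in minutes; NOT an orbital-mechanics model.
    return 25.0 * math.sin(2 * math.pi * t / 100.0) - 10.0

def find_crossing(f, lo, hi, tol=1e-9):
    """Locate a sign change of f on [lo, hi] by bisection; Brent's method
    would converge faster but needs more bookkeeping than a sketch merits."""
    flo = f(lo)
    assert flo * f(hi) < 0, "interval must bracket a crossing"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if flo * f(mid) <= 0:
            hi = mid
        else:
            lo, flo = mid, f(mid)
    return 0.5 * (lo + hi)

# A contact window is the interval where elevation > 0: rise and set times.
rise = find_crossing(elevation_deg, 0.0, 25.0)
sett = find_crossing(elevation_deg, 25.0, 50.0)
print(round(rise, 2), round(sett, 2))  # 6.55 43.45
```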

  18. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it became imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open source development to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage such as uncertainty quantification for hazard analysis using physical models.

  19. WENESSA, Wide Eye-Narrow Eye Space Simulation for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Albarait, O.; Payne, D. M.; LeVan, P. D.; Luu, K. K.; Spillar, E.; Freiwald, W.; Hamada, K.; Houchard, J.

    In an effort to achieve timelier indications of anomalous object behaviors in geosynchronous earth orbit, a Planning Capability Concept (PCC) for a “Wide Eye-Narrow Eye” (WE-NE) telescope network has been established. The PCC addresses the problem of providing continuous and operationally robust, layered and cost-effective, Space Situational Awareness (SSA) that is focused on monitoring deep space for anomalous behaviors. It does this by first detecting the anomalies with wide field of regard systems, and then providing reliable handovers for detailed observational follow-up by another optical asset. WENESSA will explore the added value of such a system to the existing Space Surveillance Network (SSN). The study will assess and quantify the degree to which the PCC completely fulfills, or improves or augments, these deep space knowledge deficiencies relative to current operational systems. In order to improve organic simulation capabilities, we will explore options for the federation of diverse community simulation approaches, while evaluating the efficiencies offered by a network of small and larger aperture, ground-based telescopes. Existing Space Modeling and Simulation (M&S) tools designed for evaluating WENESSA-like problems will be taken into consideration as we proceed in defining and developing the tools needed to perform this study, leading to the creation of a unified Space M&S environment for the rapid assessment of new capabilities. The primary goal of this effort is to perform a utility assessment of the WE-NE concept. The assessment will explore the mission utility of various WE-NE concepts in discovering deep space anomalies in concert with the SSN. The secondary goal is to generate an enduring modeling and simulation environment to explore the utility of future proposed concepts and supporting technologies. 
Ultimately, our validated simulation framework would support the inclusion of other ground- and space-based SSA assets through integrated analysis. Options will be explored using at least two competing simulation capabilities, but emphasis will be placed on reasoned analyses as supported by the simulations.

  20. The Defense Threat Reduction Agency's Technical Nuclear Forensics Research and Development Program

    NASA Astrophysics Data System (ADS)

    Franks, J.

    2015-12-01

    The Defense Threat Reduction Agency (DTRA) Technical Nuclear Forensics (TNF) Research and Development (R&D) Program's overarching goal is to design, develop, demonstrate, and transition advanced technologies and methodologies that improve the interagency operational capability to provide forensics conclusions after the detonation of a nuclear device. This goal is attained through the execution of three focus areas covering the span of the TNF process to enable strategic decision-making (attribution): (1) Nuclear Forensic Materials Exploitation: development of targeted technologies, methodologies and tools enabling the timely collection, analysis and interpretation of detonation materials. (2) Prompt Nuclear Effects Exploitation: improved ground-based capabilities to collect prompt nuclear device outputs and effects data for rapid, complementary and corroborative information. (3) Nuclear Forensics Device Characterization: development of a validated and verified capability to reverse model a nuclear device with high confidence from observables (e.g., prompt diagnostics, sample analysis, etc.) seen after an attack. This presentation will outline DTRA's TNF R&D strategy and current investments, with efforts focusing on: (1) introducing new technical data collection capabilities (e.g., ground-based prompt diagnostics sensor systems; innovative debris collection and analysis); (2) developing new TNF process paradigms and concepts of operations to decrease timelines and uncertainties, and increase results confidence; (3) enhanced validation and verification (V&V) of capabilities through technology evaluations and demonstrations; and (4) updated weapon output predictions to account for the modern threat environment. A key challenge to expanding these efforts to a global capability is the need for increased post-detonation TNF international cooperation, collaboration and peer reviews.

  1. Earth Observations for Early Detection of Agricultural Drought: Contributions of the Famine Early Warning Systems Network (FEWS NET)

    NASA Astrophysics Data System (ADS)

    Budde, M. E.; Funk, C.; Husak, G. J.; Peterson, P.; Rowland, J.; Senay, G. B.; Verdin, J. P.

    2016-12-01

    The U.S. Geological Survey (USGS) has a long history of supporting the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET) program. The use of remote sensing and crop modeling to address food security threats in the form of drought, floods, pests, and changing climatic regimes has been a core activity in monitoring FEWS NET countries. In recent years, it has become a requirement that FEWS NET apply monitoring and modeling frameworks at global scales to assess emerging crises in regions that FEWS NET does not traditionally monitor. USGS FEWS NET, in collaboration with the University of California, Santa Barbara, has developed a number of new global applications of satellite observations, derived products, and efficient tools for visualization and analyses to address these requirements. (1) A 35-year quasi-global (+/- 50 degrees latitude) time series of gridded rainfall estimates, the Climate Hazards Infrared Precipitation with Stations (CHIRPS) dataset, based on infrared satellite imagery and station observations. Data are available as 5-day (pentadal) accumulations at 0.05 degree spatial resolution. (2) Global actual evapotranspiration data based on application of the Simplified Surface Energy Balance (SSEB) model using 10-day MODIS Land Surface Temperature composites at 1-km resolution. (3) Production of global expedited MODIS (eMODIS) 10-day NDVI composites updated every 5 days. (4) Development of an updated Early Warning eXplorer (EWX) tool for data visualization, analysis, and sharing. (5) Creation of stand-alone tools for enhancement of gridded rainfall data and trend analyses. (6) Establishment of an agro-climatology analysis tool and knowledge base for more than 90 countries of interest to FEWS NET. 
In addition to these new products and tools, FEWS NET has partnered with the GEOGLAM community to develop a Crop Monitor for Early Warning (CM4EW) which brings together global expertise in agricultural monitoring to reach consensus on growing season status of "countries at risk". Such engagements will result in enhanced capabilities for extending our monitoring efforts globally.
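For reference, the "pentad" accumulation convention used by CHIRPS divides each calendar month into six periods: five fixed 5-day periods plus a final pentad covering day 26 through month end. A minimal sketch of that grouping follows (illustrative only; `month_pentads` is a hypothetical helper, not FEWS NET code).

```python
import calendar

def month_pentads(daily, year, month):
    """Split one month of daily rainfall totals (mm) into CHIRPS-style
    pentads: five 5-day periods plus a final pentad covering day 26
    through the end of the month (3-6 days depending on the month)."""
    ndays = calendar.monthrange(year, month)[1]
    assert len(daily) == ndays, "need one value per day of the month"
    bounds = [0, 5, 10, 15, 20, 25, ndays]
    return [sum(daily[a:b]) for a, b in zip(bounds, bounds[1:])]

# Example: a uniform 2 mm/day in January 2016 (31 days).
pentads = month_pentads([2.0] * 31, 2016, 1)
print(pentads)  # five 10 mm pentads, then 12 mm for days 26-31
```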

  2. EUV mask defect inspection and defect review strategies for EUV pilot line and high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Chan, Y. David; Rastegar, Abbas; Yun, Henry; Putna, E. Steve; Wurm, Stefan

    2010-04-01

    Reducing mask blank and patterned mask defects is the number one challenge for extreme ultraviolet lithography. If the industry succeeds in reducing mask blank defects at the required rate of 10X every year for the next 2-3 years to meet high volume manufacturing defect requirements, new inspection and review tool capabilities will soon be needed to support this goal. This paper outlines the defect inspection and review tool technical requirements and suggests development plans to achieve pilot line readiness in 2011/12 and high volume manufacturing readiness in 2013. The technical specifications, tooling scenarios, and development plans were produced by a SEMATECH-led technical working group with broad industry participation from material suppliers, tool suppliers, mask houses, integrated device manufacturers, and consortia. The paper summarizes this technical working group's assessment of existing blank and mask inspection/review infrastructure capabilities to support pilot line introduction and outlines infrastructure development requirements and tooling strategies to support high volume manufacturing.

  3. High brightness electrodeless Z-Pinch EUV source for mask inspection tools

    NASA Astrophysics Data System (ADS)

    Horne, Stephen F.; Partlow, Matthew J.; Gustafson, Deborah S.; Besen, Matthew M.; Smith, Donald K.; Blackborow, Paul A.

    2012-03-01

    Energetiq Technology has been shipping the EQ-10 Electrodeless Z-Pinch™ light source since 1995. The source is currently being used for metrology, mask inspection, and resist development. Energetiq's higher-brightness source has been selected as the source for pre-production actinic mask inspection tools. This improved source enables the mask inspection tool suppliers to build prototype tools capable of defect detection and review down to 16 nm design rules. In this presentation we will describe new source technology being developed at Energetiq to address the critical source brightness issue. The new technology will be shown to be capable of delivering brightness levels sufficient to meet the HVM requirements of AIMS and ABI, and potentially API, tools. The new source builds on the stable pinch of the electrodeless light source to deliver a brightness of up to 100 W/mm²·sr. We will explain the source design concepts, discuss the expected performance and present the modeling results for the new design.
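To put a brightness (radiance) figure of this kind in perspective: for a small collection solid angle, the power delivered into the collected etendue is P = B * A * Omega. The area and solid-angle values below are assumptions chosen for illustration, not Energetiq specifications.

```python
# Back-of-envelope radiance arithmetic: P = B * A * Omega
# (small-solid-angle approximation).
B = 100.0     # W/(mm^2*sr): the brightness level quoted in the abstract
A = 0.01      # mm^2: assumed 0.1 mm x 0.1 mm emitting area of the pinch
omega = 0.03  # sr: assumed collection solid angle of the illumination optics
P = B * A * omega
print(P)  # 0.03 W delivered into the collected etendue
```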

  4. Automated SEM and TEM sample preparation applied to copper/low k materials

    NASA Astrophysics Data System (ADS)

    Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.

    2001-01-01

    We describe the use of automated microcleaving for preparation of both SEM and TEM samples as done by SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool capable of producing cleaves with 0.25 μm accuracy, resulting in SEM-ready samples. The TEMstation is capable of taking a sample output from the MC500 (or from SELA's earlier MC200 tool) and producing a FIB-ready slice of 25±5 μm, mounted on a TEM washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections: experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data are presented to show the quality of preparation provided by these new automated tools.

  5. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool for simulating the fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it struggles to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background on the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  6. Human-system interfaces for space cognitive awareness

    NASA Astrophysics Data System (ADS)

    Ianni, J.

    Space situational awareness is a human activity. We have advanced sensors and automation capabilities but these continue to be tools for humans to use. The reality is, however, that humans cannot take full advantage of the power of these tools due to time constraints, cognitive limitations, poor tool integration, poor human-system interfaces, and other reasons. Some excellent tools may never be used in operations and, even if they were, they may not be well suited to provide a cohesive and comprehensive picture. Recognizing this, the Air Force Research Laboratory (AFRL) is applying cognitive science principles to increase the knowledge derived from existing tools and creating new capabilities to help space analysts and decision makers. At the center of this research is Sensemaking Support Environment technology. The concept is to create cognitive-friendly computer environments that connect critical and creative thinking for holistic decision making. AFRL is also investigating new visualization technologies for multi-sensor exploitation and space weather, human-to-human collaboration technologies, and other technology that will be discussed in this paper.

  7. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are more and more used in the learning and education process, there is still a lack of professional evaluation tools capable of assessing the quality of the digital teaching aids used in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  8. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  9. The Implications of Cognitive Psychology for Computer-Based Learning Tools.

    ERIC Educational Resources Information Center

    Kozma, Robert B.

    1987-01-01

    Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…

  10. Aviation System Analysis Capability Executive Assistant Analyses

    NASA Technical Reports Server (NTRS)

    Roberts, Eileen; Kostiuk, Peter

    1999-01-01

    This document describes the analyses that may be incorporated into the Aviation System Analysis Capability Executive Assistant. The document will be used as a discussion tool to enable NASA and other integrated aviation system entities to evaluate, discuss, and prioritize analyses.

  11. CREME: The 2011 Revision of the Cosmic Ray Effects on Micro-Electronics Code

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Barghouty, Abdulnasser F.; Reed, Robert A.; Sierawski, Brian D.; Watts, John W., Jr.

    2012-01-01

    We describe a tool suite, CREME, which combines existing capabilities of CREME96 and CREME86 with new radiation environment models and new Monte Carlo computational capabilities for single event effects and total ionizing dose.

  12. DATA-CONSTRAINED CORONAL MASS EJECTIONS IN A GLOBAL MAGNETOHYDRODYNAMICS MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, M.; Manchester, W. B.; Van der Holst, B.

    We present a first-principles-based coronal mass ejection (CME) model suitable for both scientific and operational purposes by combining a global magnetohydrodynamics (MHD) solar wind model with a flux-rope-driven CME model. Realistic CME events are simulated self-consistently with high fidelity and forecasting capability by constraining initial flux rope parameters with observational data from GONG, SOHO/LASCO, and STEREO/COR. We automate this process so that minimum manual intervention is required in specifying the CME initial state. With the newly developed data-driven Eruptive Event Generator using Gibson–Low configuration, we present a method to derive Gibson–Low flux rope parameters through a handful of observational quantities so that the modeled CMEs can propagate with the desired CME speeds near the Sun. A test result with CMEs launched with different Carrington rotation magnetograms is shown. Our study shows a promising result for using the first-principles-based MHD global model as a forecasting tool, which is capable of predicting the CME direction of propagation, arrival time, and ICME magnetic field at 1 au (see the companion paper by Jin et al. 2016a).

  13. Payload Planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Johnson, Tameka J.

    1995-01-01

    A review of the evolution of the International Space Station (ISS) was performed for the purpose of understanding the project objectives. It was requested that an analysis of the current Office of Space Access and Technology (OSAT) Partnership Utilization Plan (PUP) traffic model be completed to monitor the process through which the scientific experiments, called payloads, are manifested for flight to the ISS. A viewing analysis of the ISS was also proposed to identify the capability to observe the United States Laboratory (US LAB) during the assembly sequence. Observations of the Drop-Tower experiment and nondestructive testing procedures were also performed to maximize the intern's technical experience. Contributions were made to the meeting in which the 1996 OSAT or Code X PUP traffic model was generated using the software tool FileMaker Pro. The current OSAT traffic model satisfies the requirement for manifesting and delivering the proposed payloads to the station. The current viewing capability of the station provides the ability to view the US LAB during the station assembly sequence. The Drop Tower experiment successfully simulates the effect of microgravity and conveniently documents the results for later use. The non-destructive test proved effective in determining stress in various components tested.

  14. Quantitative Raman spectroscopy as a tool to study the kinetics and formation mechanism of carbonates.

    PubMed

    Bonales, L J; Muñoz-Iglesias, V; Santamaría-Pérez, D; Caceres, M; Fernandez-Remolar, D; Prieto-Ballesteros, O

    2013-12-01

    We have carried out a systematic study of abiotic precipitation at different temperatures of several Mg and Ca carbonates (calcite, nesquehonite, hydrocalcite) present in carbonaceous chondrites. This study highlights the capability of Raman spectroscopy as a primary tool for performing full mineralogical analysis. The precipitation reaction and the structure of the resulting carbonates were monitored and identified with Raman spectroscopy. Raman spectroscopy enabled us to confirm that the precipitation reaction is very fast (minutes) when Ca(II) is present in the solution, whereas for Mg(II) such reactions developed at rather slow rates (weeks). We also observed that both the composition and the reaction mechanisms depended on temperature, which might help to clarify several issues in the fields of planetology and geology, because of the environmental implications of these carbonates on both terrestrial and extraterrestrial objects. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. AMP: a science-driven web-based application for the TeraGrid

    NASA Astrophysics Data System (ADS)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.

  16. Skeletal Muscle Ultrasound in Critical Care: A Tool in Need of Translation.

    PubMed

    Mourtzakis, Marina; Parry, Selina; Connolly, Bronwen; Puthucheary, Zudin

    2017-10-01

    With the emerging interest in documenting and understanding muscle atrophy and function in critically ill patients and survivors, ultrasonography has transformational potential for measurement of muscle quantity and quality. We discuss the importance of quantifying skeletal muscle in the intensive care unit setting. We also identify the merits and limitations of various modalities that are capable of accurately and precisely measuring muscularity. Ultrasound is emerging as a potentially powerful tool for skeletal muscle quantification; however, there are key challenges that need to be addressed in future work to ensure useful interpretation and comparability of results across diverse observational and interventional studies. Ultrasound presents several methodological challenges, and ultimately muscle quantification combined with metabolic, nutritional, and functional markers will allow optimal patient assessment and prognosis. Moving forward, we recommend that publications include greater detail on landmarking, repeated measures, identification of muscle that was not assessable, and reproducible protocols to more effectively compare results across different studies.

  17. Polarization of Coronal Forbidden Lines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hao; Qu, Zhongquan; Landi Degl’Innocenti, Egidio, E-mail: sayahoro@ynao.ac.cn

    Since the magnetic field is responsible for most manifestations of solar activity, one of the most challenging problems in solar physics is the diagnostics of solar magnetic fields, particularly in the outer atmosphere. To this end, it is important to develop rigorous diagnostic tools to interpret polarimetric observations in suitable spectral lines. This paper is devoted to analyzing the diagnostic content of linear polarization imaging observations in coronal forbidden lines. Although this technique is restricted to off-limb observations, it represents a significant tool to diagnose the magnetic field structure in the solar corona, where the magnetic field is intrinsically weak and still poorly known. We adopt the quantum theory of polarized line formation developed in the framework of the density matrix formalism, and synthesize images of the emergent linear polarization signal in coronal forbidden lines using potential-field source-surface magnetic field models. The influence of electronic collisions, active regions, and Thomson scattering on the linear polarization of coronal forbidden lines is also examined. It is found that active regions and Thomson scattering are capable of conspicuously influencing the orientation of the linear polarization. These effects have to be carefully taken into account to increase the accuracy of the field diagnostics. We also found that linear polarization observation in suitable lines can give valuable information on the long-term evolution of the magnetic field in the solar corona.
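
The linear-polarization imaging observables analyzed above are conventionally reduced to a fractional polarization and a polarization-plane orientation computed from the Stokes parameters, p = sqrt(Q^2 + U^2)/I and chi = (1/2) arctan(U/Q). A minimal sketch of that standard reduction (the sample Stokes values are invented):

```python
import numpy as np

def linear_polarization(I, Q, U):
    """Fractional linear polarization and orientation from Stokes I, Q, U."""
    p = np.hypot(Q, U) / I            # degree of linear polarization
    chi = 0.5 * np.arctan2(U, Q)      # orientation of the polarization plane [rad]
    return p, chi

# Hypothetical single-pixel measurement
p, chi = linear_polarization(I=1.0, Q=0.03, U=0.04)
```

Using `arctan2` rather than `arctan(U/Q)` keeps the orientation in the correct quadrant when Q is negative.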

  18. Marine bioacoustics and technology: The new world of marine acoustic ecology

    NASA Astrophysics Data System (ADS)

    Hastings, Mardi C.; Au, Whitlow W. L.

    2012-11-01

    Marine animals use sound for communication, navigation, predator avoidance, and prey detection. Thus the rise in acoustic energy associated with increasing human activity in the ocean has the potential to impact the lives of marine animals. Thirty years ago marine bioacoustics primarily focused on evaluating effects of human-generated sound on hearing and behavior by testing captive animals and visually observing wild animals. Since that time rapidly changing electronic and computing technologies have yielded three tools that revolutionized how bioacousticians study marine animals. These tools are (1) portable systems for measuring electrophysiological auditory evoked potentials, (2) miniaturized tags equipped with positioning sensors and acoustic recording devices for continuous short-term acoustical observation rather than intermittent visual observation, and (3) passive acoustic monitoring (PAM) systems for remote long-term acoustic observations at specific locations. The beauty of these breakthroughs is their direct applicability to wild animals in natural habitats rather than only to animals held in captivity. Hearing capabilities of many wild species including polar bears, beaked whales, and reef fishes have now been assessed by measuring their auditory evoked potentials. Miniaturized acoustic tags temporarily attached to an animal to record its movements and acoustic environment have revealed the acoustic foraging behavior of sperm and beaked whales. Now tags are being adapted to fishes in an effort to understand their behavior in the presence of noise. Moving and static PAM systems automatically detect and characterize biological and physical features of an ocean area without adding any acoustic energy to the environment. PAM is becoming a powerful technique for understanding and managing marine habitats. This paper will review the influence of these transformative tools on the knowledge base of marine bioacoustics and the elucidation of relationships between marine animals and their acoustic environment, leading to a new, rapidly growing field of marine acoustic ecology.
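
A PAM detector in its simplest form is an energy detector: frame the recording, estimate per-frame energy, and flag frames well above the noise floor. Real PAM pipelines are far more elaborate; the synthetic "call", sample rate, frame length, and threshold below are invented purely for illustration:

```python
import numpy as np

fs = 1000                                       # Hz, toy sample rate
t = np.arange(0, 10, 1 / fs)                    # 10 s recording
rng = np.random.default_rng(3)
signal = 0.1 * rng.standard_normal(t.size)      # background noise
signal[4000:4500] += np.sin(2 * np.pi * 120 * t[4000:4500])  # brief 120 Hz "call"

frame = 250                                     # samples per analysis frame
energy = np.array([np.mean(signal[i:i + frame] ** 2)
                   for i in range(0, signal.size - frame + 1, frame)])
threshold = 5.0 * np.median(energy)             # robust noise-floor estimate
detections = np.nonzero(energy > threshold)[0]  # frame indices flagged as calls
```

The median-based threshold keeps the detector insensitive to the calls themselves, since they occupy only a few of the frames.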

  19. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
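
For rigid-link arms, the velocity (manipulability) and force ellipsoids discussed above are conventionally obtained from the singular value decomposition of the manipulator Jacobian; the paper's contribution is extending such ellipsoids to continuum robots. A minimal rigid-link sketch (the two-link Jacobian and configuration below are illustrative, not from the paper):

```python
import numpy as np

def manipulability_ellipsoids(J):
    """Principal axes and radii of the velocity and force ellipsoids for J."""
    U, s, _ = np.linalg.svd(J)      # columns of U: ellipsoid principal axes
    return U, s, 1.0 / s            # velocity radii s_i; force radii 1/s_i

# Hypothetical two-link planar arm (unit link lengths) at a sample configuration
q1, q2 = 0.3, 1.1
J = np.array([
    [-np.sin(q1) - np.sin(q1 + q2), -np.sin(q1 + q2)],
    [ np.cos(q1) + np.cos(q1 + q2),  np.cos(q1 + q2)],
])
axes, vel_radii, force_radii = manipulability_ellipsoids(J)
```

The reciprocal radii make the velocity/force duality explicit: directions in which the arm moves easily are exactly the directions in which it transmits force poorly.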

  20. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed which will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) a HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  1. Error Propagation Analysis in the SAE Architecture Analysis and Design Language (AADL) and the EDICT Tool Framework

    NASA Technical Reports Server (NTRS)

    LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.

    2011-01-01

    This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.

  2. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  3. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    NASA Technical Reports Server (NTRS)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
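
The coupling of trajectory propagation with thermal analysis and node-failure evaluation can be caricatured in a few lines: integrate a Sutton-Graves-type stagnation heat flux along a given trajectory and flag a component once its absorbed heat exceeds a demise threshold. This is only a toy illustration of the idea, not SPEAD's modeling; the constants, trajectory, and threshold are all hypothetical:

```python
import numpy as np

K_SG = 1.74e-4                    # Sutton-Graves-type constant for Earth (SI)
RHO0, H_SCALE = 1.225, 7200.0     # exponential atmosphere: kg/m^3, scale height m

def demise_time(t, alt, vel, nose_radius, heat_to_demise):
    """Time at which cumulative stagnation heat load reaches the demise limit."""
    rho = RHO0 * np.exp(-alt / H_SCALE)
    q = K_SG * np.sqrt(rho / nose_radius) * vel ** 3        # heat flux, W/m^2
    Q = np.concatenate(([0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(t))))
    hit = np.nonzero(Q >= heat_to_demise)[0]
    return float(t[hit[0]]) if hit.size else None

# Toy descending trajectory: 80 km to 8 km at constant 7 km/s
t = np.linspace(0.0, 120.0, 1201)
alt = 80e3 - 600.0 * t
vel = np.full_like(t, 7000.0)
when = demise_time(t, alt, vel, nose_radius=0.2, heat_to_demise=2.0e8)
```

A lower demise threshold (a less heat-tolerant component) fails earlier, which is the qualitative behavior a sequence-of-events prediction relies on.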

  4. Interactive 3D Models and Simulations for Nuclear Security Education, Training, and Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warner, David K.; Dickens, Brian Scott; Heimer, Donovan J.

    By providing examples of products that have been produced in the past, it is the hope of the authors that the audience will gain a more thorough understanding of 3D modeling tools, their potential applications, and the capabilities that they can provide. Truly, the applications and capabilities of these types of tools are limited only by one’s imagination. The future of three-dimensional models lies in the expansion into the world of virtual reality, where one will experience a fully immersive first-person environment. The use of headsets and hand tools will allow students and instructors to have a more thorough spatial understanding of facilities and scenarios that they will encounter in the real world.

  5. On-line analysis capabilities developed to support the AFW wind-tunnel tests

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.

    1992-01-01

    A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs, are also presented. The on-line analysis tools worked well before, during, and after the wind-tunnel tests and proved to be a vital part of the entire test effort.

  6. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community with a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using Open Source Software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions used in the process of CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier, faster, and, more importantly, far more portable. The integration of ground-site observed surface fluxes further helps the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.
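
Two of the QC functions mentioned, anomaly/difference maps and 2-D histograms, reduce to a few array operations; a sketch on a synthetic (time, lat, lon) flux field (the field name, shapes, and values are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monthly TOA longwave flux on a coarse 12 x 18 x 36 grid [W/m^2]
flux = 240.0 + 5.0 * rng.standard_normal((12, 18, 36))

clim = flux.mean(axis=0)            # temporal mean at each grid cell
anom = flux - clim                  # anomaly: departure from the temporal mean

# 2-D histogram comparing two parameters, e.g. observed vs computed flux
computed = flux + rng.standard_normal(flux.shape)
h2d, xedges, yedges = np.histogram2d(flux.ravel(), computed.ravel(), bins=40)
```

A tight diagonal in the 2-D histogram (and near-zero anomalies) is the quick visual signal that two related products agree.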

  7. Weather from 250 Miles Up: Visualizing Precipitation Satellite Data (and Other Weather Applications) Using CesiumJS

    NASA Technical Reports Server (NTRS)

    Lammers, Matt

    2017-01-01

    Geospatial weather visualization remains predominately a two-dimensional endeavor. Even popular advanced tools like the Nullschool Earth display two-dimensional fields on a three-dimensional globe. Yet much of the observational data and model output contains detailed three-dimensional fields. In 2014, NASA and JAXA (the Japan Aerospace Exploration Agency) launched the Global Precipitation Measurement (GPM) satellite. Its two instruments, the Dual-frequency Precipitation Radar (DPR) and the GPM Microwave Imager (GMI), observe much of the Earth's atmosphere between 65 degrees North and 65 degrees South latitude. As part of the analysis and visualization tools developed by the Precipitation Processing System (PPS) Group at NASA Goddard, a series of CesiumJS-based globe viewers [using Cesium Markup Language (CZML), JavaScript (JS), and JavaScript Object Notation (JSON)] have been developed to improve data acquisition decision making and to enhance scientific investigation of the satellite data. Other demos have also been built to illustrate the capabilities of CesiumJS in presenting atmospheric data, including model forecasts of hurricanes, observed surface radar data, and gridded analyses of global precipitation. This talk will present these websites and the various workflows used to convert binary satellite and model data into a form easily integrated with CesiumJS.
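
CZML, mentioned above, is a JSON packet stream whose first packet must be a document packet with `version` "1.0"; converting satellite or model data for CesiumJS therefore amounts to emitting such packets. A minimal sketch (the entity name and coordinates are invented, not real GPM data):

```python
import json

# Minimal CZML document: a required document packet plus one point entity.
czml = [
    {"id": "document", "name": "GPM demo", "version": "1.0"},
    {
        "id": "gpm-overpass-point",
        "name": "hypothetical ground point under a GPM overpass",
        # cartographicDegrees: [longitude, latitude, height in meters]
        "position": {"cartographicDegrees": [-76.0, 36.9, 407000.0]},
        "point": {"pixelSize": 8},
    },
]
doc = json.dumps(czml, indent=2)    # ready to serve to a CesiumJS viewer
```

A real workflow would loop over binary satellite records and emit one packet (or time-tagged position samples) per observation.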

  8. Suction-based grasping tool for removal of regular- and irregular-shaped intraocular foreign bodies.

    PubMed

    Erlanger, Michael S; Velez-Montoya, Raul; Mackenzie, Douglas; Olson, Jeffrey L

    2013-01-01

    To describe a suction-based grasping tool for the surgical removal of irregular-shaped and nonferromagnetic intraocular foreign bodies. A surgical tool with suction capabilities, consisting of a stainless steel shaft with a plastic handle and a customizable and interchangeable suction tip, was designed in order to better engage and manipulate irregular-shaped intraocular foreign bodies of various sizes and physical properties. The maximal suction force and surgical capabilities were assessed in the laboratory and on a cadaveric eye vitrectomy model. The suction force of the water-tight seal between the intraocular foreign body and the suction tip was estimated to be approximately 40 mN. During an open-sky vitrectomy in a porcine model, the device was successful in engaging and firmly securing foreign bodies of different sizes and shapes. The suction-based grasping tool enables removal of irregular-shaped and nonferromagnetic foreign bodies. Copyright 2013, SLACK Incorporated.
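
To first order, the holding force of such a suction tip is just the pressure differential times the sealed tip area, which lands in the tens of millinewtons for sub-millimeter tips. The tip radius and pressure below are hypothetical, chosen only to show the scale of the forces involved:

```python
import math

def suction_force(tip_radius_m, delta_p_pa):
    """Ideal holding force of a sealed suction tip: delta-P times tip area [N]."""
    return delta_p_pa * math.pi * tip_radius_m ** 2

# Hypothetical 0.5 mm diameter tip at 80 kPa below ambient pressure
f = suction_force(0.25e-3, 80e3)
# f is on the order of 1e-2 N, i.e. tens of millinewtons
```

Real seals leak and conform imperfectly to irregular surfaces, so a measured force would be expected to fall below this ideal value.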

  9. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, PNNL Yuri; Subbarao, PNNL Kris

    RUT software is designed for use by Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. Such a deficiency of balancing resources can pose serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and the output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty, including wind, solar, and load forecast errors. The tool evaluates the required generation for a worst-case scenario, with a user-specified confidence level.
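
The probabilistic idea described can be sketched as a Monte Carlo combination of forecast-error distributions, with the extra balancing requirement read off a user-chosen quantile. The zero-mean normal error models and magnitudes below are hypothetical illustrations, not the tool's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                  # Monte Carlo samples

# Hypothetical forecast-error distributions [MW]
load_err  = rng.normal(0.0, 150.0, n)
wind_err  = rng.normal(0.0, 300.0, n)
solar_err = rng.normal(0.0, 100.0, n)

# Net imbalance: under-forecast load and over-forecast generation both
# contribute to a shortfall (positive values need extra up-capacity).
imbalance = load_err - wind_err - solar_err

confidence = 0.95
up_requirement = np.quantile(imbalance, confidence)   # extra up-capacity [MW]
```

Raising the confidence level pushes the requirement further into the tail of the imbalance distribution, trading reliability against reserve cost.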

  11. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, a high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel-based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  12. AXARM: An Extensible Remote Assistance and Monitoring Tool for ND Telerehabilitation

    NASA Astrophysics Data System (ADS)

    Bueno, Antonio; Marzo, Jose L.; Vallejo, Xavier

    AXARM is a multimedia tool for rehabilitation specialists that allows remote assistance and monitoring of patients' activities. The tool is the evolution of work done in 2005-06 between the BCDS research group of UdG and the Multiple Sclerosis Foundation (FEM in Spanish) in Girona under the TRiEM project. Multiple Sclerosis (MS) is a neurodegenerative disease (ND) that can leave patients significantly exhausted even by a trip to the medical centre for rehabilitation or regular check-up visits. The tool presented in this paper allows medical staff to remotely conduct patient consultations and activities while patients remain at home, minimizing trips to the medical centre. AXARM has a hybrid P2P architecture and consists essentially of a cross-platform videoconference system with audio/video recording capabilities. The system can easily be extended with new capabilities, such as asynchronous activities whose results can later be analyzed by medical personnel.

  13. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records and to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and make trial-and-error guesses about the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so users can further analyze their data in other analytic tools.
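
The explore-then-export workflow described (profile the fields, isolate a pattern of interest, hand the matching subset to another tool) maps naturally onto a dataframe library; a sketch with invented data and column names, not T.Rex's actual interface:

```python
import pandas as pd

# Toy tabular data set with unknown structure
df = pd.DataFrame({
    "source": ["web", "web", "mail", "mail", "web", "ftp"],
    "status": ["ok", "error", "ok", "error", "error", "ok"],
    "bytes":  [120, 0, 340, 0, 0, 77],
})

# Quick field profile: value frequencies for every column
profile = {c: df[c].value_counts() for c in df.columns}

# A pattern of interest: error records that transferred nothing
pattern = df[(df["status"] == "error") & (df["bytes"] == 0)]

# Export the matching subset for further analysis in other tools
pattern.to_csv("error_records.csv", index=False)
```

At ten-million-record scale the same operations apply, though chunked ingestion and indexed filtering become important.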

  14. Launch Load Resistant Spacecraft Mechanism Bearings Made From NiTi Superelastic Intermetallic Materials

    NASA Technical Reports Server (NTRS)

    DellaCorte, Christopher; Moore, Lewis E., III

    2014-01-01

    Compared to conventional bearing materials (tool steel and ceramics), emerging Superelastic Intermetallic Materials (SIMs), such as 60NiTi, have significantly lower elastic modulus and enhanced strain capability. They are also immune to atmospheric corrosion (rusting). This offers the potential for increased resilience and a superior ability to withstand static indentation load without damage. In this paper, the static load capacity of hardened 60NiTi 50-mm-bore ball bearing races is measured to correlate existing flat-plate indentation load capacity data to an actual bearing geometry through the Hertz stress relations. The results confirmed the validity of using the Hertz stress relations to model 60NiTi contacts; 60NiTi exhibits a static stress capability (approximately 3.1 GPa) between that of 440C (2.4 GPa) and REX20 (3.8 GPa) tool steel. When the reduced modulus and extended strain capability are taken into account, 60NiTi is shown to withstand higher loads than other bearing materials. To quantify this effect, a notional space mechanism, a 5-kg reaction wheel, was modeled with respect to launch load capability when supported on standard (catalogue geometry) 440C, 60NiTi, and REX20 tool steel bearings. For this application, the use of REX20 bearings increased the static load capability of the mechanism by a factor of three, while the use of 60NiTi bearings resulted in an order of magnitude improvement compared to the baseline 440C stainless steel bearings.

  15. Launch Load Resistant Spacecraft Mechanism Bearings Made From NiTi Superelastic Intermetallic Materials

    NASA Technical Reports Server (NTRS)

    Dellacorte, Christopher; Moore, Lewis E.

    2014-01-01

    Compared to conventional bearing materials (tool steel and ceramics), emerging Superelastic Intermetallic Materials (SIMs), such as 60NiTi, have significantly lower elastic modulus and enhanced strain capability. They are also immune to atmospheric corrosion (rusting). This offers the potential for increased resilience and a superior ability to withstand static indentation load without damage. In this paper, the static load capacity of hardened 60NiTi 50-mm-bore ball-bearing races is measured to correlate existing flat-plate indentation load capacity data to an actual bearing geometry through the Hertz stress relations. The results confirmed the validity of using the Hertz stress relations to model 60NiTi contacts; 60NiTi exhibits a static stress capability (3.1 GPa) between that of 440C (2.4 GPa) and REX20 (3.8 GPa) tool steel. When the reduced modulus and extended strain capability are taken into account, 60NiTi is shown to withstand higher loads than other bearing materials. To quantify this effect, a notional space mechanism, a 5-kg reaction wheel, was modeled with respect to launch load capability when supported on 440C, 60NiTi, and REX20 tool steel bearings. For this application, the use of REX20 bearings increased the static load capability of the mechanism by a factor of three, while the use of 60NiTi bearings resulted in an order of magnitude improvement compared to the baseline 440C stainless steel bearings.
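
The Hertz relation behind the comparison: for a sphere-on-flat point contact, the load at which the peak pressure reaches an allowable value p_max is F = pi^3 p_max^3 R^2 / (6 E*^2), so a lower effective modulus E* permits a higher load at the same contact stress. The sketch below uses representative handbook-style moduli and a hypothetical ball radius, not the paper's measured data:

```python
import math

def effective_modulus(E1, nu1, E2, nu2):
    """Hertz effective modulus E* for two contacting bodies [Pa]."""
    return 1.0 / ((1 - nu1 ** 2) / E1 + (1 - nu2 ** 2) / E2)

def max_static_load(p_max, R, E_star):
    """Load at which peak Hertz pressure reaches p_max (sphere on flat) [N]."""
    return math.pi ** 3 * p_max ** 3 * R ** 2 / (6.0 * E_star ** 2)

# Like-on-like contacts, 5 mm ball radius, same 3.1 GPa allowable peak stress:
# 60NiTi modulus ~95 GPa (representative), bearing steel ~200 GPa.
E_niti = effective_modulus(95e9, 0.34, 95e9, 0.34)
E_steel = effective_modulus(200e9, 0.30, 200e9, 0.30)
load_niti = max_static_load(3.1e9, 5e-3, E_niti)
load_steel = max_static_load(3.1e9, 5e-3, E_steel)
# Lower modulus -> larger contact patch -> higher load at the same peak stress
```

This is the mechanism behind the abstract's conclusion: 60NiTi's reduced modulus multiplies its allowable load well beyond what its stress rating alone suggests.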

  16. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    NASA Technical Reports Server (NTRS)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied to mission planning and viability analysis in a variety of Solar System applications. The tools can perform two-body and N-body orbit propagation; find inter-satellite and satellite-to-ground-station LOS access (accounting for blocking by intermediate oblate spheroid bodies, geometric restrictions of the antenna field of view (FOV), and relativistic corrections); and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
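
The core capability, two-body propagation, fits in a few lines with a fixed-step RK4 integrator; a minimal sketch, not the SCENIC/ODTBX implementation:

```python
import numpy as np

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]

def accel(state):
    """Two-body dynamics: state = [x, y, z, vx, vy, vz]."""
    r, v = state[:3], state[3:]
    return np.concatenate([v, -MU_EARTH * r / np.linalg.norm(r) ** 3])

def propagate(state, dt, steps):
    """Classic fixed-step fourth-order Runge-Kutta integration."""
    for _ in range(steps):
        k1 = accel(state)
        k2 = accel(state + 0.5 * dt * k1)
        k3 = accel(state + 0.5 * dt * k2)
        k4 = accel(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return state

# Circular LEO test case: radius and speed should be conserved over one orbit
r0 = 7.0e6                          # m
v0 = np.sqrt(MU_EARTH / r0)         # circular orbit speed
s = propagate(np.array([r0, 0, 0, 0, v0, 0]), dt=1.0, steps=5829)
```

A production propagator would add perturbations (oblateness, drag, third bodies) and an adaptive step, but conservation checks like this one remain the standard sanity test.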

  17. B-HIT - A Tool for Harvesting and Indexing Biodiversity Data

    PubMed Central

    Barker, Katharine; Braak, Kyle; Cawsey, E. Margaret; Coddington, Jonathan; Robertson, Tim; Whitacre, Jamie

    2015-01-01

    With the rapidly growing number of data publishers, the process of harvesting and indexing information to offer advanced search and discovery becomes a critical bottleneck in globally distributed primary biodiversity data infrastructures. The Global Biodiversity Information Facility (GBIF) implemented a Harvesting and Indexing Toolkit (HIT), which largely automates data harvesting activities for hundreds of collection and observational data providers. The team of the Botanic Garden and Botanical Museum Berlin-Dahlem has extended this well-established system with a range of additional functions, including improved processing of multiple taxon identifications, the ability to represent associations between specimen and observation units, new data quality control and new reporting capabilities. The open source software B-HIT can be freely installed and used for setting up thematic networks serving the demands of particular user groups. PMID:26544980

  18. B-HIT - A Tool for Harvesting and Indexing Biodiversity Data.

    PubMed

    Kelbert, Patricia; Droege, Gabriele; Barker, Katharine; Braak, Kyle; Cawsey, E Margaret; Coddington, Jonathan; Robertson, Tim; Whitacre, Jamie; Güntsch, Anton

    2015-01-01

    With the rapidly growing number of data publishers, the process of harvesting and indexing information to offer advanced search and discovery becomes a critical bottleneck in globally distributed primary biodiversity data infrastructures. The Global Biodiversity Information Facility (GBIF) implemented a Harvesting and Indexing Toolkit (HIT), which largely automates data harvesting activities for hundreds of collection and observational data providers. The team of the Botanic Garden and Botanical Museum Berlin-Dahlem has extended this well-established system with a range of additional functions, including improved processing of multiple taxon identifications, the ability to represent associations between specimen and observation units, new data quality control and new reporting capabilities. The open source software B-HIT can be freely installed and used for setting up thematic networks serving the demands of particular user groups.

  19. WFIRST: STScI Science Operations Center (SSOC) Activities and Plans

    NASA Astrophysics Data System (ADS)

    Gilbert, Karoline M.; STScI WFIRST Team

    2018-01-01

    The science operations for the WFIRST Mission will be distributed between Goddard Space Flight Center, the Space Telescope Science Institute (STScI), and the Infrared Processing and Analysis Center (IPAC). The STScI Science Operations Center (SSOC) will schedule and archive all WFIRST observations, will calibrate and produce pipeline-reduced data products for the Wide Field Instrument, and will support the astronomical community in planning WFI observations and analyzing WFI data. During the formulation phase, WFIRST team members at STScI have developed operations concepts for scheduling, data management, and the archive; have performed technical studies investigating the impact of WFIRST design choices on data quality and analysis; and have built simulation tools to aid the community in exploring WFIRST’s capabilities. We will highlight examples of each of these efforts.

  20. What is an affordance? 40 years later.

    PubMed

    Osiurak, François; Rossetti, Yves; Badets, Arnaud

    2017-06-01

    About 40 years ago, James J. Gibson coined the term "affordance" to describe the action possibilities offered to an animal by the environment with reference to the animal's action capabilities. Since then, this notion has acquired a multitude of meanings, generating confusion in the literature. Here, we offer a clear operationalization of the concept of affordances and related concepts in the field of tool use. Our operationalization is organized around the distinction between the physical (what is objectively observable) and neurocognitive (what is subjectively experienced) levels. This leads us to propose that motor control (dorso-dorsal system), mechanical knowledge (ventro-dorsal system) and function knowledge (ventral system) could be neurocognitive systems respectively involved in the perception of affordances, the understanding of mechanical actions and the storage of contextual relationships (three action-system model; 3AS). We end by turning to two key issues that can be addressed within 3AS. These issues concern the link between affordances and tool incorporation, and the constraints posed by affordances for tool use. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox

    NASA Astrophysics Data System (ADS)

    Harris, A. T., III; Goodman, J.; Justice, B.

    2014-12-01

    As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.
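
Many hosted coastal algorithms boil down to per-pixel band arithmetic on the hyperspectral cube; an NDVI-style band ratio is sketched below on a synthetic HICO-like cube (the wavelength grid, random cube, and index choice are illustrative, not HICO's actual processing chain):

```python
import numpy as np

# Synthetic reflectance cube: (bands, rows, cols), roughly HICO-like coverage
wavelengths = np.linspace(400, 900, 87)                 # nm
cube = np.random.default_rng(1).uniform(0.01, 0.5, (87, 64, 64))

def band(cube, wavelengths, target_nm):
    """2-D image slice for the band closest to the requested wavelength."""
    return cube[np.argmin(np.abs(wavelengths - target_nm))]

red = band(cube, wavelengths, 670)
nir = band(cube, wavelengths, 860)
ndvi = (nir - red) / (nir + red)    # e.g. to mask land/vegetation from water
```

In a hosted setting, exactly this kind of per-pixel computation is what the server-side engine runs on demand, returning only the derived image to the browser.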

  2. Future Trends in Solar Radio Astronomy and Coronal Magnetic-Field Measurements

    NASA Astrophysics Data System (ADS)

    Fleishman, Gregory; Nita, Gelu; Gary, Dale

    Solar radio astronomy has a rich, but as yet largely unexploited, potential for probing the solar corona and chromosphere. Radio emission offers multiple ways of detecting and tracking electron beams, studying chromospheric and coronal thermal structure, plasma processes, and particle acceleration, and measuring magnetic fields. To turn this potential into routine diagnostics, two major components are needed: (1) well-calibrated observations with high spatial, spectral, and temporal resolution and (2) accurate, reliable theoretical models and fast numerical tools capable of recovering the emission source parameters from the radio data. This report gives a brief overview of new, expanded, and planned radio facilities, such as the Expanded Owens Valley Solar Array (EOVSA), Jansky Very Large Array (JVLA), Chinese Solar Radio Heliograph (CSRH), Upgraded Siberian Solar Radio Telescope (USSRT), and Frequency Agile Solar Radiotelescope (FASR), with emphasis on their ability to measure coronal magnetic fields in active regions and flares. In particular, we highlight new tools for 3D modeling of the radio emission, and forward-fitting tools under development, that are needed to derive magnetic field data from the radio measurements.

  3. Web Tools Streamline Climate Preparedness and Resilience Planning and Implementation for Water Resources Infrastructure

    NASA Astrophysics Data System (ADS)

    White, K. D.; Friedman, D.; Schechter, J.; Foley, P.; Mueller, C.; Baker, B.; Huber, M.; Veatch, W.

    2016-12-01

    Observed and projected impacts of climate change are pronounced in the hydrologic cycle because of the sensitivity of hydroclimatic variables to changes in temperature. Well-documented climate change impacts to the hydrologic cycle include increases in extreme heat conditions, coastal flooding, heavy precipitation, and drought frequency and magnitude, all of which can combine in surprising ways to pose regionally varying threats to public health and safety, ecosystem functions, and the economy. Climate preparedness and resilience activities are therefore necessary for water infrastructure, which provides flood risk reduction, navigation, water supply, ecosystem restoration, and hydropower services. Because this infrastructure has long service lifetimes, up to or beyond 100 years, and entails significant public investment, accurate and timely information about climate impacts over both the near- and far-term is required to plan and implement climate preparedness and resilience measures. Engineers are natural translators of science into actionable information to support this type of decision-making, because they understand both the important physical processes and the processes, laws, standards, and criteria required for the planning and design of public infrastructure. Though engineers are capable of the data management activities needed to ingest, transform, and prepare climate information for use in these decisions, the US Army Corps of Engineers (USACE) has chosen to emphasize analysis of information over data management. In doing so, the USACE is developing and using web tools with visualization capabilities to streamline climate preparedness and resilience planning and implementation while ensuring nationally repeatable analytical results.
Examples discussed here include calculation of sea level change, including a comparison of mean sea level and other tidal statistics against scenarios of change; detection of abrupt and slowly varying nonstationarities in observed hydrologic data; and evaluations of projected flow frequency and duration that help to characterize future conditions and facilitate comparisons to observed conditions.
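The detection of abrupt nonstationarities mentioned above can be illustrated with a generic mean-shift scan over an annual record. This is a minimal sketch of one common approach, not necessarily the algorithm in the USACE web tools, and the flow values below are hypothetical:

```python
# Scan an annual streamflow record for the split point where the difference
# between the means of the two segments is largest (a simple abrupt-change
# indicator). Illustrative only; data are made up.

def mean(xs):
    return sum(xs) / len(xs)

def best_mean_shift(series, min_seg=3):
    """Return (index, shift): the split index maximizing the absolute
    difference between the post-split and pre-split segment means."""
    best_i, best_shift = None, 0.0
    for i in range(min_seg, len(series) - min_seg + 1):
        shift = mean(series[i:]) - mean(series[:i])
        if abs(shift) > abs(best_shift):
            best_i, best_shift = i, shift
    return best_i, best_shift

flows = [100, 98, 103, 101, 99, 140, 138, 142, 139, 141]  # hypothetical annual flows
idx, shift = best_mean_shift(flows)
print(idx, round(shift, 1))  # split at index 5, shift of about +39.8
```

Operational tools typically pair such a scan with a significance test (e.g., a Pettitt-type statistic) before declaring a changepoint.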

  4. Crew chief

    NASA Technical Reports Server (NTRS)

    Easterly, Jill

    1993-01-01

    This software package performs ergonomic human modeling for maintenance tasks. The modeled technician can be configured to represent actual work situations: the work environment, the strengths and capabilities of the individual, particular limitations (such as the constraining characteristics of a particular space suit), the tools required, and the procedures or tasks to be performed.

  5. Space and Cyber: Shared Challenges, Shared Opportunities

    DTIC Science & Technology

    2011-11-15

    adversaries to have effective capabilities against networks and computer systems, unlike those anywhere else—here, cyber criminals, proxies for hire, and...or unintentional, conditions can impact our ability to use space and cyber capabilities. As the tools and techniques developed by cyber criminals continue

  6. Students' Perceptions of the Usefulness of an E-Book with Annotative and Sharing Capabilities as a Tool for Learning: A Case Study

    ERIC Educational Resources Information Center

    Lim, Ee-Lon; Hew, Khe Foon

    2014-01-01

    E-books offer a range of benefits to both educators and students, including ease of accessibility and searching capabilities. However, the majority of current e-books are repository-cum-delivery platforms of textual information. Hitherto, there is a lack of empirical research that examines e-books with annotative and sharing capabilities. This…

  7. Space station operations management

    NASA Technical Reports Server (NTRS)

    Cannon, Kathleen V.

    1989-01-01

    Space Station Freedom operations management concepts must be responsive to the unique challenges presented by the permanently manned international laboratory. Space Station Freedom will be assembled over a three-year period during which the operational environment will change as significant capability plateaus are reached. First Element Launch, Man-Tended Capability, and Permanent Manned Capability represent milestones of operational capability that increases toward mature operations capability. Operations management concepts are being developed to accommodate the varying operational capabilities during assembly, as well as the mature operational environment. This paper describes operations management concepts designed to accommodate the uniqueness of Space Station Freedom, utilizing tools and processes that seek to control operations costs.

  8. Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges

    NASA Technical Reports Server (NTRS)

    He, Matt; Hardin, Danny; Mayer, Paul; Blakeslee, Richard; Goodman, Michael

    2012-01-01

    Airborne real-time observations are a major component of NASA's Earth Science research and satellite ground validation studies. Multiple aircraft are involved in most NASA field campaigns. The coordination of the aircraft with satellite overpasses, other airplanes, and the constantly evolving, dynamic weather conditions often determines the success of the campaign. Planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real-time situational awareness of the weather conditions that affect the aircraft track. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks. Scientists at the University of Alabama in Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real-time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis. The development of this waypoint tool is directly affected by advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionality, to the Google Earth Plugin on a web platform, and now to the rising open-source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. 
Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives.

  9. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing site weather to produce landing availability, and concurrent analysis of multiple sites to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool, and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the existing tool.
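The core climatological-probability idea behind an APRA-style availability calculation can be sketched simply: availability is the fraction of historical observations that satisfy every constraint. The parameter names, limits, and records below are hypothetical, not APRA's actual inputs:

```python
# Launch availability as the climatological fraction of past weather
# observations that fall within all vehicle/operational constraint limits.
# Illustrative sketch only; observations and limits are invented.

def availability(observations, constraints):
    """observations: list of dicts of weather parameters.
    constraints: dict mapping parameter name -> (min_ok, max_ok)."""
    ok = sum(
        1 for obs in observations
        if all(lo <= obs[p] <= hi for p, (lo, hi) in constraints.items())
    )
    return ok / len(observations)

obs = [
    {"wind_kt": 12, "temp_f": 68},
    {"wind_kt": 25, "temp_f": 70},   # violates the wind limit
    {"wind_kt": 8,  "temp_f": 35},   # violates the temperature limit
    {"wind_kt": 15, "temp_f": 75},
]
limits = {"wind_kt": (0, 20), "temp_f": (40, 95)}
print(availability(obs, limits))  # 0.5
```

The real tool additionally conditions these fractions on season and time of day, which is what makes the single-location climatology useful for planning specific launch windows.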

  10. Interagency Collaborators Develop and Implement ForWarn, a National, Near Real Time Forest Monitoring Tool

    NASA Technical Reports Server (NTRS)

    Underwood, Lauren

    2013-01-01

    ForWarn is a satellite-based forest monitoring tool that is being used to detect and monitor disturbances to forest conditions and forest health. It has been developed through the synergistic efforts, capabilities, and contributions of four federal agencies, including the US Forest Service Eastern Forest and Western Wildland Environmental Threat Assessment Centers, NASA Stennis Space Center (SSC), the Department of Energy's (DOE) Oak Ridge National Laboratory (ORNL), and the US Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center, as well as university partners, including the University of North Carolina Asheville's National Environmental Modeling and Analysis Center (NEMAC). This multi-organizational partnership is key in producing a unique, pathfinding, near real-time forest monitoring system that is now used by many federal, state, and local government end-users. Such a system could not have been produced so effectively by any of these groups on their own. The forests of the United States provide many societal values and benefits, ranging from ecological and economic to cultural and recreational. Therefore, providing a reliable and dependable forest and wildland monitoring system is important to ensure the continued health, productivity, sustainability, and prudent use of our Nation's forests and forest resources. ForWarn does this by producing current health indicator maps of our nation's forests based on satellite data from NASA's MODIS (Moderate Resolution Imaging Spectroradiometer) sensors. Such a capability can provide noteworthy value, cost savings, and significant impact at state and local government levels, because at those levels of government a response must be carried out once disturbances are evident and cause negative impacts. The observations that a monitoring system like ForWarn provides can also contribute to a much broader-scale understanding of vegetation disturbances.

  11. A New Architecture for Extending the Capabilities of the Copernicus Trajectory Optimization Program

    NASA Technical Reports Server (NTRS)

    Williams, Jacob

    2015-01-01

    This paper describes a new plugin architecture developed for the Copernicus spacecraft trajectory optimization program. Details of the software architecture design and development are described, as well as examples of how the capability can be used to extend the tool in order to expand the type of trajectory optimization problems that can be solved. The inclusion of plugins is a significant update to Copernicus, allowing user-created algorithms to be incorporated into the tool for the first time. The initial version of the new capability was released to the Copernicus user community with version 4.1 in March 2015, and additional refinements and improvements were included in the recent 4.2 release. It is proving quite useful, enabling Copernicus to solve problems that it was not able to solve before.

  12. Process for Upgrading Cognitive Assessment Capabilities Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Picano, J. J.; Seaton, K. A.; Holland, A. W.

    2016-01-01

    MOTIVATION: Spaceflight poses varied and unique risks to the brain and cognitive functioning including radiation exposure, sleep disturbance, fatigue, fluid shifts (increased intracranial pressure), toxin exposure, elevated carbon dioxide, and traumatic brain injury, among others. These potential threats to cognitive functioning are capable of degrading performance and compromising mission success. Furthermore, the threats may increase in severity, and new types of threats may emerge for longer duration exploration missions. This presentation will describe the process used to identify gaps in our current approach, evaluate best practices in cognitive assessment, and transition new cognitive assessment tools to operational use. OVERVIEW: Risks to brain health and performance posed by spaceflight missions require sensitive tools to assess cognitive functioning of astronauts in flight. The Spaceflight Cognitive Assessment Tool for Windows (WinSCAT) is the automated cognitive assessment tool currently deployed onboard the International Space Station (ISS). WinSCAT provides astronauts and flight surgeons with objective data to monitor neurocognitive functioning. WinSCAT assesses 5 discrete cognitive domains, is sensitive to changes in cognitive functioning, and was designed to be completed in less than 15 minutes. However, WinSCAT does not probe other areas of cognitive functioning that might be important to mission success. Researchers recently have developed batteries that may expand current capabilities, such as increased sensitivity to subtle fluctuations in cognitive functioning. Therefore, we engaged in a systematic process review in order to improve upon our current capabilities and incorporate new advances in cognitive assessment. 
This process included a literature review of newer measures of neurocognitive assessment, surveys of operational flight surgeons at NASA regarding needs and gaps in our capabilities, and expert panel review of candidate cognitive measures and assessment issues and procedures. SIGNIFICANCE: Our process and the results that flowed from it may be helpful to aeromedical professionals charged with transitioning research findings to operational use. Our specific findings regarding cognitive assessment tools are of significance to professionals who must assess readiness to perform in mission-critical situations in environments involving threats to cognition and performance.

  13. Selected aspects of microelectronics technology and applications: Numerically controlled machine tools. Technology trends series no. 2

    NASA Astrophysics Data System (ADS)

    Sigurdson, J.; Tagerud, J.

    1986-05-01

    A UNIDO publication about machine tools with automatic control discusses the following: (1) numerical control (NC) machine tool perspectives, definition of NC, flexible manufacturing systems, robots and their industrial application, research and development, and sensors; (2) experience in developing a capability in NC machine tools; (3) policy issues; (4) procedures for retrieval of relevant documentation from data bases. Diagrams, statistics, bibliography are included.

  14. FleetDASH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, Mark R

    2017-09-06

    FleetDASH helps federal fleet managers maximize their use of alternative fuel. This presentation explains how the dashboard works and demonstrates the newest capabilities added to the tool. It also reviews complementary online tools available to fleet managers on the Alternative Fuel Data Center.

  15. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D.; Chu, C.; Mlynczak, P.

    2014-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products. The flagship products include TOA broadband shortwave and longwave observed fluxes; computed TOA and surface fluxes; and cloud, aerosol, and other atmospheric parameters. These datasets encompass a wide range of temporal and spatial resolutions suited to specific applications. We thus offer time resolutions that range from instantaneous to monthly means, with spatial resolutions that range from the 20-km footprint to global scales. The 14-year record is mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. CERES products are also used by the remote sensing community for climatological studies. In recent years, however, CERES products have been used by an even broader audience, including the green energy, health, and environmental research communities, among others. Because of that, the CERES project has implemented a now well-established web-oriented Ordering and Visualization Tool (OVT), which is well into its fifth year of development. To help facilitate comprehensive quality control of CERES products, the OVT team began introducing a series of specialized functions. These include 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and other specialized scientific application capabilities. Over time, products of increasingly high temporal and spatial resolution are being made available to the public through the CERES OVT. These high-resolution products require accessing the existing long-term archive, and thus the reading of many very large netCDF or HDF files, which poses a real challenge to near-instantaneous visualization. An overview of the CERES OVT basic functions and QC capabilities, as well as future steps in expanding its capabilities, will be presented at the meeting.

  16. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
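The failure-propagation idea described above reduces, at its simplest, to reachability on a directed graph from a failure-mode origin to nodes that can observe the effect. The following is a generic illustration of that concept, not NASA's FFM tooling; the component names and edges are invented:

```python
# Propagate a failure effect through a directed component graph and report
# every node the effect can reach (including any observation/test points).
# Purely illustrative; graph topology is hypothetical.
from collections import deque

def affected_nodes(graph, origin):
    """Breadth-first propagation of a failure effect from its origin node."""
    seen = {origin}
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# A stuck valve propagates downstream through the feed line to the pump,
# the thrust output, and a pressure sensor that can observe the effect.
graph = {
    "valve": ["feed_line"],
    "feed_line": ["pump", "pressure_sensor"],
    "pump": ["thrust"],
}
print(sorted(affected_nodes(graph, "valve")))
```

Diagnosis then runs this logic in reverse: given which observation points report off-nominal readings, the candidate failure modes are those whose propagation sets are consistent with the observed pattern.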

  17. Crowdsourced Contributions to the Nation's Geodetic Elevation Infrastructure

    NASA Astrophysics Data System (ADS)

    Stone, W. A.

    2014-12-01

    NOAA's National Geodetic Survey (NGS), a United States Department of Commerce agency, is engaged in providing the nation's fundamental positioning infrastructure - the National Spatial Reference System (NSRS) - which includes the framework for latitude, longitude, and elevation determination as well as various geodetic models, tools, and data. Capitalizing on Global Navigation Satellite System (GNSS) technology for improved access to the nation's precise geodetic elevation infrastructure requires use of a geoid model, which relates GNSS-derived heights (ellipsoid heights) to traditional elevations (orthometric heights). NGS is facilitating the use of crowdsourced GNSS observations collected at published elevation control stations by the professional surveying, geospatial, and scientific communities to help improve NGS's geoid modeling capability. This collocation of published elevation data and newly collected GNSS data ties the two height systems together. This effort in turn supports enhanced access to accurate elevation information across the nation, thereby benefiting all users of geospatial data. By partnering with the public in this collaborative effort, NGS is not only helping facilitate improvements to the elevation infrastructure for all users but also empowering users of NSRS with the capability to do their own high-accuracy positioning. The educational outreach facet of this effort helps inform the public, including the scientific community, about the utility of various NGS tools, including the widely used Online Positioning User Service (OPUS). OPUS plays a key role in providing user-friendly, high-accuracy access to NSRS, with optional sharing of results with NGS and the public. 
All who are interested in helping evolve and improve the nationwide elevation determination capability are invited to participate in this nationwide partnership and to learn more about the geodetic infrastructure which is a vital component of viable spatial data for many disciplines, including the geosciences.
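The geoid model's role described above rests on the standard first-order height relation: an orthometric height H is obtained from a GNSS ellipsoid height h and the geoid undulation N as H ≈ h − N. A minimal sketch, with hypothetical values rather than output from an actual NGS geoid grid:

```python
# Convert a GNSS-derived ellipsoid height to an orthometric height using
# the geoid undulation: H = h - N. The numeric values are invented for
# illustration; real work would interpolate N from an NGS geoid model.

def orthometric_height(ellipsoid_height_m, geoid_undulation_m):
    """H = h - N (first-order relation used with geoid models)."""
    return ellipsoid_height_m - geoid_undulation_m

h = 251.374   # hypothetical GNSS ellipsoid height, meters
N = -28.412   # hypothetical geoid undulation at the station, meters
print(round(orthometric_height(h, N), 3))  # 279.786
```

Crowdsourced GNSS observations at stations with published orthometric heights give independent (h, H) pairs, from which improved N values can be derived to refine the geoid model.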

  18. A virtual reality-based system integrated with fmri to study neural mechanisms of action observation-execution: A proof of concept study

    PubMed Central

    Adamovich, S.V.; August, K.; Merians, A.; Tunik, E.

    2017-01-01

    Purpose Emerging evidence shows that interactive virtual environments (VEs) may be a promising tool for studying sensorimotor processes and for rehabilitation. However, the potential of VEs to recruit action observation-execution neural networks is largely unknown. For the first time, a functional MRI-compatible virtual reality system (VR) has been developed to provide a window into studying brain-behavior interactions. This system is capable of measuring the complex span of hand-finger movements and simultaneously streaming this kinematic data to control the motion of representations of human hands in virtual reality. Methods In a blocked fMRI design, thirteen healthy subjects observed, with the intent to imitate (OTI), finger sequences performed by the virtual hand avatar seen in 1st person perspective and animated by pre-recorded kinematic data. Following this, subjects imitated the observed sequence while viewing the virtual hand avatar animated by their own movement in real-time. These blocks were interleaved with rest periods during which subjects viewed static virtual hand avatars and control trials in which the avatars were replaced with moving non-anthropomorphic objects. Results We show three main findings. First, both observation with intent to imitate and imitation with real-time virtual avatar feedback, were associated with activation in a distributed frontoparietal network typically recruited for observation and execution of real-world actions. Second, we noted a time-variant increase in activation in the left insular cortex for observation with intent to imitate actions performed by the virtual avatar. Third, imitation with virtual avatar feedback (relative to the control condition) was associated with a localized recruitment of the angular gyrus, precuneus, and extrastriate body area, regions which are (along with insular cortex) associated with the sense of agency. 
Conclusions Our data suggest that the virtual hand avatars may have served as disembodied training tools in the observation condition and as embodied “extensions” of the subject’s own body (pseudo-tools) in the imitation condition. These data advance our understanding of brain-behavior interactions when performing actions in VEs and have implications for the development of observation- and imitation-based VR rehabilitation paradigms. PMID:19531876

  19. A life scientist's gateway to distributed data management and computing: the PathPort/ToolBus framework.

    PubMed

    Eckart, J Dana; Sobral, Bruno W S

    2003-01-01

    The emergent needs of the bioinformatics community challenge current information systems. The pace of biological data generation far outstrips Moore's Law. Therefore, a gap continues to widen between the capability to produce biological (molecular and cell) data sets and the capability to manage and analyze these data sets. As a result, Federal investments in large data set generation produce diminishing returns in terms of the community's ability to understand biology and leverage that understanding to make scientific and technological advances that improve society. We are building an open framework to address various data management issues, including data and tool interoperability, nomenclature and data communication standardization, and database integration. PathPort, short for Pathogen Portal, employs a generic, web-services-based framework to deal with some of the problems identified by the bioinformatics community. The motivating research goal of a scalable system to provide data management and analysis for key pathosystems, especially relating to molecular data, has resulted in a generic framework with two major components. On the server side, we employ web services. On the client side, a Java application called ToolBus acts as a client-side "bus" for contacting data and tools and viewing results through a single, consistent user interface.

  20. Looking into the crystal ball: future device learning using hybrid e-beam and optical lithography (Keynote Paper)

    NASA Astrophysics Data System (ADS)

    Steen, S. E.; McNab, S. J.; Sekaric, L.; Babich, I.; Patel, J.; Bucchignano, J.; Rooks, M.; Fried, D. M.; Topol, A. W.; Brancaccio, J. R.; Yu, R.; Hergenrother, J. M.; Doyle, J. P.; Nunes, R.; Viswanathan, R. G.; Purushothaman, S.; Rothwell, M. B.

    2005-05-01

    Semiconductor process development teams are faced with increasing process and integration complexity while the time between lithographic capability and volume production has remained more or less constant over the last decade. Lithography tools have often gated the volume checkpoint of a new device node on the ITRS roadmap. The processes have to be redeveloped after the tooling capability for the new groundrule is obtained since straight scaling is no longer sufficient. In certain cases the time window that the process development teams have is actually decreasing. In the extreme, some forecasts are showing that by the time the 45nm technology node is scheduled for volume production, the tooling vendors will just begin shipping the tools required for this technology node. To address this time pressure, IBM has implemented a hybrid-lithography strategy that marries the advantages of optical lithography (high throughput) with electron beam direct write lithography (high resolution and alignment capability). This hybrid-lithography scheme allows for the timely development of semiconductor processes for the 32nm node, and beyond. In this paper we will describe how hybrid lithography has enabled early process integration and device learning and how IBM applied e-beam & optical hybrid lithography to create the world's smallest working SRAM cell.

  1. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response, and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least-cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  2. Improving Water Management Decision Support Tools Using NASA Satellite and Modeling Data

    NASA Astrophysics Data System (ADS)

    Toll, D. L.; Arsenault, K.; Nigro, J.; Pinheiro, A.; Engman, E. T.; Triggs, J.; Cosgrove, B.; Alonge, C.; Boyle, D.; Allen, R.; Townsend, P.; Ni-Meister, W.

    2006-05-01

    One of twelve Applications of National priority within NASA's Applied Science Program, the Water Management Program Element addresses concerns and decision making related to water availability, water forecasting, and water quality. The goal of the Water Management Program Element is to encourage water management organizations to use NASA Earth science data, model products, technology, and other capabilities in their decision support tools for problem solving. The Water Management Program Element partners with Federal agencies, academia, and private firms, and may include international organizations. This paper further describes the Water Management Program with the objective of informing the applications community of the potential opportunities for using NASA science products for problem solving. We will illustrate some ongoing Water Management application projects evaluating and benchmarking NASA data with partnering federal agencies and their decision support tools: 1) Environmental Protection Agency for water quality; 2) Bureau of Reclamation for water supply, demand, and forecast; and 3) NOAA National Weather Service for improved weather prediction. Examples of the types of NASA contributions to these agency decision support tools include: 1) satellite observations within models to help estimate water storage, i.e., snow water equivalent, soil moisture, aquifer volumes, or reservoir storage; 2) model-derived products, i.e., evapotranspiration, precipitation, runoff, ground water recharge, and other 4-dimensional data assimilation products; 3) improved water quality assessments, using improved inputs from NASA models (precipitation, evaporation) and satellite observations (e.g., temperature, turbidity, land cover) to nonpoint source models; and 4) water (i.e., precipitation) and temperature predictions from days to decades over local, regional, and global scales.

  3. A parallel calibration utility for WRF-Hydro on high performance computers

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, C.; Kotamarthi, V. R.

    2017-12-01

    A successful modeling of complex hydrological processes comprises establishing an integrated hydrological model which simulates the hydrological processes in each water regime, calibrating and validating the model performance based on observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPC). The recently developed WRF-Hydro modeling system provides a significant advancement in the capability to simulate regional water cycles more completely. The WRF-Hydro model has a large range of parameters, such as those in the input table files — GENPARM.TBL, SOILPARM.TBL and CHANPARM.TBL — and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations in order to obtain good modeling performance. Having a parameter calibration tool specifically for automated calibration and uncertainty estimation of the WRF-Hydro model can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of the model-independent parameter estimation and uncertainty analysis tool, PEST, and enabled it to run on HPC with the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates specifically for WRF-Hydro model calibration and uncertainty analysis. Here we will present a flood case study that occurred in April 2013 over the Midwest. The sensitivities and uncertainties are analyzed using the customized PEST tool we developed.
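
The calibration loop that such a tool automates can be illustrated with a toy, model-independent parameter search in the spirit of PEST. The linear-reservoir model, the recession coefficient k, and all numbers below are invented for illustration; this is not WRF-Hydro or PEST code.

```python
# Toy model-independent calibration: recover a linear-reservoir recession
# coefficient k by minimizing RMSE against "observed" discharge.

def simulate(k, q0=10.0, steps=20):
    """Hypothetical linear reservoir: Q[t+1] = Q[t] * (1 - k)."""
    q, out = q0, []
    for _ in range(steps):
        out.append(q)
        q *= (1.0 - k)
    return out

def rmse(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

observed = simulate(0.15)  # synthetic "truth" standing in for gauge data

def calibrate(lo=0.0, hi=0.5, iters=40):
    """Ternary search on the (unimodal) RMSE objective over [lo, hi]."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if rmse(simulate(m1), observed) < rmse(simulate(m2), observed):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

k_best = calibrate()
```

A real run replaces `simulate` with a full WRF-Hydro execution, which is why the parallel, HPC-scheduled version of the search matters.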

  4. TBEST model enhancements: parcel level demographic data capabilities and exploration of enhanced trip attraction capabilities.

    DOT National Transportation Integrated Search

    2011-09-01

    "FDOT, in pursuit of its role to assist in providing public transportation services in Florida, has made a substantial : research investment in a travel demand forecasting tool for public transportation known as Transit Boardings : Estimation and Sim...

  5. New smoke predictions for Alaska in NOAA’s National Air Quality Forecast Capability

    NASA Astrophysics Data System (ADS)

    Davidson, P. M.; Ruminski, M.; Draxler, R.; Kondragunta, S.; Zeng, J.; Rolph, G.; Stajner, I.; Manikin, G.

    2009-12-01

    Smoke from wildfire is an important component of fine particle pollution, which is responsible for tens of thousands of premature deaths each year in the US. In Alaska, wildfire smoke is the leading cause of poor air quality in summer. Smoke forecast guidance helps air quality forecasters and the public take steps to limit exposure to airborne particulate matter. A new smoke forecast guidance tool, built by a cross-NOAA team, leverages efforts of NOAA’s partners at the USFS on wildfire emissions information, and with EPA, in coordinating with state/local air quality forecasters. Required operational deployment criteria, in categories of objective verification, subjective feedback, and production readiness, have been demonstrated in experimental testing during 2008-2009, for addition to the operational products in NOAA's National Air Quality Forecast Capability. The Alaska smoke forecast tool is an adaptation of NOAA’s smoke predictions implemented operationally for the lower 48 states (CONUS) in 2007. The tool integrates satellite information on the location of wildfires with weather (North American mesoscale model) and smoke dispersion (HYSPLIT) models to produce daily predictions of smoke transport for Alaska, in binary and graphical formats. Hour-by-hour predictions at 12 km grid resolution of smoke at the surface and in the column are provided each day by 13 UTC, extending through midnight the next day. Forecast performance is monitored against benchmark criteria for accuracy and reliability. While wildfire activity in the CONUS is year-round, the intense wildfire activity in AK is limited to the summer. Initial experimental testing during summer 2008 was hindered by unusually limited wildfire activity and very cloudy conditions. In contrast, heavier than average wildfire activity during summer 2009 provided a representative basis (more than 60 days of wildfire smoke) for demonstrating required prediction accuracy.
A new satellite observation product was developed for routine near-real-time verification of these predictions. The footprint of the predicted smoke from identified fires is verified against satellite observations of the spatial extent of smoke aerosols (5 km resolution). Because they are based on geostationary aerosol optical depth measurements, which provide good time resolution of the horizontal spatial extent of the plumes, these observations do not yield quantitative concentrations of smoke particles at the surface. Predicted surface smoke concentrations are consistent with the limited number of in situ observations of total fine particle mass from all sources; however, they are much higher than those predicted for most CONUS fires. To assess the uncertainty associated with fire emissions estimates, sensitivity analyses are in progress.
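
One common way to score a predicted smoke footprint against a satellite-observed smoke mask is the Figure of Merit in Space, i.e. the intersection-over-union of the two gridded masks. The metric choice and the tiny grids below are illustrative assumptions, not NOAA's verification procedure or data.

```python
# Figure of Merit in Space (FMS) for two boolean smoke masks on the same grid.

def fms(predicted, observed):
    """FMS = |P ∩ O| / |P ∪ O| over equal-shape 0/1 grids."""
    inter = union = 0
    for prow, orow in zip(predicted, observed):
        for p, o in zip(prow, orow):
            inter += 1 if (p and o) else 0
            union += 1 if (p or o) else 0
    return inter / union if union else 1.0

pred = [[1, 1, 0],
        [1, 0, 0],
        [0, 0, 0]]
obs  = [[1, 1, 0],
        [0, 0, 0],
        [0, 0, 0]]
score = fms(pred, obs)  # 2 overlapping cells out of 3 flagged in total
```

A score of 1 means perfect overlap; 0 means the predicted and observed plumes are disjoint.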

  6. The Requirements and Design of the Rapid Prototyping Capabilities System

    NASA Astrophysics Data System (ADS)

    Haupt, T. A.; Moorhead, R.; O'Hara, C.; Anantharaj, V.

    2006-12-01

    The Rapid Prototyping Capabilities (RPC) system will provide the capability to rapidly evaluate innovative methods of linking science observations. To this end, the RPC will provide the capability to integrate the software components and tools needed to evaluate the use of a wide variety of current and future NASA sensors, numerical models, research results, model outputs, and knowledge, collectively referred to as "resources". It is assumed that the resources are geographically distributed, and thus RPC will provide support for the location transparency of the resources. The RPC system must provide support for: (1) discovery, semantic understanding, and secure access and transport mechanisms for data products available from the known data providers; (2) data assimilation and geo-processing tools for all data transformations needed to match given data products to the model input requirements; (3) model management, including catalogs of models and model metadata, and mechanisms for creating environments for model execution; and (4) tools for model output analysis and model benchmarking. The challenge involves developing a cyberinfrastructure for a coordinated aggregate of software, hardware and other technologies necessary to facilitate RPC experiments, as well as human expertise, to provide an integrated, "end-to-end" platform to support the RPC objectives. Such aggregation is to be achieved through a horizontal integration of loosely coupled services. The cyberinfrastructure comprises several software layers. At the bottom, the Grid fabric encompasses network protocols, optical networks, computational resources, storage devices, and sensors. At the top, applications use workload managers to coordinate their access to physical resources. Applications are not tightly bound to a single physical resource. Instead, they bind dynamically to resources (i.e., they are provisioned) via a common grid infrastructure layer.
For the RPC system, the cyberinfrastructure must support organizing computations (or "data transformations" in general) into complex workflows with resource discovery, automatic resource allocation, monitoring, and provenance preservation, as well as aggregating heterogeneous, distributed data into knowledge databases. Such service orchestration is the responsibility of the "collective services" layer. For RPC, this layer will be based on the Java Business Integration (JBI, [JSR-208]) specification, a standards-based integration platform that combines messaging, web services, data transformation, and intelligent routing to reliably connect and coordinate the interaction of significant numbers of diverse applications (plug-in components) across organizational boundaries. The JBI concept is a new approach to integration that can provide the underpinnings for a loosely coupled, highly distributed integration network that can scale beyond the limits of currently used hub-and-spoke brokers. This presentation discusses the requirements, design, and early prototype of the NASA-sponsored RPC system under development at Mississippi State University, demonstrating the integration of data provisioning mechanisms, data transformation tools, and computational models into a single interoperable system enabling rapid execution of RPC experiments.

  7. Comparison of Performance Predictions for New Low-Thrust Trajectory Tools

    NASA Technical Reports Server (NTRS)

    Polsgrove, Tara; Kos, Larry; Hopkins, Randall; Crane, Tracie

    2006-01-01

    Several low-thrust trajectory optimization tools have been developed over the last 3½ years by the Low Thrust Trajectory Tools development team. This toolset includes both low-to-medium fidelity and high fidelity tools, which allow the analyst to quickly explore a wide mission trade space and perform advanced mission design. These tools were tested using a set of reference trajectories that exercised each tool's unique capabilities. This paper compares the performance predictions of the various tools against several of the reference trajectories. The intent is to verify agreement between the high fidelity tools and to quantify the performance prediction differences between tools of different fidelity levels.

  8. Share Repository Framework: Component Specification and Ontology

    DTIC Science & Technology

    2008-04-23

    Palantir Technologies has created one such software application to support the DoD intelligence community by providing robust capabilities for...managing data from various sources. The Palantir tool is based on user-defined ontologies and supports multiple representation and analysis tools

  9. Unveiling the Diffuse, Neutral Interstellar Medium: Absorption Spectroscopy of Galactic Hydrogen

    NASA Astrophysics Data System (ADS)

    Murray, Claire Elizabeth

    The formation of stars and the evolution of galaxies depend on the cycle of interstellar matter between supernova-expelled plasma and molecule-rich gas. At the center of this cycle is multiphase neutral hydrogen (HI), whose physical conditions provide key ingredients to theoretical models. However, constraints for HI properties require measurements of gas emission and absorption which have been severely limited by previous observational capabilities. In this thesis, I present the largest survey of Galactic HI absorption ever undertaken with the Karl G. Jansky Very Large Array (VLA). The survey, 21 cm Spectral Line Observations of Neutral Gas with the VLA (21-SPONGE), is a statistical study of HI in all phases using direct absorption measurements. Leveraging novel calibration techniques, I demonstrate the capability of the VLA to detect a significant sample of 21 cm absorption lines from warm, diffuse HI. To maximize observational sensitivity, I stack the 21-SPONGE spectra and detect a pervasive signature of the warm neutral medium in absorption. The inferred excitation (or spin) temperature is consistent with existing estimates, yet higher than predictions from theoretical models of collisional HI excitation. This suggests that radiative feedback via resonant scattering of Lyman-α photons, known as the Wouthuysen-Field effect, is influential, with important implications for cosmological 21 cm observations. Next, I compare 21-SPONGE with synthetic HI spectra from 3D numerical simulations using a new, objective decomposition and radiative transfer tool. I quantify the recovery of HI structures and their properties by Gaussian-fitted 21 cm spectral lines for the first time. I find that 21 cm absorption line shapes are sensitive to simulated physics, and demonstrate that my analysis method is a powerful tool for diagnosing neutral ISM conditions.
Finally, I compare properties inferred from synthetic spectra with "true" simulation results to construct a bias correction function for estimating HI properties. I apply this correction to the mass distribution of HI as a function of temperature from 21-SPONGE, and find a significant fraction of thermally unstable gas. This confirms that non-steady radiative and dynamical processes, such as turbulence and supernovae, have a strong influence on the thermodynamic state of the ISM.
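
The stacking step described above works because averaging N aligned spectra leaves a common signal intact while suppressing uncorrelated noise by roughly sqrt(N). The following sketch uses purely synthetic numbers (a 2% absorption dip, Gaussian channel noise), not 21-SPONGE data.

```python
# Stacking synthetic spectra: a weak absorption feature buried in per-spectrum
# noise becomes a clear detection in the average of many spectra.
import random

random.seed(42)
N, CH = 400, 64
signal = [0.02 if 28 <= c < 36 else 0.0 for c in range(CH)]  # 2% dip, 8 channels

def noisy_spectrum():
    """One spectrum: common signal plus Gaussian noise of sigma = 0.05."""
    return [s + random.gauss(0.0, 0.05) for s in signal]

# channel-wise average of N spectra; noise drops to ~0.05 / sqrt(400) = 0.0025,
# so the 2% feature is recovered at high significance
stack = [sum(col) / N for col in zip(*(noisy_spectrum() for _ in range(N)))]
depth = sum(stack[28:36]) / 8  # mean depth over the feature channels
```

In the single-spectrum case the dip is 0.4 sigma per channel; in the stack it is about 8 sigma per channel.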

  10. Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn; Davis, Tom

    2013-01-01

    NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready time predictions and departure runway assignments, 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and computes release times, and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the Tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.

  11. Development and Demonstration of a Computational Tool for the Analysis of Particle Vitiation Effects in Hypersonic Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Perkins, Hugh Douglas

    2010-01-01

    In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code a series of computations were performed for a model hypersonic propulsion test facility and scramjet. Parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.

  12. The GMT/MATLAB Toolbox

    NASA Astrophysics Data System (ADS)

    Wessel, Paul; Luis, Joaquim F.

    2017-02-01

    The GMT/MATLAB toolbox is a basic interface between MATLAB® (or Octave) and GMT, the Generic Mapping Tools, which allows MATLAB users full access to all GMT modules. Data may be passed between the two programs using intermediate MATLAB structures that organize the metadata needed; these are produced when GMT modules are run. In addition, standard MATLAB matrix data can be used directly as input to GMT modules. The toolbox improves interoperability between two widely used tools in the geosciences and extends the capability of both tools: GMT gains access to the powerful computational capabilities of MATLAB while the latter gains the ability to access specialized gridding algorithms and can produce publication-quality PostScript-based illustrations. The toolbox is available on all platforms and may be downloaded from the GMT website.

  13. En route Spacing Tool: Efficient Conflict-free Spacing to Flow-Restricted Airspace

    NASA Technical Reports Server (NTRS)

    Green, S.

    1999-01-01

    This paper describes the Air Traffic Management (ATM) problem within the U.S. of flow-restricted en route airspace, an assessment of its impact on airspace users, and a set of near-term tools and procedures to resolve the problem. The FAA is committed, over the next few years, to deploy the first generation of modem ATM decision support tool (DST) technology under the Free-Flight Phase-1 (FFp1) program. The associated en route tools include the User Request Evaluation Tool (URET) and the Traffic Management Advisor (TMA). URET is an initial conflict probe (ICP) capability that assists controllers with the detection and resolution of conflicts in en route airspace. TMA orchestrates arrivals transitioning into high-density terminal airspace by providing controllers with scheduled times of arrival (STA) and delay feedback advisories to assist with STA conformance. However, these FFPl capabilities do not mitigate the en route Miles-In-Trail (MIT) restrictions that are dynamically applied to mitigate airspace congestion. National statistics indicate that en route facilities (Centers) apply Miles-In-Trail (MIT) restrictions for approximately 5000 hours per month. Based on results from this study, an estimated 45,000 flights are impacted by these restrictions each month. Current-day practices for implementing these restrictions result in additional controller workload and an economic impact of which the fuel penalty alone may approach several hundred dollars per flight. To mitigate much of the impact of these restrictions on users and controller workload, a DST and procedures are presented. The DST is based on a simple derivative of FFP1 technology that is designed to introduce a set of simple tools for flow-rate (spacing) conformance and integrate them with conflict-probe capabilities. The tool and associated algorithms are described based on a concept prototype implemented within the CTAS baseline in 1995. 
A traffic scenario is used to illustrate the controller's use of the tool, and potential display options are presented for future controller evaluation.

  14. Development of an interpretive simulation tool for the proton radiography technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, M. C., E-mail: levymc@stanford.edu; Lawrence Livermore National Laboratory, Livermore, California 94551; Ryutov, D. D.

    2015-03-15

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper, we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high resolution proton radiograph. The present tool’s numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from particle-in-cell or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field “primitives” is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagnetic field elements. We show an example application of the tool in connection to experimental observations of the Weibel instability in counterstreaming plasmas, using ∼10⁸ particles generated from a realistic laser-driven point-like proton source, imaging fields which cover volumes of ∼10 mm³. Insights derived from this application show that the tool can support understanding of HED plasmas.
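
The synthetic-radiograph idea can be sketched in one dimension: protons pick up a small deflection from a field "primitive", drift to a detector plane, and are binned into a radiograph. The Gaussian deflection profile, geometry, and particle count below are toy assumptions, not the paper's code or physics.

```python
# 1-D toy synthetic radiograph: deflect protons with a hypothetical field
# primitive, drift them to a detector, and histogram their arrival positions.
import math
import random

random.seed(1)
N_PROTONS, L_DRIFT, BINS = 10000, 10.0, 40

def deflection(x):
    """Toy primitive: small angle (radians) focusing protons toward x = 0."""
    return -0.05 * x * math.exp(-x * x)

radiograph = [0] * BINS
for _ in range(N_PROTONS):
    x0 = random.uniform(-2.0, 2.0)           # position entering the field region
    x_det = x0 + L_DRIFT * deflection(x0)    # position at the detector plane
    b = int((x_det + 4.0) / 8.0 * BINS)      # map [-4, 4) onto the bins
    if 0 <= b < BINS:
        radiograph[b] += 1
```

The pile-up of counts near the center is the 1-D analogue of the intensity features (and, for stronger fields, caustics) that the real tool helps attribute to specific field elements.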

  15. EMU battery/SMM power tool characterization study

    NASA Technical Reports Server (NTRS)

    Palandati, C.

    1982-01-01

    The power tool which will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. The results show that the EMU battery is capable of operating the power tool within the pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  16. The Earth Phenomena Observing System: Intelligent Autonomy for Satellite Operations

    NASA Technical Reports Server (NTRS)

    Ricard, Michael; Abramson, Mark; Carter, David; Kolitz, Stephan

    2003-01-01

    Earth monitoring systems of the future may include large numbers of inexpensive small satellites, tasked in a coordinated fashion to observe both long term and transient targets. For best performance, a tool which helps operators optimally assign targets to satellites will be required. We present the design of algorithms developed for real-time optimized autonomous planning of large numbers of small single-sensor Earth observation satellites. The algorithms will reduce requirements on the human operators of such a system of satellites, ensure good utilization of system resources, and provide the capability to dynamically respond to temporal terrestrial phenomena. Our initial real-time system model consists of approximately 100 satellites and a large number of points of interest on Earth (e.g., hurricanes, volcanoes, and forest fires), with the objective to maximize the total science value of observations over time. Several options for calculating the science value of observations include: 1) total observation time, 2) number of observations, and 3) the quality (a function of, e.g., sensor type, range, and slant angle) of the observations. An integrated approach using integer programming, optimization, and astrodynamics is used to calculate optimized observation and sensor tasking plans.
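
The assignment problem at the heart of this can be illustrated with a much simpler stand-in: a greedy heuristic that gives each satellite the highest-value unclaimed target it can see in the current slot. The paper uses integer programming; the greedy rule, satellite names, and science values below are invented for illustration.

```python
# Greedy one-slot target assignment maximizing (locally) the science value.

def schedule(visibility, values):
    """visibility[sat] = set of target ids that satellite can observe now."""
    assigned, total = {}, 0.0
    for sat, targets in visibility.items():
        # highest-value target this satellite sees that no one has claimed yet
        best = max((t for t in targets if t not in assigned.values()),
                   key=lambda t: values[t], default=None)
        if best is not None:
            assigned[sat] = best
            total += values[best]
    return assigned, total

vis = {"sat1": {"hurricane", "volcano"}, "sat2": {"volcano", "fire"}}
vals = {"hurricane": 9.0, "volcano": 7.0, "fire": 4.0}
plan, score = schedule(vis, vals)
```

A greedy pass like this is fast but can be suboptimal across slots, which is exactly why an integer-programming formulation over the full horizon is attractive.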

  17. The All-Sky Automated Survey for Supernovae (ASAS-SN) Light Curve Server v1.0

    NASA Astrophysics Data System (ADS)

    Kochanek, C. S.; Shappee, B. J.; Stanek, K. Z.; Holoien, T. W.-S.; Thompson, Todd A.; Prieto, J. L.; Dong, Subo; Shields, J. V.; Will, D.; Britt, C.; Perzanowski, D.; Pojmański, G.

    2017-10-01

    The All-Sky Automated Survey for Supernovae (ASAS-SN) is working toward imaging the entire visible sky every night to a depth of V ∼ 17 mag. The present data cover the sky and span ∼2-5 years with ∼100-400 epochs of observation. The data should contain some ∼1 million variable sources, and the ultimate goal is to have a database of these observations publicly accessible. We describe here a first step, a simple but unprecedented web interface https://asas-sn.osu.edu/ that provides an up-to-date aperture photometry light curve for any user-selected sky coordinate. The V-band photometry is obtained using a two-pixel (16.″0) radius aperture and is calibrated against the APASS catalog. Because the light curves are produced in real time, this web tool is relatively slow and can only be used for small samples of objects. However, it also imposes no selection bias on the part of the ASAS-SN team, allowing the user to obtain a light curve for any point on the celestial sphere. We present the tool, describe its capabilities, limitations, and known issues, and provide a few illustrative examples.
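
The fixed-aperture step can be sketched as summing background-subtracted pixel counts within a 2-pixel-radius circle around the target. The tiny synthetic frame, the sky value, and the function name are assumptions for illustration, not the ASAS-SN pipeline.

```python
# Simple circular-aperture photometry on a small synthetic image.

def aperture_flux(img, cx, cy, r=2.0, sky=0.0):
    """Sum (pixel - sky) for pixels whose centers fall within radius r of (cx, cy)."""
    flux = 0.0
    for y, row in enumerate(img):
        for x, val in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                flux += val - sky
    return flux

# 5x5 frame: a uniform sky of 10 counts with a 100-count star at (2, 2)
image = [[10.0] * 5 for _ in range(5)]
image[2][2] += 100.0
flux = aperture_flux(image, cx=2, cy=2, r=2.0, sky=10.0)
```

Calibrating against a catalog such as APASS then amounts to choosing a zero point so that -2.5 * log10(flux) + ZP reproduces the catalog magnitudes of reference stars.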

  18. Chemotaxis of cancer cells in three-dimensional environment monitored label-free by quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Schnekenburger, Jürgen; Ketelhut, Steffi

    2017-02-01

    We investigated the capabilities of digital holographic microscopy (DHM) for label-free quantification of the response of living single cells to chemical stimuli in 3D assays. Fibrosarcoma cells were observed in a collagen matrix inside 3D chemotaxis chambers with a Mach-Zehnder interferometer-based DHM setup. From the obtained series of quantitative phase images, the migration trajectories of single cells were retrieved by automated cell tracking and subsequently analyzed for maximum migration distance and motility. Our results demonstrate DHM as a highly reliable and efficient tool for label-free quantification of chemotaxis in 2D and 3D environments.
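
The two trajectory statistics named above can be computed directly from a tracked path. The definitions used here (maximum distance from the starting point, and mean step length per frame) are common conventions and may differ in detail from the authors' exact metrics; the path coordinates are invented.

```python
# Trajectory statistics for a tracked cell path given as (x, y) per frame.
import math

def max_migration_distance(track):
    """Largest Euclidean distance from the starting position."""
    x0, y0 = track[0]
    return max(math.hypot(x - x0, y - y0) for x, y in track)

def motility(track):
    """Mean displacement per time step along the path."""
    steps = [math.hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(track, track[1:])]
    return sum(steps) / len(steps)

path = [(0, 0), (3, 4), (6, 8)]       # hypothetical positions per frame
d_max = max_migration_distance(path)
v_mean = motility(path)
```

For chemotaxis specifically one would also compare displacement along the gradient direction against total path length, but the two metrics above are the ones named in the abstract.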

  19. Study of Some Planetary Atmospheres Features by Probe Entry and Descent Simulations

    NASA Technical Reports Server (NTRS)

    Gil, P. J. S.; Rosa, P. M. B.

    2005-01-01

    Planetary atmospheres are characterized by analyzing their effects on the entry and descent trajectories of probes. Emphasis is on the most important variables that characterize atmospheres, e.g., the density profile with altitude. Probe trajectories are numerically determined with ENTRAP, a multi-purpose computational tool under development for entry and descent trajectory simulations, capable of taking into account many features and perturbations. Real data from the Mars Pathfinder mission are used. The goal is to be able to determine the atmosphere structure more accurately by observing real trajectories, and to establish what changes to expect in probe descent trajectories if atmospheres have different properties than the ones assumed initially.
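
The sensitivity being exploited can be shown with a minimal ballistic-entry sketch: with an exponential density profile rho(h) = rho0 * exp(-h / H), the deceleration history shifts with the scale height H. The constants are rough Mars-like values and the constant-flight-path-angle integration is a deliberate simplification; none of this is ENTRAP.

```python
# Crude ballistic entry through an exponential atmosphere (gravity neglected).
import math

RHO0, H = 0.020, 11000.0   # surface density (kg/m^3) and scale height (m), assumed
BETA = 65.0                # ballistic coefficient m / (Cd * A), kg/m^2, assumed

def decel(h, v):
    """Drag deceleration a = rho(h) * v^2 / (2 * beta)."""
    return RHO0 * math.exp(-h / H) * v * v / (2.0 * BETA)

def peak_deceleration(v0=5500.0, h0=120000.0, gamma=math.radians(15), dt=0.1):
    """Euler-integrate dv/dt = -a, dh/dt = -v*sin(gamma); return the peak of a."""
    v, h, peak = v0, h0, 0.0
    while h > 0 and v > 100:
        a = decel(h, v)
        peak = max(peak, a)
        v -= a * dt
        h -= v * math.sin(gamma) * dt
    return peak

peak_g = peak_deceleration() / 9.81
```

Because the peak deceleration and its altitude depend directly on H and rho0, comparing a simulated profile like this with an observed entry trajectory constrains the density structure.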

  20. High-power graphic computers for visual simulation: a real-time--rendering revolution

    NASA Technical Reports Server (NTRS)

    Kaiser, M. K.

    1996-01-01

    Advances in high-end graphics computers in the past decade have made it possible to render visual scenes of incredible complexity and realism in real time. These new capabilities make it possible to manipulate and investigate the interactions of observers with their visual world in ways once only dreamed of. This paper reviews how these developments have affected two preexisting domains of behavioral research (flight simulation and motion perception) and have created a new domain (virtual environment research) which provides tools and challenges for the perceptual psychologist. Finally, the current limitations of these technologies are considered, with an eye toward how perceptual psychologists might shape future developments.

  1. Picometre displacement measurements using a differential Fabry-Perot optical interferometer and an x-ray interferometer

    NASA Astrophysics Data System (ADS)

    Çelik, Mehmet; Hamid, Ramiz; Kuetgens, Ulrich; Yacoot, Andrew

    2012-08-01

    X-ray interferometry is emerging as an important tool for dimensional nanometrology, for both sub-nanometre length measurement and displacement metrology. It has been used to verify the performance of the next generation of displacement measuring optical interferometers within the European Metrology Research Programme project NANOTRACE. Within this project, a more detailed set of comparison measurements between the x-ray interferometer and a dual-channel Fabry-Perot optical interferometer (DFPI) has been made to demonstrate the capabilities of both instruments for picometre displacement metrology. The results show good agreement between the two instruments, although some minor differences of less than 5 pm have been observed.
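
A worked micro-example of the optical readout principle: a fringe phase change maps to mirror displacement via d = (lambda / 2) * (delta_phi / 2 pi). The He-Ne wavelength is an assumption for illustration; the paper's instruments, calibration, and the x-ray interferometer's lattice-based scale are not modeled here.

```python
# Phase-to-displacement conversion for a two-beam optical interferometer.
import math

LAMBDA = 633e-9  # assumed He-Ne wavelength in metres

def displacement(delta_phi):
    """Displacement for a fringe phase change delta_phi (radians)."""
    return (LAMBDA / 2.0) * delta_phi / (2.0 * math.pi)

# a full 2*pi fringe corresponds to half a wavelength of mirror motion
d = displacement(2.0 * math.pi)

# picometre regime: roughly 20 microradians of phase per picometre at 633 nm
phi_per_pm = 1e-12 / displacement(1.0)
```

Seen this way, resolving 5 pm differences means resolving fringe phase at the 100-microradian level, which is why the comparison against an independent x-ray scale is so valuable.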

  2. The contribution of morphological knowledge to French MeSH mapping for information retrieval.

    PubMed Central

    Zweigenbaum, P.; Darmoni, S. J.; Grabar, N.

    2001-01-01

    MeSH-indexed Internet health directories must provide a mapping from natural language queries to MeSH terms so that both health professionals and the general public can query their contents. We describe here the design of lexical knowledge bases for mapping French expressions to MeSH terms, and the initial evaluation of their contribution to Doc'CISMeF, the search tool of a MeSH-indexed directory of French-language medical Internet resources. The observed trend is in favor of the use of morphological knowledge as a moderate (approximately 5%) but effective factor for improving query to term mapping capabilities. PMID:11825295
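
The morphological-knowledge idea can be sketched as stripping a few derivational suffixes from a query word before matching it against MeSH entry terms, so inflected or derived forms still map. The suffix list, the mini-lexicon, and all names below are invented for illustration; the real system uses curated French lexical knowledge bases.

```python
# Toy suffix-stripping normalization for query-to-term mapping.

SUFFIXES = ("ologie", "ique", "ite", "s")  # illustrative, not the real rules

def stem(word):
    """Strip the longest matching suffix, keeping a stem of at least 4 chars."""
    word = word.lower()
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suf) and len(word) - len(suf) >= 4:
            return word[: -len(suf)]
    return word

MESH = {"hépat": "Hepatitis", "cardi": "Cardiology"}  # toy stem -> MeSH term

def map_query(word):
    return MESH.get(stem(word))
```

With this normalization, a lay query like "hépatite" and a derived form like "cardiologie" both reach a term, whereas exact string matching would miss them.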

  3. Astronomical Polarimetry with the RIT Polarization Imaging Camera

    NASA Astrophysics Data System (ADS)

    Vorobiev, Dmitry V.; Ninkov, Zoran; Brock, Neal

    2018-06-01

    In the last decade, imaging polarimeters based on micropolarizer arrays have been developed for use in terrestrial remote sensing and metrology applications. Micropolarizer-based sensors are dramatically smaller and more mechanically robust than other polarimeters with similar spectral response and snapshot capability. To determine the suitability of these new polarimeters for astronomical applications, we developed the RIT Polarization Imaging Camera to investigate the performance of these devices, with a special attention to the low signal-to-noise regime. We characterized the device performance in the lab, by determining the relative throughput, efficiency, and orientation of every pixel, as a function of wavelength. Using the resulting pixel response model, we developed demodulation procedures for aperture photometry and imaging polarimetry observing modes. We found that, using the current calibration, RITPIC is capable of detecting polarization signals as small as ∼0.3%. The relative ease of data collection, calibration, and analysis provided by these sensors suggest than they may become an important tool for a number of astronomical targets.
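
The standard demodulation for a micropolarizer array can be sketched per 2x2 super-pixel: the intensities behind 0/45/90/135-degree polarizers yield the linear Stokes parameters. The per-pixel throughput, efficiency, and orientation corrections that the paper characterizes are omitted from this idealized version.

```python
# Idealized Stokes demodulation for one 2x2 micropolarizer super-pixel.
import math

def stokes(i0, i45, i90, i135):
    """Linear Stokes parameters from four polarizer-orientation intensities."""
    I = 0.5 * (i0 + i45 + i90 + i135)            # total intensity
    Q = i0 - i90
    U = i45 - i135
    dolp = math.hypot(Q, U) / I                  # degree of linear polarization
    aolp = 0.5 * math.degrees(math.atan2(U, Q))  # angle of linear polarization
    return I, Q, U, dolp, aolp

# a 0.3% polarized beam at 0 degrees with unit total intensity:
I, Q, U, p, theta = stokes(i0=0.5015, i45=0.5, i90=0.4985, i135=0.5)
```

Detecting the ∼0.3% signals quoted in the abstract therefore comes down to measuring each intensity difference to a few parts in 10⁴, which is why the per-pixel calibration matters.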

  4. NASA'S SERVIR Gulf of Mexico Project: The Gulf of Mexico Regional Collaborative (GoMRC)

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Irwin, Daniel; Presson, Joan; Estes, Maury; Estes, Sue; Judd, Kathleen

    2006-01-01

    The Gulf of Mexico Regional Collaborative (GoMRC) is a NASA-funded project whose goal is to develop an integrated, working, prototype IT infrastructure for Earth science data, knowledge, and models for the five Gulf U.S. states and Mexico, and to demonstrate its ability to help decision-makers better understand critical Gulf-scale issues. Within this purview, the mission of this project is to provide a cross-cutting solution network and rapid prototyping capability for the Gulf of Mexico region, in order to demonstrate substantial, collaborative, multi-agency research and transitional capabilities using unique NASA data sets and models to address regional problems. SERVIR Mesoamerica is seen as an excellent existing framework that can be used to integrate observational and GIS databases, and to provide a sensor web interface, visualization and interactive analysis tools, archival functions, data dissemination, and product generation within a Rapid Prototyping concept to assist decision-makers in better understanding Gulf-scale environmental issues.

  5. Optical performance analysis of plenoptic camera systems

    NASA Astrophysics Data System (ADS)

    Langguth, Christin; Oberdörster, Alexander; Brückner, Andreas; Wippermann, Frank; Bräuer, Andreas

    2014-09-01

    Adding an array of microlenses in front of the sensor transforms the capabilities of a conventional camera to capture both spatial and angular information within a single shot. This plenoptic camera is capable of obtaining depth information and providing it for a multitude of applications, e.g., artificial refocusing of photographs. Without the need for active illumination, it represents a compact and fast optical 3D acquisition technique with reduced effort in system alignment. Since the extent of the aperture limits the range of detected angles, the observed parallax is reduced compared to common stereo imaging systems, which results in a decreased depth resolution. In addition, the gain of angular information comes at the cost of degraded spatial resolution. This trade-off requires a careful choice of the optical system parameters. We present a comprehensive assessment of the possible degrees of freedom in the design of plenoptic systems. Utilizing a custom-built simulation tool, the optical performance is quantified with respect to particular starting conditions. Furthermore, a plenoptic camera prototype is demonstrated in order to verify the predicted optical characteristics.
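    The spatial/angular trade-off can be made concrete with a toy resolution budget for a conventional ("plenoptic 1.0") layout, in which each microlens yields one spatial sample and each pixel beneath it one angular sample. The sensor size and lenslet pitch below are invented for illustration and are not taken from the prototype:

```python
def plenoptic_tradeoff(sensor_px, lenslet_pitch_px):
    """Spatial/angular resolution split for a 'plenoptic 1.0' layout.

    Assumes each microlens covers a lenslet_pitch_px x lenslet_pitch_px
    patch of sensor pixels: one spatial sample per microlens, one
    angular sample per pixel under it. Values are illustrative only.
    """
    w, h = sensor_px
    spatial = (w // lenslet_pitch_px, h // lenslet_pitch_px)
    angular = lenslet_pitch_px ** 2
    return spatial, angular

# A hypothetical 12-megapixel sensor behind a 20-pixel-pitch lenslet array:
spatial, angular = plenoptic_tradeoff((4000, 3000), 20)
# spatial -> (200, 150) samples; angular -> 400 views per spatial sample
```

The example makes the cost explicit: a 12-megapixel sensor is reduced to a 200 x 150 spatial grid in exchange for 400 angular samples per point, which is why the design parameters require careful balancing.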

  6. Innovative Near Real-Time Data Dissemination Tools Developed by the Space Weather Research Center

    NASA Astrophysics Data System (ADS)

    Mullinix, R.; Maddox, M. M.; Berrios, D.; Kuznetsova, M.; Pulkkinen, A.; Rastaetter, L.; Zheng, Y.

    2012-12-01

    Space weather affects virtually all of NASA's endeavors, from robotic missions to human exploration. Knowledge and prediction of space weather conditions are therefore essential to NASA operations. The diverse nature of currently available space environment measurements and modeling products compels the need for a single access point to such information. The Integrated Space Weather Analysis (iSWA) System provides this single point of access along with the capability to collect and catalog a vast range of sources, including both observational and model data. The NASA Goddard Space Weather Research Center heavily utilizes the iSWA System daily for research, space weather model validation, and forecasting for NASA missions. iSWA provides the capability to view and analyze near real-time space weather data from anywhere in the world. This presentation will describe the technology behind the iSWA system and describe how to use the system for space weather research, forecasting, training, education, and sharing.

  7. Insights into the physical chemistry of materials from advances in HAADF-STEM

    DOE PAGES

    Sohlberg, Karl; Pennycook, Timothy J.; Zhou, Wu; ...

    2014-11-13

    The observation that "New tools lead to new science" [P. S. Weiss, ACS Nano, 2012, 6(3), 1877–1879] is perhaps nowhere more evident than in scanning transmission electron microscopy (STEM). Advances in STEM have endowed this technique with several powerful and complementary capabilities. For example, the application of high-angle annular dark-field imaging has made possible real-space imaging at subangstrom resolution with Z-contrast (Z = atomic number). Further advances have wrought: simultaneous real-space imaging and elemental identification by using electron energy loss spectroscopy (EELS); 3-dimensional (3D) mapping by depth sectioning; monitoring of surface diffusion by time-sequencing of images; reduced electron energy imaging for probing graphenes; etc. In this paper we review how these advances, often coupled with first-principles theory, have led to interesting and important new insights into the physical chemistry of materials. We then review in detail a few specific applications that highlight some of these STEM capabilities.

  8. Progress in and prospects for fluvial flood modelling.

    PubMed

    Wheater, H S

    2002-07-15

    Recent floods in the UK have raised public and political awareness of flood risk. There is an increasing recognition that flood management and land-use planning are linked, and that decision-support modelling tools are required to address issues of climate and land-use change for integrated catchment management. In this paper, the scientific context for fluvial flood modelling is discussed, current modelling capability is considered and research challenges are identified. Priorities include (i) appropriate representation of spatial precipitation, including scenarios of climate change; (ii) development of a national capability for continuous hydrological simulation of ungauged catchments; (iii) improved scientific understanding of impacts of agricultural land-use and land-management change, and the development of new modelling approaches to represent those impacts; (iv) improved representation of urban flooding, at both local and catchment scale; (v) appropriate parametrizations for hydraulic simulation of in-channel and flood-plain flows, assimilating available ground observations and remotely sensed data; and (vi) a flexible decision-support modelling framework, incorporating developments in computing, data availability, data assimilation and uncertainty analysis.

  9. A Deep Space Orbit Determination Software: Overview and Event Prediction Capability

    NASA Astrophysics Data System (ADS)

    Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik

    2017-06-01

    This paper presents an overview of deep space orbit determination software (DSODS), as well as validation and verification results on its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in the event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 milliseconds when compared with flight-proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
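    The geometric core of eclipse prediction, one of the EP module's tasks, can be illustrated with the simplest possible shadow model: a cylindrical Earth shadow. A production event-prediction module would use conical umbra/penumbra geometry and precise ephemerides from a propagator; this sketch only shows the basic occultation test:

```python
import numpy as np

R_EARTH = 6378.137  # km, Earth's equatorial radius

def in_cylindrical_shadow(r_sc_km, sun_dir):
    """Crude eclipse test with a cylindrical Earth-shadow model.

    r_sc_km: spacecraft position in an Earth-centered frame (km).
    sun_dir: vector from Earth toward the Sun (any length).
    Returns True if the spacecraft is behind Earth, inside the
    shadow cylinder. Illustrative only; not the DSODS algorithm.
    """
    r = np.asarray(r_sc_km, float)
    s = np.asarray(sun_dir, float)
    s = s / np.linalg.norm(s)
    along = r @ s
    if along >= 0.0:              # on the sunlit side of Earth
        return False
    perp = np.linalg.norm(r - along * s)
    return bool(perp < R_EARTH)   # inside the shadow cylinder

# A low-Earth-orbit spacecraft directly behind Earth from the Sun:
eclipsed = in_cylindrical_shadow([-7000.0, 0.0, 0.0], [1.0, 0.0, 0.0])
# eclipsed -> True
```

Refining the eclipse entry/exit epochs to the sub-10-millisecond level reported in the abstract would require root-finding on this geometry along a propagated trajectory.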

  10. Improvements in Thermal Protection Sizing Capabilities for TCAT: Conceptual Design for Advanced Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Olds, John R.; Izon, Stephen James

    2002-01-01

    The Thermal Calculation Analysis Tool (TCAT), originally developed for the Space Systems Design Lab at the Georgia Institute of Technology, is a conceptual design tool capable of integrating aeroheating analysis into conceptual reusable launch vehicle design. It provides Thermal Protection System (TPS) unit thicknesses and acreage percentages based on the geometry of the vehicle and a reference trajectory, to be used in calculation of the total cost and weight of the vehicle design. TCAT has proven to be reasonably accurate at calculating the TPS unit weights for in-flight trajectories; however, it does not have the capability of sizing TPS materials above cryogenic fuel tanks for ground hold operations. During ground hold operations, the vehicle is held for a brief period (generally about two hours) during which heat transfer from the TPS materials to the cryogenic fuel occurs. If too much heat is extracted from the TPS material, the surface temperature may fall below the freezing point of water, thereby freezing any condensation that may be present at the surface of the TPS. Condensation or ice on the surface of the vehicle is potentially hazardous to the mission and can also damage the TPS. It is questionable whether the TPS thicknesses provided by the aeroheating analysis would be sufficiently thick to insulate the surface of the TPS from the heat transfer to the fuel. Therefore, a design tool has been developed that is capable of sizing TPS materials at these cryogenic fuel tank locations to augment TCAT's TPS sizing capabilities.
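    The ground-hold concern can be made concrete with a steady-state 1-D heat balance on the TPS outer surface: external convection into the surface must equal conduction through the slab into the cryogenic fuel. All material and boundary values below are illustrative stand-ins, not TCAT's models:

```python
def tps_surface_temp(t_amb, t_fuel, h, k, thickness):
    """Steady 1-D outer-surface temperature of TPS over a cryogenic tank.

    Balances convection h*(T_amb - T_s) against conduction
    (k/L)*(T_s - T_fuel) through the slab and solves for T_s.
    t_amb, t_fuel in K; h in W/m^2-K; k in W/m-K; thickness in m.
    Values are illustrative, not from the tool described above.
    """
    u_cond = k / thickness
    return (h * t_amb + u_cond * t_fuel) / (h + u_cond)

# A thin, aeroheating-sized TPS layer over liquid hydrogen (20 K)
# on a 300 K day with light wind (h = 10 W/m^2-K):
t_s = tps_surface_temp(300.0, 20.0, h=10.0, k=0.05, thickness=0.02)
frost_risk = t_s < 273.15  # surface below freezing -> condensation can freeze
# t_s -> 244.0 K, so this thickness would frost; a ground-hold sizing
# tool would iterate on thickness until t_s clears the freezing point.
```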

  11. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  12. Development of a fourth generation predictive capability maturity model.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel

    2013-09-01

    The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal to provide more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.

  13. NASA Automated Fiber Placement Capabilities: Similar Systems, Complementary Purposes

    NASA Technical Reports Server (NTRS)

    Wu, K. Chauncey; Jackson, Justin R.; Pelham, Larry I.; Stewart, Brian K.

    2015-01-01

    New automated fiber placement systems at the NASA Langley Research Center and NASA Marshall Space Flight Center provide state-of-art composites capabilities to these organizations. These systems support basic and applied research at Langley, complementing large-scale manufacturing and technology development at Marshall. These systems each consist of a multi-degree of freedom mobility platform including a commercial robot, a commercial tool changer mechanism, a bespoke automated fiber placement end effector, a linear track, and a rotational tool support structure. In addition, new end effectors with advanced capabilities may be either bought or developed with partners in industry and academia to extend the functionality of these systems. These systems will be used to build large and small composite parts in support of the ongoing NASA Composites for Exploration Upper Stage Project later this year.

  14. Using cloud-based mobile technology for assessment of competencies among medical students.

    PubMed

    Ferenchick, Gary S; Solomon, David

    2013-01-01

    Valid, direct observation of medical student competency in clinical settings remains challenging and limits the opportunity to promote performance-based student advancement. The rationale for direct observation is to ascertain that students have acquired the core clinical competencies needed to care for patients. Too often student observation results in highly variable evaluations which are skewed by factors other than the student's actual performance. Barriers to effective direct observation and assessment include the lack of effective tools and strategies for assuring that transparent standards are used for judging clinical competency in authentic clinical settings. We developed a web-based content management system under the name Just in Time Medicine (JIT) to address many of these issues. The goals of JIT were fourfold: first, to create a self-service interface allowing faculty with average computing skills to author customizable content and criterion-based assessment tools displayable on internet-enabled devices, including mobile devices; second, to create an assessment and feedback tool capable of capturing learner progress related to hundreds of clinical skills; third, to enable easy access and utilization of these tools by faculty for learner assessment in authentic clinical settings as a means of just-in-time faculty development; fourth, to create a permanent record of the trainees' observed skills useful for both learner and program evaluation. From July 2010 through October 2012, we implemented a JIT-enabled clinical evaluation exercise (CEX) among 367 third-year internal medicine students. Observers (attending physicians and residents) performed CEX assessments using JIT to guide and document their observations, record their time observing and providing feedback to the students, and their overall satisfaction. 
Inter-rater reliability and validity were assessed with 17 observers who viewed six videotaped student-patient encounters and by measuring the correlation between student CEX scores and their scores on subsequent standardized-patient OSCE exams. A total of 3567 CEXs were completed by 516 observers. The average number of evaluations per student was 9.7 (±1.8 SD) and the average number of CEXs completed per observer was 6.9 (±15.8 SD). Observers spent less than 10 minutes on 43-50% of the CEXs and on 68.6% of the feedback sessions. A majority of observers (92%) reported satisfaction with the CEX. Inter-rater reliability was measured at 0.69 among all observers viewing the videotapes, and these ratings adequately discriminated competent from non-competent performance. The measured CEX grades correlated with subsequent student performance on an end-of-year OSCE. We conclude that the use of JIT is feasible in capturing discrete clinical performance data with a high degree of user satisfaction. Our embedded checklists had adequate inter-rater reliability and concurrent and predictive validity.
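As a worked illustration of chance-corrected inter-rater agreement (the study reports a reliability of 0.69, likely from a different coefficient than the one below), Cohen's kappa for two raters making categorical pass/fail judgments can be computed like this; the rater data are invented:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items.

    A generic chance-corrected agreement statistic: observed agreement
    minus the agreement expected from each rater's marginal frequencies,
    rescaled to [.., 1]. Illustrative only; not the study's exact method.
    """
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail judgments from two observers on six encounters:
a = ["pass", "pass", "fail", "pass", "fail", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass"]
kappa = cohens_kappa(a, b)
# kappa -> 0.667: raters agree on 5 of 6 items, well above chance
```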

  15. Trans3D: a free tool for dynamical visualization of EEG activity transmission in the brain.

    PubMed

    Blinowski, Grzegorz; Kamiński, Maciej; Wawer, Dariusz

    2014-08-01

    The problem of functional connectivity in the brain is currently a focus of attention, since it is crucial for understanding information processing in the brain. A large repertoire of measures of connectivity has been devised, some of them capable of estimating time-varying directed connectivity. Hence, there is a need for a dedicated software tool for visualizing the propagation of electrical activity in the brain. To this aim, the Trans3D application was developed. It is an open-access tool based on widely available libraries and supporting Windows XP/Vista/7, Linux, and Mac environments. Trans3D can create animations of activity propagation between electrodes/sensors, which can be placed by the user on the scalp/cortex of a 3D model of the head. Various interactive graphic functions for manipulating and visualizing components of the 3D model and input data are available. An application of the Trans3D tool has helped to elucidate the dynamics of the phenomena of information processing in motor and cognitive tasks, which otherwise would have been very difficult to observe. Trans3D is available at: http://www.eeg.pl/. Copyright © 2014 Elsevier Ltd. All rights reserved.
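    One simple member of that repertoire of connectivity measures is a Granger-style directed influence estimate: if adding the past of signal x reduces the prediction error of signal y, then x "Granger-causes" y. This is a generic sketch of the idea, not the algorithm behind the (e.g., DTF/PDC-family) measures a tool like Trans3D visualizes:

```python
import numpy as np

def granger_gain(x, y, order=2):
    """Granger-style directed influence x -> y.

    Fits y[t] from its own past (restricted model), then from its own
    past plus the past of x (full model), and returns
    log(var_restricted / var_full): values well above 0 suggest the
    past of x helps predict y. Illustrative, not Trans3D's algorithm.
    """
    n = len(y)
    rows_r, rows_f, targets = [], [], []
    for t in range(order, n):
        ypast = y[t - order:t][::-1]
        xpast = x[t - order:t][::-1]
        rows_r.append(ypast)
        rows_f.append(np.concatenate([ypast, xpast]))
        targets.append(y[t])
    tgt = np.array(targets)
    beta_r = np.linalg.lstsq(np.array(rows_r), tgt, rcond=None)[0]
    beta_f = np.linalg.lstsq(np.array(rows_f), tgt, rcond=None)[0]
    res_r = tgt - np.array(rows_r) @ beta_r
    res_f = tgt - np.array(rows_f) @ beta_f
    return float(np.log(res_r.var() / res_f.var()))

# Synthetic pair where y is driven by the past of x:
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
# granger_gain(x, y) is large; granger_gain(y, x) stays near zero,
# recovering the x -> y direction of information flow.
```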

  16. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis has been performed for space missions that used nuclear heater or power units, in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative for assessing possible environmental impacts, obtaining launch approval, and planning launch contingencies. To accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify additional dynamics modeling and capabilities for the analysis tool, with the objectives of (1) predicting the impact point and footprint, (2) increasing the fidelity of the vehicle breakup prediction, and (3) reducing the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of the impact footprint. The functions to increase the fidelity of the vehicle breakup prediction included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrarily shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. 
The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.
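The 3-degrees-of-freedom propagation to impact can be sketched with a flat-Earth point-mass model under gravity and drag in an exponential atmosphere. The constants, the atmosphere model, and the forward-Euler integration below are illustrative stand-ins for the dissertation's higher-fidelity modeling (which adds winds, panel-code aerodynamics, and breakup logic):

```python
import math

def propagate_impact(v0, gamma0_deg, h0, beta, dt=0.05):
    """Flat-Earth 3-DOF (point-mass) ballistic propagation to impact.

    v0: entry speed (m/s); gamma0_deg: flight-path angle below the
    horizontal (deg); h0: initial altitude (m); beta: ballistic
    coefficient m/(Cd*A) in kg/m^2. Exponential atmosphere with a
    7200 m scale height. Returns downrange impact distance (m).
    Illustrative only; not the dissertation's propagator.
    """
    g = 9.81
    gamma = math.radians(gamma0_deg)
    x, h = 0.0, h0
    vx, vh = v0 * math.cos(gamma), -v0 * math.sin(gamma)
    while h > 0.0:
        v = math.hypot(vx, vh)
        rho = 1.225 * math.exp(-h / 7200.0)
        drag = 0.5 * rho * v * v / beta   # drag deceleration magnitude
        ax = -drag * vx / v               # drag opposes the velocity vector
        ah = -g - drag * vh / v
        x += vx * dt
        h += vh * dt
        vx += ax * dt
        vh += ah * dt
    return x

# Debris with a higher ballistic coefficient (less drag per unit mass)
# flies farther downrange, which is why the impact footprint of the
# nuclear component depends on the predicted breakup state:
x_heavy = propagate_impact(3000.0, 10.0, 60000.0, beta=500.0)
x_light = propagate_impact(3000.0, 10.0, 60000.0, beta=50.0)
```

A footprint calculation would repeat this propagation over dispersions in entry state and ballistic coefficient and take the envelope of the impact points.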

  17. Real Time Metrics and Analysis of Integrated Arrival, Departure, and Surface Operations

    NASA Technical Reports Server (NTRS)

    Sharma, Shivanjli; Fergus, John

    2017-01-01

    To address the Integrated Arrival, Departure, and Surface (IADS) challenge, NASA is developing and demonstrating trajectory-based departure automation under a collaborative effort with the FAA and industry known as Airspace Technology Demonstration 2 (ATD-2). ATD-2 builds upon and integrates previous NASA research capabilities that include the Spot and Runway Departure Advisor (SARDA), the Precision Departure Release Capability (PDRC), and the Terminal Sequencing and Spacing (TSAS) capability. As trajectory-based departure scheduling and collaborative decision making tools are introduced in order to reduce delays and uncertainties in taxi and climb operations across the National Airspace System, users of the tools across a number of roles benefit from a real time system that enables common situational awareness. A real time dashboard was developed to inform and present users notifications and integrated information regarding airport surface operations. The dashboard is a supplement to capabilities and tools that incorporate arrival, departure, and surface air-traffic operations concepts in a NextGen environment. In addition to shared situational awareness, the dashboard offers the ability to compute real time metrics and analysis to inform users about capacity, predictability, and efficiency of the system as a whole. This paper describes the architecture of the real time dashboard as well as an initial proposed set of metrics. The potential impact of the real time dashboard is studied at the site identified for initial deployment and demonstration in 2017: Charlotte-Douglas International Airport (CLT). The architecture of implementing such a tool as well as potential uses are presented for operations at CLT. Metrics computed in real time illustrate the opportunity to provide common situational awareness and inform users of system delay, throughput, taxi time, and airport capacity. 
In addition, common awareness of delays and the impact of takeoff and departure restrictions stemming from traffic flow management initiatives are explored. The potential of the real time tool to inform users of the predictability and efficiency of using a trajectory-based departure scheduling system is also discussed.
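Metrics like average taxi-out time and departure throughput can be computed from simple surface event records. The record format below (pushback and takeoff timestamps per departure) is a hypothetical stand-in, not the ATD-2 data schema:

```python
from datetime import datetime, timedelta

def surface_metrics(events):
    """Average taxi-out time (min) and departure throughput (per hour).

    events: list of (pushback_time, takeoff_time) datetime pairs, one
    per departure. Illustrative only; the ATD-2 dashboard computes a
    much richer metric set over live data feeds.
    """
    taxi = [(off - out).total_seconds() / 60.0 for out, off in events]
    avg_taxi_min = sum(taxi) / len(taxi)
    span_hr = (max(off for _, off in events) -
               min(out for out, _ in events)).total_seconds() / 3600.0
    throughput_per_hr = len(events) / span_hr
    return avg_taxi_min, throughput_per_hr

# Three hypothetical departures over a 36-minute window:
t0 = datetime(2017, 5, 1, 14, 0)
events = [(t0, t0 + timedelta(minutes=18)),
          (t0 + timedelta(minutes=6), t0 + timedelta(minutes=24)),
          (t0 + timedelta(minutes=12), t0 + timedelta(minutes=36))]
avg_taxi, rate = surface_metrics(events)
# avg_taxi -> 20.0 minutes; rate -> 5.0 departures per hour
```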

  18. Drum ring removal/installation tool

    DOEpatents

    Andrade, William Andrew [Livermore, CA

    2006-11-14

    A handheld tool, or a pair of such tools, such as for use in removing/installing a bolt-type clamping ring on a container barrel/drum, where the clamping ring has a pair of clamping ends each with a throughbore. Each tool has an elongated handle and an elongated lever arm transversely connected to one end of the handle. The lever arm is capable of being inserted into the throughbore of a selected clamping end and leveraged with the handle to exert a first moment on the selected clamping end. Each tool also has a second lever arm, such as a socket with an open-ended slot, which is suspended alongside the first lever arm. The second lever arm is capable of engaging the selected clamping end and being leveraged with the handle to exert a second moment which is orthogonal to the first moment. In this manner, the first and second moments operate to hold the selected clamping end fixed relative to the tool so that the selected clamping end may be controlled with the handle. The pair of clamping ends may also be simultaneously and independently controlled with the use of two handles/tools so as to contort the geometry of the drum clamping ring and enable its removal/installation.

  19. Loads produced by a suited subject performing tool tasks without the use of foot restraints

    NASA Technical Reports Server (NTRS)

    Rajulu, Sudhakar L.; Poliner, Jeffrey; Klute, Glenn K.

    1993-01-01

    With an increase in the frequency of extravehicular activities (EVA's) aboard the Space Shuttle, NASA is interested in determining the capabilities of suited astronauts while performing manual tasks during an EVA, in particular the situations in which portable foot restraints are not used to stabilize the astronauts. Efforts were made to document the forces that are transmitted to spacecraft while pushing and pulling an object as well as while operating a standard wrench and an automatic power tool. The six subjects studied aboard the KC-135 reduced gravity aircraft were asked to exert a maximum torque and to maintain a constant level of torque with a wrench, to push and pull an EVA handrail, and to operate a Hubble Space Telescope (HST) power tool. The results give an estimate of the forces and moments that an operator will transmit to the handrail as well as to the supporting structure. In general, it was more effective to use the tool inwardly toward the body rather than away from the body. There were no differences in terms of strength capabilities between right and left hands. The power tool was difficult to use. It is suggested that ergonomic redesigning of the power tool may increase the efficiency of power tool use.

  20. Building Learning Modules for Undergraduate Education Using LEAD Technology

    NASA Astrophysics Data System (ADS)

    Clark, R. D.; Yalda, S.

    2006-12-01

    Linked Environments for Atmospheric Discovery (LEAD) has as its goal to make meteorological data, forecast models, and analysis and visualization tools available to anyone who wants to interactively explore the weather as it evolves. LEAD advances through the development and beta-deployment of Integrated Test Beds (ITBs), which are technology build-outs that are the fruition of collaborative IT and meteorological research. As the ITBs mature, opportunities emerge for the integration of this new technological capability into the education arena. The LEAD education and outreach initiative is aimed at bringing new capabilities into the classroom from the middle school level to graduate education and beyond, and ensuring the congruency of this technology with curricula. One of the principal goals of LEAD is to democratize the availability of advanced weather technologies for research and education. The degree of democratization is tied to the growth of student knowledge and skills, and is correlated with education level (though not for every student in the same way). The average high school student may experience LEAD through an environment that retains a higher level of instructor control compared to the undergraduate and graduate student. This is necessary to accommodate not only differences in knowledge and skills, but also the computing capabilities in the classroom, so that the "teachable moment" is not lost. Undergraduates will have the opportunity to query observation data and model output, explore and discover relationships through concept mapping using an ontology service, select domains of interest based on current weather, and employ an experiment builder within the LEAD portal as an interface to configure, launch the WRF model, monitor the workflow, and visualize results using Unidata's Integrated Data Viewer (IDV), whether it be on a local server or across the TeraGrid. 
Such a robust and comprehensive suite of tools and services can create new paradigms for embedding students in an authentic, contextualized environment where the knowledge domain is an extension of, yet integral supplement to, the classroom experience. This presentation describes two different approaches for the use of LEAD in undergraduate education: 1) a use-case for integrating LEAD technology into undergraduate subject material; and 2) making LEAD capability available to a select group of students participating in the National Collegiate Forecasting Contest (NCFC). The use-case (1) is designed to have students explore a particular weather phenomenon (e.g., a frontal boundary, jet streak, or lake effect snow event) through self-guided inquiry, and is intended as a supplement to classroom instruction. Students will use interactive, Web-based, LEAD-to-Learn modules created specifically to build conceptual knowledge of the phenomenon, adjoin germane terminology, explore relationships between concepts and similar phenomena using the LEAD ontology, and guide them through the experiment builder and workflow orchestration process in order to establish a high-resolution WRF run over a region that exhibits the characteristics of the phenomenon they wish to study. The results of the experiment will be stored in the student's MyLEAD workspace from which it can be retrieved, visualized, and analyzed for atmospheric signatures characteristic of the phenomenon. The learning process is authentic in that students will be exposed to the same process of investigation, and will have available many of the same tools, as researchers. The modules serve to build content knowledge, guide discovery, and provide assessment while the LEAD portal opens the gateway to real-time observations, model accessibility, and a variety of tools, services, and resources.

  1. School Building Design: The Building as an Instructional Tool.

    ERIC Educational Resources Information Center

    Rakestraw, William E.

    1979-01-01

    Concepts used in the design of a Dallas school make the building an integral part of the instructional program. These concepts include instrumented resource consumption, wind powered electrical generating capabilities, solar powered domestic hot water system, grey water cycling and sampling capabilities, and mechanical systems monitoring.…

  2. Device Performance | Photovoltaic Research | NREL

    Science.gov Websites

    NREL's Device Performance group provides PV calibration services. Capabilities for measuring key performance parameters of solar cells and modules include the use of various solar simulators and tools to measure current-voltage characteristics.

  3. 75 FR 66116 - Agency Information Collection Activities: Proposed Collection; Comment Request, OMB No. 1660-NEW...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ... logistics readiness, identify areas for targeted improvement, and develop a roadmap to both mitigate...; Logistics Capability Assessment Tool (LCAT) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice... Paperwork Reduction Act of 1995, this Notice seeks comments concerning the Logistics Capability Assessment...

  4. Helping Resource Managers Understand Hydroclimatic Variability and Forecasts: A Case Study in Research Equity

    NASA Astrophysics Data System (ADS)

    Hartmann, H. C.; Pagano, T. C.; Sorooshian, S.; Bales, R.

    2002-12-01

    Expectations for hydroclimatic research are evolving as changes in the contract between science and society require researchers to provide "usable science" that can improve resource management policies and practices. However, decision makers have a broad range of abilities to access, interpret, and apply scientific research. "High-end users" have the technical capabilities and operational flexibility to readily exploit new information and products. "Low-end users" have fewer resources and are less likely to change their decision making processes without clear demonstration of benefits by influential early adopters (i.e., high-end users). Should research programs aim for efficiency, targeting high-end users? Should they aim for impact, targeting decisions with high economic value or great influence (e.g., state or national agencies)? Or should they focus on equity, whereby outcomes benefit groups across a range of capabilities? In this case study, we focus on hydroclimatic variability and forecasts. Agencies and individuals responsible for resource management decisions have varying perspectives about hydroclimatic variability and opportunities for using forecasts to improve decision outcomes. Improper interpretation of forecasts is widespread and many individuals find it difficult to place forecasts in an appropriate regional historical context. In addressing these issues, we attempted to mitigate traditional inequities in the scope, communication, and accessibility of hydroclimatic research results. High-end users were important in prioritizing information needs, while low-end users were important in determining how information should be communicated. For example, high-end users expressed hesitancy to use seasonal forecasts in the absence of quantitative performance evaluations. 
Our subsequently developed forecast evaluation framework and research products, however, were guided by the need for a continuum of evaluation measures and interpretive materials to enable low-end users to increase their understanding of probabilistic forecasts, credibility concepts, and implications for decision making. We also developed an interactive forecast assessment tool accessible over the Internet, to support resource decisions by individuals as well as agencies. The tool provides tutorials for guiding forecast interpretation, including quizzes that allow users to test their forecast interpretation skills. Users can monitor recent and historical observations for selected regions, communicated using terminology consistent with available forecast products. The tool also allows users to evaluate forecast performance for the regions, seasons, forecast lead times, and performance criteria relevant to their specific decision making situations. Using consistent product formats, the evaluation component allows individuals to use results at the level they are capable of understanding, while offering opportunity to shift to more sophisticated criteria. Recognizing that many individuals lack Internet access, the forecast assessment webtool design also includes capabilities for customized report generation so extension agents or other trusted information intermediaries can provide material to decision makers at meetings or site visits.

  5. From Analysis to Impact: Challenges and Outcomes from Google's Cloud-based Platforms for Analyzing and Leveraging Petapixels of Geospatial Data

    NASA Astrophysics Data System (ADS)

    Thau, D.

    2017-12-01

For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze it, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]

  6. Afghan National Engineer Brigade: Despite U.S. Training Efforts, the Brigade is Incapable of Operating Independently

    DTIC Science & Technology

    2016-01-01

    carpentry, masonry, and the operation of heavy equipment. Plans called for the NEB to receive at least $29 million in engineering equipment and ... JTF Sapper, NMCB 25, and NMCB 28, had responsibility for training the NEB in such areas as plumbing, electrical work, carpentry, masonry, and ... measurement tool consisted of five possible ratings: fully capable, capable, partially capable, developing, and established. USFOR-A used these

  7. Qualitative Discovery in Medical Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A.

    2000-01-01

    Implication rules have been used in uncertainty reasoning systems to confirm and draw hypotheses or conclusions. However, a major bottleneck in developing such systems lies in the elicitation of these rules. This paper empirically examines the performance of evidential inferencing with implication networks generated using a rule induction tool called KAT. KAT utilizes an algorithm for the statistical analysis of empirical case data, and hence reduces the knowledge engineering effort and the biases in subjective implication certainty assignment. The paper describes several experiments in which real-world diagnostic problems were investigated; namely, medical diagnostics. In particular, it attempts to show that: (1) with a limited number of case samples, KAT is capable of inducing implication networks useful for making evidential inferences based on partial observations, and (2) observation driven by a network entropy optimization mechanism is effective in reducing the uncertainty of predicted events.
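    The entropy-driven observation idea can be illustrated with a deliberately simplified model. KAT's actual mechanism operates on implication networks, which the abstract does not detail; the sketch below instead assumes independent binary findings and selects the finding whose current belief carries the most Shannon entropy, i.e., whose observation removes the most uncertainty. Names and probabilities are hypothetical.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a binary event with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def next_observation(beliefs):
    """Pick the unobserved finding whose current belief is most uncertain.

    beliefs -- dict mapping finding name -> current probability estimate
    """
    return max(beliefs, key=lambda name: entropy(beliefs[name]))

beliefs = {"fever": 0.9, "rash": 0.5, "cough": 0.2}
print(next_observation(beliefs))  # "rash": p = 0.5 is maximally uncertain
```

    In the full network setting, observing one finding also updates the beliefs of the events it implies, so the criterion is applied to the entropy of the network as a whole rather than to each finding in isolation.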

  8. Structure and dynamics of mesophilic variants from the homing endonuclease I-DmoI

    NASA Astrophysics Data System (ADS)

    Alba, Josephine; Marcaida, Maria Jose; Prieto, Jesus; Montoya, Guillermo; Molina, Rafael; D'Abramo, Marco

    2017-12-01

    I-DmoI, from the hyperthermophilic archaeon Desulfurococcus mobilis, belongs to the LAGLIDADG homing endonuclease protein family. Its members are highly specific enzymes capable of recognizing long DNA target sequences, thus providing potential tools for genome manipulation. Working towards this particular application, many efforts have been made to generate mesophilic variants of I-DmoI that function at lower temperatures than the wild-type. Here, we report a structural and computational analysis of two I-DmoI mesophilic mutants. Despite very limited structural variations between the crystal structures of these variants and the wild-type, a different dynamical behaviour near the cleavage sites is observed. In particular, both the dynamics of the water molecules and the protein perturbation effect on the cleavage site correlate well with the changes observed in the experimental enzymatic activity.

  9. Lightning Imaging Sensor (LIS) on the International Space Station (ISS): Launch, Installation, Activation and First Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, R. J.; Christian, H. J.; Mach, D. M.; Buechler, D. E.; Wharton, N. A.; Stewart, M. F.; Ellett, W. T.; Koshak, W. J.; Walker, T. D.; Virts, K.; hide

    2017-01-01

    Mission: Fly a flight-spare LIS (Lightning Imaging Sensor) on the ISS to take advantage of unique capabilities provided by the ISS (e.g., high inclination, real-time data); integrate LIS as a hosted payload on the DoD Space Test Program-Houston 5 (STP-H5) mission and launch on a SpaceX rocket for a minimum two-year mission. Measurement: NASA and its partners developed and demonstrated the effectiveness and value of using space-based lightning observations as a remote sensing tool; LIS measures lightning (amount, rate, radiant energy) with storm-scale resolution, millisecond timing, and high detection efficiency, with no land-ocean bias. Benefit: LIS on the ISS will extend the TRMM (Tropical Rainfall Measuring Mission) time series observations, expand latitudinal coverage, provide real-time data to operational users, and enable cross-sensor calibration.

  10. NASA'S Water Resources Element Within the Applied Sciences Program

    NASA Technical Reports Server (NTRS)

    Toll, David; Doorn, Bradley; Engman, Edwin

    2011-01-01

    The NASA Earth Systems Division has the primary responsibility for the Applied Science Program and the objective to accelerate the use of NASA science results in applications to help solve problems important to society and the economy. The primary goal of the NASA Applied Science Program is to improve future and current operational systems by infusing them with scientific knowledge of the Earth system gained through space-based observation, assimilation of new observations, and development and deployment of enabling technologies, systems, and capabilities. This paper discusses major problems facing water resources managers, including having timely and accurate data to drive their decision support tools. It then describes how NASA's science and space-based satellites may be used to overcome this problem. Opportunities for the water resources community to participate in NASA's Water Resources Applications Program are described.

  11. Towards improved and more routine Earth system model evaluation in CMIP

    DOE PAGES

    Eyring, Veronika; Gleckler, Peter J.; Heinze, Christoph; ...

    2016-11-01

    The Coupled Model Intercomparison Project (CMIP) has successfully provided the climate community with a rich collection of simulation output from Earth system models (ESMs) that can be used to understand past climate changes and make projections and uncertainty estimates of the future. Confidence in ESMs can be gained because the models are based on physical principles and reproduce many important aspects of observed climate. More research is required to identify the processes that are most responsible for systematic biases and the magnitude and uncertainty of future projections so that more relevant performance tests can be developed. At the same time, there are many aspects of ESM evaluation that are well established and considered an essential part of systematic evaluation but have been implemented ad hoc with little community coordination. Given the diversity and complexity of ESM analysis, we argue that the CMIP community has reached a critical juncture at which many baseline aspects of model evaluation need to be performed much more efficiently and consistently. We provide a perspective and viewpoint on how a more systematic, open, and rapid performance assessment of the large and diverse number of models that will participate in current and future phases of CMIP can be achieved, and announce our intention to implement such a system for CMIP6. Accomplishing this could also free up valuable resources as many scientists are frequently "re-inventing the wheel" by re-writing analysis routines for well-established analysis methods. A more systematic approach for the community would be to develop and apply evaluation tools that are based on the latest scientific knowledge and observational reference, are well suited for routine use, and provide a wide range of diagnostics and performance metrics that comprehensively characterize model behaviour as soon as the output is published to the Earth System Grid Federation (ESGF). 
The CMIP infrastructure enforces data standards and conventions for model output and documentation accessible via the ESGF, additionally publishing observations (obs4MIPs) and reanalyses (ana4MIPs) for model intercomparison projects using the same data structure and organization as the ESM output. This largely facilitates routine evaluation of the ESMs, but to be able to process the data automatically alongside the ESGF, the infrastructure needs to be extended with processing capabilities at the ESGF data nodes where the evaluation tools can be executed on a routine basis. Efforts are already underway to develop community-based evaluation tools, and we encourage experts to provide additional diagnostic codes that would enhance this capability for CMIP. And, at the same time, we encourage the community to contribute observations and reanalyses for model evaluation to the obs4MIPs and ana4MIPs archives. The intention is to produce through the ESGF a widely accepted quasi-operational evaluation framework for CMIP6 that would routinely execute a series of standardized evaluation tasks. Over time, as this capability matures, we expect to produce an increasingly systematic characterization of models which, compared with early phases of CMIP, will more quickly and openly identify the strengths and weaknesses of the simulations. This will also reveal whether long-standing model errors remain evident in newer models and will assist modelling groups in improving their models. Finally, this framework will be designed to readily incorporate updates, including new observations and additional diagnostics and metrics as they become available from the research community.

  12. Inter-agency communication and operations capabilities during a hospital functional exercise: reliability and validity of a measurement tool.

    PubMed

    Savoia, Elena; Biddinger, Paul D; Burstein, Jon; Stoto, Michael A

    2010-01-01

    As proxies for actual emergencies, drills and exercises can raise awareness, stimulate improvements in planning and training, and provide an opportunity to examine how different components of the public health system would combine to respond to a challenge. Despite these benefits, there remains a substantial need for widely accepted and prospectively validated tools to evaluate agencies' and hospitals' performance during such events. Unfortunately, to date, few studies have focused on addressing this need. The purpose of this study was to assess the validity and reliability of a qualitative performance assessment tool designed to measure hospitals' communication and operational capabilities during a functional exercise. The study population included 154 hospital personnel representing nine hospitals that participated in a functional exercise in Massachusetts in June 2008. A 25-item questionnaire was developed to assess the following three hospital functional capabilities: (1) inter-agency communication; (2) communication with the public; and (3) disaster operations. Analyses were conducted to examine internal consistency, associations among scales, the empirical structure of the items, and inter-rater agreement. Twenty-two questions were retained in the final instrument, which demonstrated reliability with alpha coefficients of 0.83 or higher for all scales. A three-factor solution from the principal components analysis accounted for 57% of the total variance, and the factor structure was consistent with the original hypothesized domains. Inter-rater agreement between participants' self-reported scores and external evaluators' scores ranged from moderate to good. The resulting 22-item performance measurement tool reliably measured hospital capabilities in a functional exercise setting, with preliminary evidence of concurrent and criterion-related validity.
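    The reliability statistic reported here, Cronbach's alpha, can be computed directly from an item-by-respondent score matrix. The sketch below uses an invented 3-item, 5-respondent matrix purely for illustration, not the study's 22-item data.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: list of item-score lists, one inner list per questionnaire item."""
    k = len(items)
    n = len(items[0])
    item_vars = sum(pvariance(item) for item in items)          # sum of item variances
    totals = [sum(item[i] for item in items) for i in range(n)]  # per-respondent totals
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Three items answered by five respondents on a 1-5 scale (hypothetical data).
responses = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 1],
]
print(round(cronbach_alpha(responses), 3))  # 0.922: internally consistent scale
```

    Values of 0.83 or higher, as reported for all scales in the study, are conventionally read as good internal consistency for a multi-item scale.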

  13. Current Capabilities and Planned Enhancements of SUSTAIN - Paper

    EPA Science Inventory

    Efforts have been under way by the U.S. Environmental Protection Agency (EPA) since 2003 to develop a decision-support tool for placement of best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment ...

  14. Evaluation and demonstration of commercialization potential of CCSI tools within gPROMS advanced simulation platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawal, Adekola; Schmal, Pieter; Ramos, Alfredo

    PSE, in the first phase of the CCSI commercialization project, set out to identify market opportunities for the CCSI tools combined with existing gPROMS platform capabilities and develop a clear technical plan for the proposed commercialization activities.

  15. Hobel stripper for shielded and unshielded flat conductor cable

    NASA Technical Reports Server (NTRS)

    Angele, W.

    1971-01-01

    Stripping tool exposes an area of shield for grounding purposes without removing an area of insulation between terminated shield and exposed conductors. Tool does not require heated blade and is capable of removing small portions of material at a time, to any depth.

  16. Web-based applications for building, managing and analysing kinetic models of biological systems.

    PubMed

    Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A

    2009-01-01

    Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
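    At their core, the surveyed web tools integrate systems of kinetic rate equations. As a minimal, hedged example (rate constants invented; a fixed-step Euler scheme stands in for the stiff ODE solvers production tools actually use), a single Michaelis-Menten reaction S -> P can be simulated as:

```python
def simulate_mm(s0, vmax, km, dt=0.01, t_end=10.0):
    """Integrate dS/dt = -Vmax*S/(Km+S) with explicit Euler; return final S."""
    s = s0
    for _ in range(int(t_end / dt)):
        rate = vmax * s / (km + s)   # Michaelis-Menten rate law
        s -= rate * dt
    return s

s_final = simulate_mm(s0=10.0, vmax=1.0, km=0.5)
print(s_final)  # substrate is consumed toward zero over the run
```

    Kinetic modelling platforms wrap exactly this kind of integration behind a form-based interface, letting non-experts vary parameters such as Vmax and Km and observe the predicted cellular behaviour under perturbed conditions.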

  17. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
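    The Bouguer reduction mentioned above follows a standard formula: subtract the attraction of an infinite slab of crustal material, 2*pi*G*rho*h, from the free-air value. The authors' actual processing chain is not described in this abstract, so the following is only a textbook-style sketch with a conventional reduction density.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_CRUST = 2670.0     # conventional reduction density, kg/m^3

def bouguer_anomaly(free_air_mgal, elevation_m, rho=RHO_CRUST):
    """Subtract the infinite-slab attraction, converted to mGal."""
    slab_si = 2.0 * math.pi * G * rho * elevation_m   # m/s^2
    return free_air_mgal - slab_si * 1e5              # 1 m/s^2 = 1e5 mGal

# The slab term is ~0.1120 mGal per metre of elevation at 2670 kg/m^3.
print(bouguer_anomaly(50.0, 1000.0))
```

    Real reductions add terrain corrections on top of the slab term, but the slab formula is the dominant contribution at the regional scales mapped here.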

  18. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.

  19. Cassini RADAR observations of lakes and seas in the Northern Polar region of Titan: Bathymetry and Composition

    NASA Astrophysics Data System (ADS)

    Mastrogiuseppe, Marco; Hayes, Alex; Poggiali, Valerio; Lunine, Jonathan; Seu, Roberto; Hofgartner, Jason; Le Gall, Alice; Lorenz, Ralph; Mitri, Giuseppe

    2017-04-01

    Recent observations by the Cassini spacecraft have revealed its RADAR to be an invaluable tool for investigating Titan's seas and lakes. The T91 (May 2013) observation of Ligeia Mare, Titan's second largest sea, demonstrated the capability of the RADAR, in its altimeter mode, to measure depth, composition, and seafloor topography. The T104 (August 2014) observation provided similar data over the largest sea, Kraken Mare, and the T108 (January 2015) flyby acquired an altimetry pass over Punga Mare. The T49 (December 2007) altimetry pass over Ontario Lacus, the largest southern liquid body, has also been processed to retrieve subsurface echoes. Cassini's final flyby of Titan, T126 (April 2017), is a unique opportunity to observe an area in the northern polar region of Titan where several small to medium size (5-30 km) lakes are present and have been previously imaged by Cassini. In our presentation, we will report the integrated results of these investigations and discuss them in the overall context of Titan's hydrologic cycle.

  20. Ionospheric Simulation System for Satellite Observations and Global Assimilative Model Experiments - ISOGAME

    NASA Technical Reports Server (NTRS)

    Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga; Stephens, Philip; Iijima, Bryron A.

    2013-01-01

    Modeling and imaging the Earth's ionosphere as well as understanding its structures, inhomogeneities, and disturbances is a key part of NASA's Heliophysics Directorate science roadmap. This invention provides a design tool for scientific missions focused on the ionosphere. It is a scientifically important and technologically challenging task to assess the impact of a new observation system quantitatively on our capability of imaging and modeling the ionosphere. This question is often raised whenever a new satellite system is proposed, a new type of data is emerging, or a new modeling technique is developed. The proposed constellation would be part of a new observation system with more low-Earth orbiters tracking more radio occultation signals broadcast by Global Navigation Satellite System (GNSS) than those offered by the current GPS and COSMIC observation system. A simulation system was developed to fulfill this task. The system is composed of a suite of software that combines the Global Assimilative Ionospheric Model (GAIM) including first-principles and empirical ionospheric models, a multiple- dipole geomagnetic field model, data assimilation modules, observation simulator, visualization software, and orbit design, simulation, and optimization software.

  1. Science Opportunity Analyzer (SOA) Version 8

    NASA Technical Reports Server (NTRS)

    Witoff, Robert J.; Polanskey, Carol A.; Aguinaldo, Anna Marie A.; Liu, Ning; Hofstadter, Mark D.

    2013-01-01

    SOA allows scientists to plan spacecraft observations. It facilitates the identification of geometrically interesting times in a spacecraft's orbit that a user can use to plan observations or instrument-driven spacecraft maneuvers. These observations can then be visualized multiple ways in both two- and three-dimensional views. When observations have been optimized within a spacecraft's flight rules, the resulting plans can be output for use by other JPL uplink tools. Now in its eighth major version, SOA improves on these capabilities in a modern and integrated fashion. SOA consists of five major functions: Opportunity Search, Visualization, Observation Design, Constraint Checking, and Data Output. Opportunity Search is a GUI-driven interface to existing search engines that can be used to identify times when a spacecraft is in a specific geometrical relationship with other bodies in the solar system. This function can be used for advanced mission planning as well as for making last-minute adjustments to mission sequences in response to trajectory modifications. Visualization is a key aspect of SOA. The user can view observation opportunities in either a 3D representation or as a 2D map projection. Observation Design allows the user to orient the spacecraft and visualize the projection of the instrument field of view for that orientation using the same views as Opportunity Search. Constraint Checking is provided to validate various geometrical and physical aspects of an observation design. The user has the ability to easily create custom rules or to use official project-generated flight rules. This capability may also allow scientists to easily assess the cost to science if flight rule changes occur. Data Output allows the user to compute ancillary data related to an observation or to a given position of the spacecraft along its trajectory. The data can be saved as a tab-delimited text file or viewed as a graph. 
SOA combines science planning functionality unique to JPL and the sponsoring spacecraft. SOA ingests the JPL SPICE kernels that drive the tool and its computations. A Percy search engine is included to identify interesting time periods for the user to build observations. When observations are built, flight-like orientation algorithms closely replicate the dynamics of the flight spacecraft. SOA v8 represents a large step forward from SOA v7 in terms of quality, reliability, maintainability, efficiency, and user experience. A tailored agile development environment has been built around SOA that provides automated unit testing, continuous build and integration, a consolidated Web-based code and documentation storage environment, modern Java enhancements, and a focus on usability.

  2. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.; Rutan, D. A.

    2016-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, and cloud, aerosol, and other atmospheric parameters. These products encompass a wide range of temporal and spatial resolutions suited to specific applications. Now in its 16th year, the CERES record is mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions to be used in the process of CERES data quality control (QC). These include 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and others that made the QC process far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further facilitate the CERES project's QC of the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the basic CERES OVT functions built on open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
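    Deseasonalization, one of the QC functions listed, is conventionally computed by subtracting each calendar month's climatological mean from a monthly series to leave anomalies. The OVT's internal implementation is not described here, so the following is only an illustrative sketch:

```python
def deseasonalize(series):
    """series: list of monthly values, index 0 = January of year one."""
    climatology = []
    for month in range(12):
        vals = series[month::12]                  # all Januaries, all Februaries, ...
        climatology.append(sum(vals) / len(vals))  # monthly climatological mean
    return [v - climatology[i % 12] for i, v in enumerate(series)]

# Two identical years of a purely seasonal signal deseasonalize to all zeros.
two_years = [10, 12, 15, 18, 22, 25, 27, 26, 22, 18, 14, 11] * 2
print(max(abs(a) for a in deseasonalize(two_years)))  # 0.0
```

    Removing the seasonal cycle this way is what makes anomaly and trend plots comparable across months, which matters for the climate-trend use cases the product serves.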

  3. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. Design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes, the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem into sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using: (i) the linear extended interior penalty function method algorithm; and (ii) Powell's conjugate directions method. Both single and multi-objective numerical test cases are included demonstrating the design tool's capabilities as it applies to this design problem.
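    The sequential-unconstrained-minimization idea can be sketched on a toy problem. The report's solver pairs an extended interior penalty function with Powell's conjugate directions method; the hedged stand-in below instead uses a plain exterior quadratic penalty and a golden-section line search, applied to: minimize x^2 subject to x >= 1.

```python
def minimize_1d(phi, lo=-10.0, hi=10.0, iters=200):
    """Crude golden-section search for a unimodal 1-D function."""
    g = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if phi(c) < phi(d):
            b = d
        else:
            a = c
    return (a + b) / 2

def penalty_solve(r_values=(1.0, 10.0, 100.0, 1000.0)):
    """Solve a sequence of unconstrained subproblems with growing penalty r."""
    x = None
    for r in r_values:
        # Penalized objective: x^2 plus a quadratic charge for violating x >= 1.
        phi = lambda x, r=r: x ** 2 + r * max(0.0, 1.0 - x) ** 2
        x = minimize_1d(phi)
    return x

print(penalty_solve())  # approaches the constrained optimum x = 1
```

    Each unconstrained solve lands at x = r/(1+r), so raising the penalty weight drives the iterates toward the constraint boundary, which is the essence of the sequential approach the report describes.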

  4. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  5. Local Debonding and Fiber Breakage in Composite Materials Modeled Accurately

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2001-01-01

    A prerequisite for full utilization of composite materials in aerospace components is accurate design and life prediction tools that enable the assessment of component performance and reliability. Such tools assist both structural analysts, who design and optimize structures composed of composite materials, and materials scientists who design and optimize the composite materials themselves. NASA Glenn Research Center's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package (http://www.grc.nasa.gov/WWW/LPB/mac) addresses this need for composite design and life prediction tools by providing a widely applicable and accurate approach to modeling composite materials. Furthermore, MAC/GMC serves as a platform for incorporating new local models and capabilities that are under development at NASA, thus enabling these new capabilities to progress rapidly to a stage in which they can be employed by the code's end users.

  6. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated into the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  7. Toward an objective assessment of technical skills: a national survey of surgical program directors in Saudi Arabia.

    PubMed

    Alkhayal, Abdullah; Aldhukair, Shahla; Alselaim, Nahar; Aldekhayel, Salah; Alhabdan, Sultan; Altaweel, Waleed; Magzoub, Mohi Elden; Zamakhshary, Mohammed

    2012-01-01

    After almost a decade of implementing competency-based programs in postgraduate training programs, the assessment of technical skills remains more subjective than objective. National data on the assessment of technical skills during surgical training are lacking. We conducted this study to document the assessment tools for technical skills currently used in different surgical specialties, their relationship with remediation, the recommended tools from the program directors' perspective, and program directors' attitudes toward the available objective tools to assess technical skills. This study was a cross-sectional survey of surgical program directors (PDs). The survey was initially developed using a focus group and was then sent to 116 PDs. The survey contains demographic information about the program, the objective assessment tools used, and the reason for not using assessment tools. The last section discusses the recommended tools to be used from the PDs' perspective and the PDs' attitude and motivation to apply these tools in each program. The associations between the responses to the assessment questions and remediation were statistically evaluated. Seventy-one (61%) participants responded. Of the respondents, 59% mentioned using only nonstandardized, subjective, direct observation for technical skills assessment. Sixty percent use only summative evaluation, whereas 15% perform only formative evaluations of their residents, and the remaining 22% conduct both summative and formative evaluations of their residents' technical skills. Operative portfolios are kept by 53% of programs. The percentage of programs with mechanisms for remediation is 29% (19 of 65). The survey showed that surgical training programs use different tools to assess surgical skills competency. Having a clear remediation mechanism was highly associated with reporting remediation, which reflects the capability to detect struggling residents. 
Surgical training leadership should invest more in standardizing the assessment of surgical skills.

  8. Multicomponent Musculoskeletal Movement Assessment Tools: A Systematic Review and Critical Appraisal of Their Development and Applicability to Professional Practice.

    PubMed

    Bennett, Hunter; Davison, Kade; Arnold, John; Slattery, Flynn; Martin, Max; Norton, Kevin

    2017-10-01

    Multicomponent movement assessment tools have become commonplace to measure movement quality, proposing to indicate injury risk and performance capabilities. Despite popular use, there has been no attempt to compare the components of each tool reported in the literature, the processes by which they were developed, or the underpinning rationale for their included content. As such, the objective of this systematic review was to provide a comprehensive summary of current movement assessment tools and appraise the evidence supporting their development. A systematic literature search was performed using PRISMA guidelines to identify multicomponent movement assessment tools. Commonalities between tools and the evidence provided to support the content of each tool were identified. Each tool underwent critical appraisal to identify the rigor with which it was developed, and its applicability to professional practice. Eleven tools were identified, of which 5 provided evidence to support their content as assessments of movement quality. One assessment tool (Soccer Injury Movement Screen [SIMS]) received an overall score of above 65% on critical appraisal, with a further 2 tools (Movement Competency Screen [MCS] and modified 4 movement screen [M4-MS]) scoring above 60%. Only the MCS provided clear justification for its developmental process. The remaining 8 tools scored between 40 and 60%. On appraisal, the MCS, M4-MS, and SIMS seem to provide the most practical value for assessing movement quality as they provide the strongest reports of developmental rigor and an identifiable evidence base. In addition, considering the evidence provided, these tools may have the strongest potential for identifying performance capabilities and guiding exercise prescription in athletic and sport-specific populations.

  9. Pigeons (Columba livia) as Trainable Observers of Pathology and Radiology Breast Cancer Images

    PubMed Central

    Levenson, Richard M.; Krupinski, Elizabeth A.; Navarro, Victor M.; Wasserman, Edward A.

    2015-01-01

    Pathologists and radiologists spend years acquiring and refining their medically essential visual skills, so it is of considerable interest to understand how this process actually unfolds and what image features and properties are critical for accurate diagnostic performance. Key insights into human behavioral tasks can often be obtained by using appropriate animal models. We report here that pigeons (Columba livia)—which share many visual system properties with humans—can serve as promising surrogate observers of medical images, a capability not previously documented. The birds proved to have a remarkable ability to distinguish benign from malignant human breast histopathology after training with differential food reinforcement; even more importantly, the pigeons were able to generalize what they had learned when confronted with novel image sets. The birds’ histological accuracy, like that of humans, was modestly affected by the presence or absence of color as well as by degrees of image compression, but these impacts could be ameliorated with further training. Turning to radiology, the birds proved to be similarly capable of detecting cancer-relevant microcalcifications on mammogram images. However, when given a different (and for humans quite difficult) task—namely, classification of suspicious mammographic densities (masses)—the pigeons proved to be capable only of image memorization and were unable to successfully generalize when shown novel examples. The birds’ successes and difficulties suggest that pigeons are well-suited to help us better understand human medical image perception, and may also prove useful in performance assessment and development of medical imaging hardware, image processing, and image analysis tools. PMID:26581091

  10. Pigeons (Columba livia) as Trainable Observers of Pathology and Radiology Breast Cancer Images.

    PubMed

    Levenson, Richard M; Krupinski, Elizabeth A; Navarro, Victor M; Wasserman, Edward A

    2015-01-01

    Pathologists and radiologists spend years acquiring and refining their medically essential visual skills, so it is of considerable interest to understand how this process actually unfolds and what image features and properties are critical for accurate diagnostic performance. Key insights into human behavioral tasks can often be obtained by using appropriate animal models. We report here that pigeons (Columba livia)-which share many visual system properties with humans-can serve as promising surrogate observers of medical images, a capability not previously documented. The birds proved to have a remarkable ability to distinguish benign from malignant human breast histopathology after training with differential food reinforcement; even more importantly, the pigeons were able to generalize what they had learned when confronted with novel image sets. The birds' histological accuracy, like that of humans, was modestly affected by the presence or absence of color as well as by degrees of image compression, but these impacts could be ameliorated with further training. Turning to radiology, the birds proved to be similarly capable of detecting cancer-relevant microcalcifications on mammogram images. However, when given a different (and for humans quite difficult) task-namely, classification of suspicious mammographic densities (masses)-the pigeons proved to be capable only of image memorization and were unable to successfully generalize when shown novel examples. The birds' successes and difficulties suggest that pigeons are well-suited to help us better understand human medical image perception, and may also prove useful in performance assessment and development of medical imaging hardware, image processing, and image analysis tools.

  11. Unidata Workshop: Demonstrating Democratization of Numerical Weather Prediction Capabilities Using Linked Environments for Atmospheric Discovery (LEAD) Capabilities

    NASA Astrophysics Data System (ADS)

    Baltzer, T.; Wilson, A.; Marru, S.; Rossi, A.; Christi, M.; Hampton, S.; Gannon, D.; Alameda, J.; Ramamurthy, M.; Droegemeier, K.

    2006-12-01

    On July 13th 2006 during the triannual Unidata Workshop, members of the Unidata community got their first experience with capabilities being developed under the Linked Environments for Atmospheric Discovery (LEAD) project (see: http://lead.ou.edu). The key LEAD goal demonstrated during the workshop was that of "Democratization," that is, providing capabilities that typically have a high barrier to entry to the larger meteorological community. At the workshop, participants worked with software that demonstrated the specific concepts of: 1) Lowering the barrier to entry by making it easy for users to: - Experiment using meteorological tools - Create meteorological forecasts - Perform mesoscale modeling and forecasting - Access data (source and product) - Make use of large scale cyberinfrastructure (e.g., TeraGrid) 2) Giving users freedom from technological issues such as: - Hassle-free access to supercomputing resources - Hassle-free execution of forecast models and related tools - Data format independence This talk will overview the capabilities presented to the Unidata workshop participants as well as capabilities developed since the workshop. There will also be a lessons-learned section. This overview will be accomplished with a live demonstration of some of the capabilities. Capabilities that will be discussed and demonstrated have applicability across many disciplines, e.g., discovering, acquiring, and using data, and orchestrating complex workflows. Acknowledgement: The LEAD project involves the work of nearly 100 individuals whose dedication has resulted in the capabilities that will be shown here. The authors would like to recognize all of them, but in particular we'd like to recognize: John Caron, Rich Clark, Ethan Davis, Charles Hart, Yuan Ho, Scott Jenson, Rob Kambic, Brian Kelly, Ning Liu, Jeff McWhirter, Don Murray, Beth Plale, Rahul Ramachandran, Yogesh Simmhan, Kevin Thomas, Nithya Vijayakumar, Yunheng Wang, Dan Weber, and Bob Wilhelmson.

  12. Advances in a distributed approach for ocean model data interoperability

    USGS Publications Warehouse

    Signell, Richard P.; Snowden, Derrick P.

    2014-01-01

    An infrastructure for earth science data is emerging across the globe based on common data models and web services. As we evolve from custom file formats and web sites to standards-based web services and tools, data is becoming easier to distribute, find and retrieve, leaving more time for science. We describe recent advances that make it easier for ocean model providers to share their data, and for users to search, access, analyze and visualize ocean data using MATLAB® and Python®. These include a technique for modelers to create aggregated, Climate and Forecast (CF) metadata convention datasets from collections of non-standard Network Common Data Form (NetCDF) output files, the capability to remotely access data from CF-1.6-compliant NetCDF files using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS), a metadata standard for unstructured grid model output (UGRID), and tools that utilize both CF and UGRID standards to allow interoperable data search, browse and access. We use examples from the U.S. Integrated Ocean Observing System (IOOS®) Coastal and Ocean Modeling Testbed, a project in which modelers using both structured and unstructured grid model output needed to share their results, to compare their results with other models, and to compare models with observed data. The same techniques used here for ocean modeling output can be applied to atmospheric and climate model output, remote sensing data, digital terrain and bathymetric data.
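
A minimal, dependency-free sketch of the two ideas in this abstract: aggregating per-file model output along the unlimited time dimension, and checking for the CF metadata attributes that interoperable search and access tools rely on. Real workflows would use netCDF4/xarray with NcML or THREDDS aggregation; the dict-based "files" and attribute set here are illustrative assumptions.

```python
# CF attributes that downstream tools commonly require (illustrative subset).
REQUIRED_CF_ATTRS = {"units", "standard_name"}

def check_cf(variable):
    """Return the required CF attributes missing from one variable's metadata."""
    return REQUIRED_CF_ATTRS - set(variable["attrs"])

def aggregate_time(files):
    """Concatenate per-file records along the 'time' dimension, in time order."""
    out = {"time": [], "data": []}
    for f in sorted(files, key=lambda rec: rec["time"][0]):
        out["time"].extend(f["time"])
        out["data"].extend(f["data"])
    return out

# Two hypothetical per-run output "files" for one model variable.
file_a = {"time": [0.0, 1.0], "data": [14.8, 15.1],
          "attrs": {"units": "degC", "standard_name": "sea_water_temperature"}}
file_b = {"time": [2.0, 3.0], "data": [15.3, 15.0],
          "attrs": {"units": "degC", "standard_name": "sea_water_temperature"}}

merged = aggregate_time([file_b, file_a])  # input order does not matter
print(merged["time"])    # [0.0, 1.0, 2.0, 3.0]
print(check_cf(file_a))  # empty set: CF-complete
```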

  13. A Portable Gait Asymmetry Rehabilitation System for Individuals with Stroke Using a Vibrotactile Feedback.

    PubMed

    Afzal, Muhammad Raheel; Oh, Min-Kyun; Lee, Chang-Hee; Park, Young Sook; Yoon, Jungwon

    2015-01-01

    Gait asymmetry caused by hemiparesis results in reduced gait efficiency and reduced activity levels. In this paper, a portable rehabilitation device is proposed that can serve as a tool in diagnosing gait abnormalities in individuals with stroke and has the capability of providing vibration feedback to help compensate for the asymmetric gait. Force-sensitive resistor (FSR) based insoles are used to detect ground contact and estimate stance time. A controller (Arduino) provides different vibration feedback based on the gait phase measurement. It also allows wireless interaction with a personal computer (PC) workstation using the XBee transceiver module, featuring data logging capabilities for subsequent analysis. Walking trials conducted with healthy young subjects allowed us to observe that the system can influence abnormality in the gait. The results of trials showed that a vibration cue based on temporal information was more effective than intensity information. With clinical experiments conducted for individuals with stroke, significant improvement in gait symmetry was observed with minimal disturbance caused to the balance and gait speed as an effect of the biofeedback. Future studies of the long-term rehabilitation effects of the proposed system and further improvements to the system will result in an inexpensive, easy-to-use, and effective rehabilitation device.
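
A sketch (not the authors' firmware) of the symmetry logic this abstract describes: estimate stance time per leg from FSR contact events, form a symmetry ratio, and choose a vibration cue. The threshold and cue names are illustrative assumptions.

```python
def stance_times(contacts):
    """contacts: list of (heel_strike_s, toe_off_s) pairs from an FSR insole."""
    return [toe_off - strike for strike, toe_off in contacts]

def symmetry_ratio(paretic, non_paretic):
    """Mean paretic stance time over mean non-paretic stance time; 1.0 = symmetric."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(paretic) / mean(non_paretic)

def vibration_cue(ratio, tolerance=0.10):
    """Temporal cue: vibrate the side whose stance phase should lengthen."""
    if ratio < 1.0 - tolerance:
        return "cue_paretic_side"
    if ratio > 1.0 + tolerance:
        return "cue_nonparetic_side"
    return "no_cue"

paretic = stance_times([(0.0, 0.4), (1.5, 1.9)])      # 0.4 s per step
non_paretic = stance_times([(0.7, 1.3), (2.2, 2.8)])  # 0.6 s per step
ratio = symmetry_ratio(paretic, non_paretic)
print(round(ratio, 2), vibration_cue(ratio))  # 0.67 cue_paretic_side
```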

  14. A Portable Gait Asymmetry Rehabilitation System for Individuals with Stroke Using a Vibrotactile Feedback

    PubMed Central

    Afzal, Muhammad Raheel; Oh, Min-Kyun; Lee, Chang-Hee; Park, Young Sook; Yoon, Jungwon

    2015-01-01

    Gait asymmetry caused by hemiparesis results in reduced gait efficiency and reduced activity levels. In this paper, a portable rehabilitation device is proposed that can serve as a tool in diagnosing gait abnormalities in individuals with stroke and has the capability of providing vibration feedback to help compensate for the asymmetric gait. Force-sensitive resistor (FSR) based insoles are used to detect ground contact and estimate stance time. A controller (Arduino) provides different vibration feedback based on the gait phase measurement. It also allows wireless interaction with a personal computer (PC) workstation using the XBee transceiver module, featuring data logging capabilities for subsequent analysis. Walking trials conducted with healthy young subjects allowed us to observe that the system can influence abnormality in the gait. The results of trials showed that a vibration cue based on temporal information was more effective than intensity information. With clinical experiments conducted for individuals with stroke, significant improvement in gait symmetry was observed with minimal disturbance caused to the balance and gait speed as an effect of the biofeedback. Future studies of the long-term rehabilitation effects of the proposed system and further improvements to the system will result in an inexpensive, easy-to-use, and effective rehabilitation device. PMID:26161398

  15. Game-Based Approaches' Pedagogical Principles: Exploring Task Constraints in Youth Soccer.

    PubMed

    Serra-Olivares, Jaime; González-Víllora, Sixto; García-López, Luis Miguel; Araújo, Duarte

    2015-06-27

    This study tested the use of two pedagogical principles of Game-based approaches, representation and exaggeration, in the context of game performance of U10 soccer players. Twenty-one players participated in two 3 vs. 3 small-sided games. The first small-sided game was modified by representation. The second small-sided game was modified by enhancing the penetration of the defense tactical problem for invasion games. Decision-making and execution were assessed using the Game Performance Evaluation Tool. No significant differences were observed between games in the number of decision-making units related to keeping possession, nor in those related to penetrating the defense. No significant differences were observed in any execution ability (ball control, passing, dribbling and get free movements). The findings suggested that both games could provide similar degeneracy processes to the players for skill acquisition (specific and contextualized task constraints in which they could develop their game performance and the capability to achieve different outcomes in varying contexts). Probably both games had similar learner-environment dynamics leading players to develop their capabilities for adapting their behaviours to the changing performance situations. More research is necessary, from the ecological dynamics point of view, to determine how we should use small-sided games in Game-based approaches.

  16. On the potential of the EChO mission to characterize gas giant atmospheres

    NASA Astrophysics Data System (ADS)

    Barstow, J. K.; Aigrain, S.; Irwin, P. G. J.; Bowles, N.; Fletcher, L. N.; Lee, J.-M.

    2013-04-01

    Space telescopes such as Exoplanet Characterisation Observatory (EChO) and James Webb Space Telescope (JWST) will be important for the future study of extrasolar planet atmospheres. Both of these missions are capable of performing high sensitivity spectroscopic measurements at moderate resolutions in the visible and infrared, which will allow the characterization of atmospheric properties using primary and secondary transit spectroscopy. We use the Non-linear optimal Estimator for MultivariatE spectral analySIS (NEMESIS) radiative transfer and retrieval tool, as developed by Irwin et al. and Lee et al., to explore the potential of the proposed EChO mission to solve the retrieval problem for a range of H2-He planets orbiting different stars. We find that EChO should be capable of retrieving temperature structure to ˜200 K precision and detecting H2O, CO2 and CH4 from a single eclipse measurement for a hot Jupiter orbiting a Sun-like star and a hot Neptune orbiting an M star, also providing upper limits on CO and NH3. We provide a table of retrieval precisions for these quantities in each test case. We expect around 30 Jupiter-sized planets to be observable by EChO; hot Neptunes orbiting M dwarfs are rarer, but we anticipate observations of at least one similar planet.
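
A worked example of the kind of observable such retrievals start from: the secondary-eclipse depth of a hot Jupiter, approximated as the planet-to-star area ratio times the ratio of their Planck emission. The temperatures and radii below are generic hot-Jupiter/Sun-like values for illustration, not numbers from this paper.

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

def planck(wavelength_m, T):
    """Planck spectral radiance B_lambda(T)."""
    x = H * C / (wavelength_m * KB * T)
    return (2 * H * C**2 / wavelength_m**5) / math.expm1(x)

def eclipse_depth(wavelength_m, T_p, T_s, r_p, r_s):
    """Thermal secondary-eclipse depth: (Rp/Rs)^2 * B(Tp)/B(Ts)."""
    return (r_p / r_s) ** 2 * planck(wavelength_m, T_p) / planck(wavelength_m, T_s)

# ~1500 K planet of Jupiter radius around a 5800 K Sun-like star, at 10 microns.
depth = eclipse_depth(10e-6, 1500.0, 5800.0, 7.15e7, 6.96e8)
print(f"{depth:.2e}")  # of order 1e-3, i.e. a ~0.2% dip in the infrared
```

Depths of this size are why infrared eclipse spectroscopy is feasible for EChO- and JWST-class photometric precision.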

  17. Game-Based Approaches’ Pedagogical Principles: Exploring Task Constraints in Youth Soccer

    PubMed Central

    Serra-Olivares, Jaime; González-Víllora, Sixto; García-López, Luis Miguel; Araújo, Duarte

    2015-01-01

    This study tested the use of two pedagogical principles of Game-based approaches, representation and exaggeration, in the context of game performance of U10 soccer players. Twenty-one players participated in two 3 vs. 3 small-sided games. The first small-sided game was modified by representation. The second small-sided game was modified by enhancing the penetration of the defense tactical problem for invasion games. Decision-making and execution were assessed using the Game Performance Evaluation Tool. No significant differences were observed between games in the number of decision-making units related to keeping possession, nor in those related to penetrating the defense. No significant differences were observed in any execution ability (ball control, passing, dribbling and get free movements). The findings suggested that both games could provide similar degeneracy processes to the players for skill acquisition (specific and contextualized task constraints in which they could develop their game performance and the capability to achieve different outcomes in varying contexts). Probably both games had similar learner-environment dynamics leading players to develop their capabilities for adapting their behaviours to the changing performance situations. More research is necessary, from the ecological dynamics point of view, to determine how we should use small-sided games in Game-based approaches. PMID:26240668

  18. Prosthetic Hand For Holding Rods, Tools, And Handles

    NASA Technical Reports Server (NTRS)

    Belcher, Jewell G., Jr.; Vest, Thomas W.

    1995-01-01

    Prosthetic hand with quick-grip/quick-release lever broadens range of specialized functions available to lower-arm amputee by providing improved capabilities for gripping rods, tools, handles, and the like. Includes two stationary lower fingers opposed by one pivoting upper finger. Lever operates in conjunction with attached bracket.

  19. Turning information into knowledge for rangeland management

    USDA-ARS?s Scientific Manuscript database

    The kind of knowledge system that will be capable of meeting the needs of rangeland managers will evolve as scientists, technology specialists, managers, and biologists find ways to integrate the ever expanding array of information systems and tools to meet their needs. The tools and techniques high...

  20. Evaluating Digital Authoring Tools

    ERIC Educational Resources Information Center

    Wilde, Russ

    2004-01-01

    As the quality of authoring software increases, online course developers become less reliant on proprietary learning management systems, and develop skills in the design of original, in-house materials and the delivery platforms for them. This report examines the capabilities of digital authoring software tools for the development of learning…

  1. Towards improved capability and confidence in coupled atmospheric and wildland fire modeling

    NASA Astrophysics Data System (ADS)

    Sauer, Jeremy A.

    This dissertation work is aimed at improving the capability and confidence in a modernized and improved version of Los Alamos National Laboratory's coupled atmospheric and wildland fire dynamics model, Higrad-Firetec. Higrad is the hydrodynamics component of this large eddy simulation model that solves the three dimensional, fully compressible Navier-Stokes equations, incorporating a dynamic eddy viscosity formulation through a two-scale turbulence closure scheme. Firetec is the vegetation, drag forcing, and combustion physics portion that is integrated with Higrad. The modern version of Higrad-Firetec incorporates multiple numerical methodologies and high performance computing aspects which combine to yield a unique tool capable of augmenting theoretical and observational investigations in order to better understand the multi-scale, multi-phase, and multi-physics phenomena involved in coupled atmospheric and environmental dynamics. More specifically, the current work includes extended functionality and validation efforts targeting component processes in coupled atmospheric and wildland fire scenarios. Since observational data of sufficient quality and resolution to validate the fully coupled atmosphere-wildfire scenario simply does not exist, we instead seek to validate components of the full, prohibitively convoluted process. This manuscript provides, first, an introduction and background into the application space of Higrad-Firetec. Second, we document the model formulation, solution procedure, and a simple scalar transport verification exercise. Third, we validate model results against observational data for time-averaged flow field metrics in and above four idealized forest canopies. Fourth, we carry out a validation effort for the non-buoyant jet in a crossflow scenario (to which an analogy can be made for atmosphere-wildfire interactions), comparing model results to laboratory data of both steady-in-time and unsteady-in-time metrics. 
Finally, an extension of the model's multi-phase physics is implemented, allowing multiple collocated fuels to be represented as separately evolving constituents, leading to differences in the resulting rate of spread and total burned area. In combination, these efforts demonstrate improved capability, increased validation of component functionality, and the unique applicability of the Higrad-Firetec modeling framework. As a result, this work provides a substantially more robust foundation for future investigations into the complexities of coupled atmospheric and wildland fire behavior.
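
A hypothetical miniature of the scalar-transport verification idea mentioned above: advect a scalar with a first-order upwind scheme on a periodic 1-D grid and confirm that total scalar mass is discretely conserved. The grid, scheme, and sizes are illustrative, far simpler than Higrad-Firetec's solver.

```python
def upwind_step(q, u, dt, dx):
    """One periodic first-order upwind update for constant u > 0."""
    c = u * dt / dx  # CFL number; c <= 1 is required for stability
    return [q[i] - c * (q[i] - q[i - 1]) for i in range(len(q))]

n, dx = 64, 1.0 / 64
u, dt = 1.0, 0.5 / 64          # CFL = 0.5
q = [1.0 if 16 <= i < 32 else 0.0 for i in range(n)]  # square pulse
mass0 = sum(q) * dx
for _ in range(200):
    q = upwind_step(q, u, dt, dx)
print(abs(sum(q) * dx - mass0))  # ~0: mass is conserved to roundoff
```

Conservation follows because the periodic flux differences telescope; a verification exercise checks exactly such discrete invariants before tackling coupled physics.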

  2. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. 
We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
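
A sketch of what "model-based" means here: a forward model of normal-incidence reflectance for a single transparent film on a substrate, the kind of interference model a dispersion-model fit inverts to recover thickness. Indices are illustrative (SiO2-like film on Si), with absorption and dispersion ignored; this is not the tool's actual algorithm.

```python
import cmath, math

def film_reflectance(wavelength_nm, d_nm, n_film=1.46, n_sub=3.42, n_amb=1.0):
    """Airy reflectance of one film: interference of the two interface reflections."""
    r12 = (n_amb - n_film) / (n_amb + n_film)   # ambient/film Fresnel coefficient
    r23 = (n_film - n_sub) / (n_film + n_sub)   # film/substrate Fresnel coefficient
    beta = 2 * math.pi * n_film * d_nm / wavelength_nm  # phase thickness
    phase = cmath.exp(-2j * beta)
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return abs(r) ** 2

# Fringes versus wavelength encode the film thickness (here 500 nm):
for wl in (400, 500, 600, 700):
    print(wl, round(film_reflectance(wl, d_nm=500.0), 3))
```

Fitting measured fringes with such a forward model (plus dispersion models for each material) is what lets one spectrum yield several geometric parameters at once.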

  3. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. 
A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
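
An illustrative sketch of the data-driven idea: the automated sequence and its transitions live in data (here a dict standing in for the database tool's export), so sequences can be reconfigured without recompiling flight software. The activity names, events, and fields are hypothetical, not Orion's actual schema.

```python
# Sequence definition as pure data: start state, activities, and
# event-driven transitions. Editing this dict reconfigures behavior
# without touching the executor code below.
SEQUENCE = {
    "start": "coast",
    "activities": {
        "coast": {"action": "hold_attitude", "next_on": {"burn_cmd": "burn"}},
        "burn":  {"action": "execute_burn",  "next_on": {"burn_done": "coast"}},
    },
}

def run(sequence, events):
    """Walk the sequence, switching activities on data-defined events."""
    state, trace = sequence["start"], []
    for event in events:
        trace.append((state, sequence["activities"][state]["action"]))
        state = sequence["activities"][state]["next_on"].get(event, state)
    return trace, state

trace, final = run(SEQUENCE, ["burn_cmd", "burn_done"])
print(trace)  # [('coast', 'hold_attitude'), ('burn', 'execute_burn')]
print(final)  # 'coast'
```

Because the executor never mentions specific activities, certifying the code once and re-validating only the data is what makes this style attractive for mission design and operations.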

  4. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
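
A hedged sketch of the kind of calculation such fatigue codes perform: Paris-law crack growth, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa), integrated in cycle blocks until the crack reaches a critical length. The material constants are generic aluminum-alloy-like values for illustration, not data from NASA's codes.

```python
import math

def cycles_to_length(a0, a_crit, d_sigma, Y=1.0, C=1e-11, m=3.0, block=1000):
    """Load cycles to grow a crack from a0 to a_crit (lengths in m, stress in MPa)."""
    a, n = a0, 0
    while a < a_crit:
        dK = Y * d_sigma * math.sqrt(math.pi * a)  # stress-intensity range, MPa*sqrt(m)
        a += block * C * dK ** m                   # advance one block of cycles
        n += block
    return n

# Grow a 1 mm crack to 10 mm under a 100 MPa stress range.
n = cycles_to_length(a0=1e-3, a_crit=10e-3, d_sigma=100.0)
print(n)  # on the order of 10^5-10^6 cycles for these inputs
```

Production codes replace the hand-coded ΔK with tabulated or computed stress-intensity factors for the actual built-up geometry, which is exactly the "look-up table" capability the abstract describes.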

  5. Heart Rhythm Monitoring in the Constellation Lunar and Launch/Landing EVA Suit: Recommendations from an Expert Panel

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.; Hamilton, Doug; Jones, Jeffrey A.; Alexander, David

    2009-01-01

    There are currently several physiological monitoring requirements for EVA in the Human-Systems Interface Requirements (HSIR) document. There are questions as to whether the capability to monitor heart rhythm in the lunar surface space suit is a necessary capability for lunar surface operations. Similarly, there are questions as to whether the capability to monitor heart rhythm during a cabin depressurization scenario in the launch/landing space suit is necessary. This presentation seeks to inform space medicine personnel of recommendations made by an expert panel of cardiovascular medicine specialists regarding in-suit ECG heart rhythm monitoring requirements during lunar surface operations. After a review of demographic information and clinical cases and panel discussion, the panel recommended that ECG monitoring capability as a clinical tool was not essential in the lunar space suit; ECG monitoring was not essential in the launch/landing space suit for contingency scenarios; the current heart rate monitoring capability requirement for both launch/landing and lunar space suits should be maintained; lunar vehicles should be required to have ECG monitoring capability with a minimum of 5-lead ECG for IVA medical assessments; and, exercise stress testing for astronaut selection and retention should be changed from the current 85% maximum heart rate limit to maximal, exhaustive 'symptom-limited' testing to maximize diagnostic utility as a screening tool for evaluating the functional capacity of astronauts and their cardiovascular health.

  6. Planetary plasma data analysis and 3D visualisation tools of the CDPP in the IMPEx infrastructure

    NASA Astrophysics Data System (ADS)

    Gangloff, Michel; Génot, Vincent; Khodachenko, Maxim; Modolo, Ronan; Kallio, Esa; Alexeev, Igor; Al-Ubaidi, Tarek; Scherf, Manuel; André, Nicolas; Bourrel, Nataliya; Budnik, Elena; Bouchemit, Myriam; Dufourg, Nicolas; Beigbeder, Laurent

    2015-04-01

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP has developed services such as AMDA (http://amda.cdpp.eu/), which enables in-depth analysis of large amounts of data through dedicated functionalities (visualization, conditional search, cataloguing), and 3DView (http://3dview.cdpp.eu/), which provides immersive visualisations in planetary environments and is being further developed to include simulation and observational data. Both tools provide an interface to the IMPEx infrastructure (http://impexfp7.oeaw.ac.at), which facilitates joint access to outputs of simulations (MHD or hybrid models) in planetary sciences from providers such as LATMOS and FMI, as well as to planetary plasma observational data provided by the CDPP. Several magnetospheric models are implemented in 3DView (e.g., Tsyganenko for the Earth and Cain for Mars). Magnetospheric models provided by SINP for the Earth, Jupiter, Saturn, and Mercury, as well as Hess models for Jupiter, can also be used in 3DView through the IMPEx infrastructure. A use case demonstrating the new capabilities offered by these tools and their interaction, including magnetospheric models, will be presented together with the IMPEx simulation metadata model used for the interface to simulation databases and model providers.

  7. The GLOBAL Learning and Observations to Benefit the Environment (GLOBE) Data Visualization and Retrieval System. Building a robust system for scientists and students.

    NASA Astrophysics Data System (ADS)

    Overoye, D.; Lewis, C.; Butler, D. M.; Andersen, T. J.

    2016-12-01

    The Global Learning and Observations to Benefit the Environment (GLOBE) Program is a worldwide hands-on, primary and secondary school-based science and education program founded on Earth Day 1995. Implemented in 117 countries, GLOBE promotes the teaching and learning of science, supporting students, teachers and scientists worldwide to collaborate with each other on inquiry-based investigations of the Earth system. The GLOBE Data Information System (DIS) currently supports users with the ability to enter data from over 50 different science protocols. GLOBE's Data Access and Visualization tools have been developed to accommodate the need to display and retrieve data from this large number of protocols. The community of users is also diverse, including NASA scientists, citizen scientists and grade school students. The challenge for GLOBE is to meet the needs of this diverse set of users with protocol-specific displays that are simple enough for a GLOBE school to use, but that also provide enough features for a NASA scientist to retrieve the data sets they are interested in. During the last 3 years, the GLOBE visualization system has evolved to meet the needs of these various users, leveraging user feedback and technological advances. Further refinements and enhancements continue. In this session we review the design and capabilities of the GLOBE visualization and data retrieval tool set, discuss the evolution of these tools, and discuss coming directions.

  8. Adult English Language Learners' Perceptions of Audience Response Systems (Clickers) as Communication Aides: A Q-Methodology Study

    ERIC Educational Resources Information Center

    Rodriguez, Lisa Ann; Shepard, MaryFriend

    2013-01-01

    This study explored the perceptions of adult English language learners about audience response systems (clickers) as tools to facilitate communication. According to second language acquisition theory, learners' receptive capabilities in the early stages of second language acquisition surpass expressive capabilities, often rendering them silent in…

  9. Will Robots Ever Replace Attendants? Exploring the Current Capabilities and Future Potential of Robots in Education and Rehabilitation.

    ERIC Educational Resources Information Center

    Lees, David; LePage, Pamela

    1994-01-01

    This article describes the current capabilities and future potential of robots designed as supplements or replacements for human assistants or as tools for education and rehabilitation of people with disabilities. Review of robots providing educational, vocational, or independent living assistance concludes that eventually effective, reliable…

  10. Predictive tool for estimating the potential effect of water fluoridation on dental caries.

    PubMed

    Foster, G R K; Downer, M C; Lunt, M; Aggarwal, V; Tickle, M

    2009-03-01

    To provide a tool for public health planners to estimate the potential improvement in dental caries in children that might be expected in a region if its water supply were to be fluoridated. Recent BASCD (British Association for the Study of Community Dentistry) dental epidemiological data for caries in 5- and 11-year-old children in English primary care trusts in fluoridated and non-fluoridated areas were analysed to estimate absolute and relative improvement in dmft/DMFT and caries-free measures observed in England. Where data were sufficient for testing significance, this analysis included the effect of different levels of deprivation. A table of observed improvements was produced, together with an example of how that table can be used as a tool for estimating the expected improvement in caries in any specific region of England. Observed absolute improvements and 95% confidence intervals were: for 5-year-olds, reduction in mean dmft 0.56 (0.38, 0.74) for IMD 12, 0.73 (0.60, 0.85) for IMD 20, and 0.94 (0.76, 1.12) for IMD 30, with 12% (9%, 14%) more children free of caries; for 11-year-olds, reduction in mean DMFT 0.12 (0.04, 0.20) for IMD 12, 0.19 (0.13, 0.26) for IMD 20, and 0.29 (0.18, 0.40) for IMD 30, with 8% (5%, 11%) more children free from caries. The BASCD data taken together with a deprivation measure are capable of yielding an age-specific, 'intention to treat' model of water fluoridation that can be used to estimate the potential effect on caries levels of a notional new fluoridation scheme in an English region.
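The look-up-table usage the abstract describes can be sketched with the 5-year-old point estimates it reports (dmft reductions of 0.56 at IMD 12, 0.73 at IMD 20, 0.94 at IMD 30). The linear interpolation and the clamping at the table ends are our illustrative assumptions, not the paper's published method, and the confidence intervals are omitted:

```python
# Point estimates from the abstract: (IMD deprivation score, mean dmft reduction)
TABLE = [(12, 0.56), (20, 0.73), (30, 0.94)]

def expected_dmft_reduction(imd):
    """Linearly interpolate the expected reduction, clamping at the table ends."""
    if imd <= TABLE[0][0]:
        return TABLE[0][1]
    if imd >= TABLE[-1][0]:
        return TABLE[-1][1]
    for (x0, y0), (x1, y1) in zip(TABLE, TABLE[1:]):
        if x0 <= imd <= x1:
            return y0 + (y1 - y0) * (imd - x0) / (x1 - x0)

print(round(expected_dmft_reduction(25), 3))  # -> 0.835, midway between 0.73 and 0.94
```

A planner would apply the region's IMD score to obtain the expected caries improvement under a notional fluoridation scheme.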

  11. Product ion isotopologue pattern: A tool to improve the reliability of elemental composition elucidations of unknown compounds in complex matrices.

    PubMed

    Kaufmann, A; Walker, S; Mol, G

    2016-04-15

    Elucidation of the elemental compositions of unknown compounds (e.g., in metabolomics) generally relies on the availability of accurate masses and isotopic ratios. This study focuses on the information provided by the abundance ratio within a product ion pair (monoisotopic versus the first isotopic peak) when isolating and fragmenting the first isotopic ion (first isotopic mass spectrum) of the precursor. This process relies on the capability of the quadrupole within the Q Orbitrap instrument to isolate a very narrow mass window. Selecting only the first isotopic peak (first isotopic mass spectrum) leads to the observation of a unique product ion pair. The lighter ion within such an isotopologue pair is monoisotopic, while the heavier ion contains a single carbon isotope. The observed abundance ratio is governed by the percentage of carbon atoms lost during the fragmentation and can be described by a hypergeometric distribution. The observed carbon isotopologue abundance ratio (product ion isotopologue pattern) gives reliable information regarding the percentage of carbon atoms lost in the fragmentation process. It therefore facilitates the elucidation of the involved precursor and product ions. Unlike conventional isotopic abundances, the product ion isotopologue pattern is hardly affected by isobaric interferences. Furthermore, the appearance of these pairs greatly aids in cleaning up a 'matrix-contaminated' product ion spectrum. The product ion isotopologue pattern is a valuable tool for structural elucidation. It increases confidence in results and permits structural elucidations for heavier ions. This tool is also very useful in elucidating the elemental composition of product ions. Such information is highly valued in the field of multi-residue analysis, where the accurate mass of product ions is required for the confirmation process. Copyright © 2016 John Wiley & Sons, Ltd.
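The hypergeometric relationship the abstract invokes reduces to simple arithmetic in the one-marked-atom case: if the isolated first-isotopic precursor has exactly one 13C among its n carbons, a product ion retaining m of those carbons keeps the 13C with probability m/n, so the monoisotopic and first-isotopic product ions appear in ratio (n-m):m. The function names and the 20-carbon example below are illustrative, not taken from the paper:

```python
from fractions import Fraction

def isotopologue_ratio(n_carbons, m_retained):
    """For a first-isotopic precursor with one 13C among n carbons, a product
    ion retaining m carbons keeps the 13C with probability m/n (hypergeometric
    distribution with one marked atom).  Returns the expected relative
    abundances (P_monoisotopic, P_first_isotopic) of the product ion pair."""
    p_iso = Fraction(m_retained, n_carbons)
    return 1 - p_iso, p_iso

def carbons_retained(n_carbons, observed_iso_fraction):
    """Invert: estimate carbons retained from an observed isotopic fraction."""
    return round(n_carbons * observed_iso_fraction)

# Hypothetical example: a 20-carbon precursor losing 5 carbons on fragmentation.
mono, iso = isotopologue_ratio(20, 15)
print(mono, iso)  # 1/4 3/4, i.e. the pair appears in a 1:3 abundance ratio
```

Measuring the pair's abundance ratio and inverting it in this way is what constrains the carbon count of the product ion during elemental-composition elucidation.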

  12. Multidisciplinary analysis and design of printed wiring boards

    NASA Astrophysics Data System (ADS)

    Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin

    1991-04-01

    Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.

  13. ISAC: A tool for aeroservoelastic modeling and analysis

    NASA Technical Reports Server (NTRS)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules is discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrates some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  14. Mars Trek: An Interactive Web Portal for Current and Future Missions to Mars

    NASA Technical Reports Server (NTRS)

    Law, E.; Day, B.

    2017-01-01

    NASA's Mars Trek (https://marstrek.jpl.nasa.gov) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped data products from past and current missions to Mars. During the past year, the capabilities and data served by Mars Trek have been significantly expanded beyond its original design as a public outreach tool. At the request of NASA's Science Mission Directorate and Human Exploration Operations Mission Directorate, Mars Trek's technology and capabilities are now being extended to support site selection and analysis activities for the first human missions to Mars.

  15. An automated tool joint inspection device for the drill string

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moyer, M.C.; Dale, B.A.; Kusenberger, F.N.

    1983-02-01

    This paper discusses the development of an automated tool joint inspection device (i.e., the Fatigue Crack Detector), which is capable of detecting defects in the threaded region of drill pipe and drill collars. On the basis of inspection tests conducted at a research test facility and at drilling rig sites, this device is capable of detecting both simulated defects (saw slots and drilled holes) and service-induced defects, such as fatigue cracks, pin stretch (plastic deformation), mashed threads, and corrosion pitting. The system employs an electromagnetic flux-leakage principle and has several advantages over the conventional method of magnetic particle inspection.

  16. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  17. Predictive Capability Maturity Model (PCMM).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Knupp, Patrick Michael; Urbina, Angel

    2010-10-01

    Predictive Capability Maturity Model (PCMM) is a communication tool that must include a discussion of the supporting evidence. PCMM is a tool for managing risk in the use of modeling and simulation. PCMM is in the service of organizing evidence to help tell the modeling and simulation (M&S) story. The PCMM table describes what activities within each element are undertaken at each of the levels of maturity. Target levels of maturity can be established based on the intended application. The assessment is to inform what level has been achieved compared to the desired level, to help prioritize the VU activities, and to allocate resources.

  18. Diverter AI based decision aid, phases 1 and 2

    NASA Technical Reports Server (NTRS)

    Sexton, George A.; Bayles, Scott J.; Patterson, Robert W.; Schulke, Duane A.; Williams, Deborah C.

    1989-01-01

    It was determined that a system to incorporate artificial intelligence (AI) into airborne flight management computers is feasible. The AI functions that would be most useful to the pilot are to perform situational assessment, evaluate outside influences on the contemplated rerouting, perform flight planning/replanning, and perform maneuver planning. A study of the software architecture and software tools capable of demonstrating Diverter was also conducted. A skeletal planner known as the Knowledge Acquisition Development Tool (KADET), which is a combination script-based and rule-based system, was used to implement the system. A prototype system was developed which demonstrates advanced in-flight planning/replanning capabilities.
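A script/rule-based planner of the kind KADET represents can be sketched in a few lines. The rules, situation fields, and actions below are entirely hypothetical, intended only to show the ordered condition-action structure, not the Diverter system's actual knowledge base:

```python
# Hypothetical rule base: ordered (condition, action) pairs; the first rule
# whose condition matches the assessed situation fires.
RULES = [
    (lambda s: s["destination_weather"] == "below_minimums",
     "replan: divert to alternate airfield"),
    (lambda s: s["fuel_lb"] < s["reserve_lb"],
     "replan: nearest suitable airfield"),
    (lambda s: True, "continue: current flight plan"),   # default rule
]

def assess(situation):
    """Return the action of the first rule whose condition matches."""
    for condition, action in RULES:
        if condition(situation):
            return action

print(assess({"destination_weather": "vmc", "fuel_lb": 4000, "reserve_lb": 5000}))
# -> replan: nearest suitable airfield
```

A real in-flight replanner would chain such rules with scripts that generate and evaluate candidate routes rather than returning fixed strings.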

  19. Mars Trek: An Interactive Web Portal for Current and Future Missions to Mars

    NASA Astrophysics Data System (ADS)

    Law, E.; Day, B.

    2017-09-01

    NASA's Mars Trek (https://marstrek.jpl.nasa.gov) provides a web-based Portal and a suite of interactive visualization and analysis tools to enable mission planners, lunar scientists, and engineers to access mapped data products from past and current missions to Mars. During the past year, the capabilities and data served by Mars Trek have been significantly expanded beyond its original design as a public outreach tool. At the request of NASA's Science Mission Directorate and Human Exploration Operations Mission Directorate, Mars Trek's technology and capabilities are now being extended to support site selection and analysis activities for the first human missions to Mars.

  20. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translator. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  1. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translator. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  2. EMU Battery/module Service Tool Characterization Study

    NASA Technical Reports Server (NTRS)

    Palandati, C. F.

    1984-01-01

    The power tool that will be used to replace the attitude control system in the SMM spacecraft is being modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery, a silver-zinc battery, was tested for the power tool application. The results obtained during testing show that the EMU battery is capable of operating the power tool within the pulse current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.

  3. Simulation services and analysis tools at the CCMC to study multi-scale structure and dynamics of Earth's magnetopause

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.

    2016-12-01

    The presentation will provide an overview of the new tools, services, and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate MMS dayside results analysis. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and on opportunities for on-line visualization and analysis of results of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, and particle distribution moments, as well as particle distribution functions calculated in selected regions in the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest and to fly through simulated magnetic reconnection configurations with a map of distributions to facilitate comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al., 2016), which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML-5 visualization for each time step processed. To validate the capability of global magnetosphere models to simulate the locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.

  4. NASA's Aeroacoustic Tools and Methods for Analysis of Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.

    2015-01-01

    Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The ability to quantify aircraft noise at the source and ultimately at observers is required to develop low noise aircraft designs and flight procedures. Predicting noise at the source, accounting for scattering and propagation through the atmosphere to the observer, and assessing the perception and impact on a community requires physics-based aeroacoustics tools. Along with the analyses for aero-performance, weights and fuel burn, these tools can provide the acoustic component for aircraft MDAO (Multidisciplinary Design Analysis and Optimization). Over the last decade significant progress has been made in advancing the aeroacoustic tools such that acoustic analyses can now be performed during the design process. One major and enabling advance has been the development of the system noise framework known as Aircraft NOise Prediction Program2 (ANOPP2). ANOPP2 is NASA's aeroacoustic toolset and is designed to facilitate the combination of acoustic approaches of varying fidelity for the analysis of noise from conventional and unconventional aircraft. The toolset includes a framework that integrates noise prediction and propagation methods into a unified system for use within general aircraft analysis software. This includes acoustic analyses, signal processing and interfaces that allow for the assessment of perception of noise on a community. ANOPP2's capability to incorporate medium fidelity shielding predictions and wind tunnel experiments into a design environment is presented. An assessment of noise from a conventional and Hybrid Wing Body (HWB) aircraft using medium fidelity scattering methods combined with noise measurements from a model-scale HWB recently placed in NASA's 14x22 wind tunnel are presented. 
The results are in the form of community noise metrics and auralizations.

  5. The Coronal Analysis of SHocks and Waves (CASHeW) framework

    NASA Astrophysics Data System (ADS)

    Kozarev, Kamen A.; Davey, Alisdair; Kendrick, Alexander; Hammer, Michael; Keith, Celeste

    2017-11-01

    Coronal bright fronts (CBFs) are large-scale wavelike disturbances in the solar corona, related to solar eruptions. They are observed (mostly in extreme ultraviolet (EUV) light) as transient bright fronts of finite width, propagating away from the eruption source location. Recent studies of individual solar eruptive events have used EUV observations of CBFs and metric radio type II burst observations to show the intimate connection between waves in the low corona and coronal mass ejection (CME)-driven shocks. EUV imaging with the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory has proven particularly useful for detecting large-scale short-lived CBFs, which, combined with radio and in situ observations, holds great promise for early CME-driven shock characterization capability. This characterization can further be automated and related to models of particle acceleration to produce estimates of particle fluxes in the corona and in the near-Earth environment early in events. We present the Coronal Analysis of SHocks and Waves (CASHeW) framework. It combines analysis of NASA Heliophysics System Observatory data products and relevant data-driven models into an automated system for the characterization of off-limb coronal waves and shocks and the evaluation of their capability to accelerate solar energetic particles (SEPs). The system utilizes EUV observations and models written in the Interactive Data Language (IDL). In addition, it leverages analysis tools from the SolarSoft package of libraries, as well as third-party libraries. We have tested the CASHeW framework on a representative list of coronal bright front events. Here we present its features, as well as initial results. With this framework, we hope to contribute to the overall understanding of coronal shock waves and their importance for energetic particle acceleration, as well as to a better ability to forecast SEP event fluxes.

  6. Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meisner, Robert; McCoy, Michel; Archer, Bill

    2013-09-11

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP.
Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.

  7. Ganalyzer: A Tool for Automatic Galaxy Image Analysis

    NASA Astrophysics Data System (ADS)

    Shamir, Lior

    2011-08-01

    We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
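The radial-intensity idea at Ganalyzer's core can be sketched in a simplified form: sample intensity on circles of increasing radius, find the brightest angle on each circle, and use the drift of that peak angle with radius as a spirality measure. This is not the released Ganalyzer code; a synthetic logarithmic spiral arm stands in for a galaxy image, and the winding coefficient 0.4 is an arbitrary choice:

```python
import math

def brightness(r, theta):
    """Synthetic image: a bright arm whose angle follows a log spiral."""
    arm_theta = 0.4 * math.log(r)                 # assumed winding coefficient
    d = math.atan2(math.sin(theta - arm_theta), math.cos(theta - arm_theta))
    return math.exp(-(d ** 2) / 0.1)

def peak_angles(radii, n_angles=720):
    """For each radius, the angle of maximum intensity on that circle."""
    peaks = []
    for r in radii:
        angles = [2 * math.pi * k / n_angles for k in range(n_angles)]
        peaks.append(max(angles, key=lambda t: brightness(r, t)))
    return peaks

radii = [5, 10, 20, 40]
peaks = peak_angles(radii)
# Peak angle drifts with log-radius for a spiral; it would stay flat for an
# elliptical galaxy, whose isophotes have no preferred winding.
slope = (peaks[-1] - peaks[0]) / (math.log(radii[-1]) - math.log(radii[0]))
print(round(slope, 2))  # recovers a value close to the 0.4 used above
```

The real tool additionally separates galaxy from background pixels, finds the galaxy center and radius, and classifies morphology from the measured spirality.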

  8. Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment

    NASA Technical Reports Server (NTRS)

    Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.

    2007-01-01

    Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific data sources and for placing multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, together with a user-friendly library of model output analysis routines that can be called from any language that supports C bindings. The CCMC is also developing data interpolation tools that present model output in the same format as observations. The CCMC invites community comments and suggestions to better address the science needs of an integrated data environment.
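The interpolation capability mentioned in the abstract, presenting model output in the same format as observations, amounts to resampling gridded model output onto observation sample points and emitting records in the observation layout. The sketch below illustrates the idea for a 1-D time series; the function name and record fields are hypothetical, not the CCMC's actual API or formats.

```python
import numpy as np

def model_as_observations(model_times, model_values, obs_times,
                          obs_format=("time", "value")):
    """Interpolate model output onto observation timestamps so model and
    data can be compared sample-for-sample (hypothetical helper; real
    CCMC tools handle multidimensional grids and richer metadata)."""
    interpolated = np.interp(obs_times, model_times, model_values)
    # emit one record per observation, mirroring the observation layout
    return [dict(zip(obs_format, rec)) for rec in zip(obs_times, interpolated)]
```

For example, a model sampled at t = 0 and t = 10 can be evaluated at an observation taken at t = 5 and returned in the same record shape as the observation itself.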

  9. Proactive malware detection

    NASA Astrophysics Data System (ADS)

    Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty

    2014-06-01

    Small-to-medium sized businesses lack the resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, yet evidence shows that these businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach to detecting emerging malware threats, using open-source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills were combined to track adversarial behavior, methods, and techniques. We established a controlled (separated-domain) network to identify, monitor, and track malware behavior and thereby increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under, or has been, attacked. When malware was discovered, we analyzed and reverse-engineered it to determine how it could be detected and prevented. Results have shown that with minimal resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
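One simple form of the behavioral monitoring described above is to flag metric samples (e.g., CPU load or outbound connection counts) that deviate sharply from a rolling baseline. The sketch below is a minimal illustration of that idea with arbitrary window and threshold choices; it is not the authors' toolset.

```python
import statistics

def anomaly_flags(samples, window=20, z_thresh=3.0):
    """Flag samples that deviate strongly from a rolling baseline of the
    preceding `window` values -- a minimal sketch of behavior-based
    detection; window size and z-score threshold are illustrative."""
    flags = []
    for i, x in enumerate(samples):
        baseline = samples[max(0, i - window):i]
        if len(baseline) < 5:
            flags.append(False)      # not enough history to judge
            continue
        mu = statistics.fmean(baseline)
        sd = statistics.pstdev(baseline) or 1e-9  # guard flat baselines
        flags.append(abs(x - mu) / sd > z_thresh)
    return flags
```

In practice such flags would be one signal among several (process lists, file hashes, network captures) feeding the kind of open-source-intelligence correlation the abstract describes.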

  10. Measuring infrastructure: A key step in program evaluation and planning

    PubMed Central

    Schmitt, Carol L.; Glasgow, LaShawn; Lavinghouze, S. Rene; Rieker, Patricia P.; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-01-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General’s call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model’s utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. PMID:27037655

  11. Measuring infrastructure: A key step in program evaluation and planning.

    PubMed

    Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd

    2016-06-01

    State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Simplified Metadata Curation via the Metadata Management Tool

    NASA Astrophysics Data System (ADS)

    Shum, D.; Pilone, D.

    2015-12-01

    The Metadata Management Tool (MMT) is the newest capability developed as part of NASA Earth Observing System Data and Information System's (EOSDIS) efforts to simplify metadata creation and improve metadata quality. The MMT was developed via an agile methodology, taking into account inputs from GCMD's science coordinators and other end-users. In its initial release, the MMT uses the Unified Metadata Model for Collections (UMM-C) to allow metadata providers to easily create and update collection records in the ISO-19115 format. Through a simplified UI experience, metadata curators can create and edit collections without full knowledge of the NASA Best Practices implementation of the ISO-19115 format, while still generating compliant metadata. More experienced users are also able to access raw metadata to build more complex records as needed. In future releases, the MMT will build upon recent work done in the community to assess metadata quality and compliance with a variety of standards through the application of metadata rubrics. The tool will provide users with clear guidance on how to easily change their metadata in order to improve its quality and compliance. Through these features, the MMT allows data providers to create and maintain compliant, high-quality metadata in a short amount of time.
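The rubric-based quality checks planned for the MMT can be illustrated with a tiny example: each rule inspects a collection record and reports actionable findings the curator can fix. The field names and rules below are hypothetical simplifications, not the actual UMM-C or ISO-19115 requirements.

```python
def check_collection(record):
    """Apply a tiny metadata rubric to a collection record and return a
    list of human-readable findings (hypothetical fields and rules; the
    real UMM-C/ISO-19115 schemas define many more checks)."""
    findings = []
    for field in ("ShortName", "Abstract", "TemporalExtent"):
        if not record.get(field):
            findings.append(f"missing required field: {field}")
    abstract = record.get("Abstract", "")
    if abstract and len(abstract) < 50:
        findings.append("Abstract too short to be informative")
    return findings
```

An empty findings list would mean the record passes this (toy) rubric; a real tool would map each finding to guidance on how to repair the record.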

  13. An attempt to implement tools to support examination of community-based activities for disaster mitigation: A case study in Toyokawa city, Japan

    NASA Astrophysics Data System (ADS)

    Karashima, Kazuki; Ohgai, Akira

    2017-10-01

    Japan is a country with a high risk of earthquake disasters. Measures that promote the seismic safety of structures, such as reconstruction and the widening of narrow roads, and the capacity to respond to major earthquakes are both important, so techniques that support the examination of countermeasures are required. To improve this capability, the authors developed tools to: (1) evaluate fire-spread risk, (2) evaluate the difficulty of emergency response and evacuation, and (3) evaluate the capacities of neighborhood communities for disaster mitigation. The usefulness of the tools was demonstrated in experiments in previous studies. The next step was to implement the tools in community-based activities for disaster mitigation. This study aimed to clarify the usability of the tools, and the problems of implementing them, in such activities. The tools were used at several workshops in actual community-based disaster mitigation activities over one year. After the last workshop, interviews and a questionnaire were conducted with municipal staff and consultant staff. The results showed that the tools visually conveyed the fire-spread risk, the difficulty of evacuation under current conditions and after improvements, and the effects of each disaster mitigation activity. Users could easily explore draft plans for promoting the seismic safety of urban structures and response capabilities, and the tools were positively incorporated into some community-based activities. Thus, the tools show promise for continued use in community-based activities, which should promote their wider implementation.

  14. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological, and remote sensing datasets be assimilated, processed, and interpreted. The science community has developed multiple enabling software capabilities for process understanding, including information models (ODM2), reactive transport modeling (PFLOTRAN, MODFLOW, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (VisIt, ParaView, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis, and so on. These capabilities use data collected with sensors and analytical tools from multiple manufacturers, which produce many different kinds of measurements. While scientists naturally carry tools, capabilities, and lessons learned from one site to other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring be done in a near-real-time, affordable, auditable, and essentially autonomous manner. To this end, we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This framework, the Predictive Assimilation Framework (PAF), is multitenant and automates the ingestion, processing, and visualization of hydrological, geochemical, and geophysical (ERT/DTS) data. PAF is organized around projects and users: the capabilities available to a user are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components.
PAF is fully integrated with newly developed autonomous electrical geophysical hardware; it therefore automates the ingestion and processing of electrical geophysical data and enables co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.
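The project/user organization described above, with capabilities gated by both available data and access permissions, can be sketched as a simple intersection check. All names and the data model below are hypothetical; PAF's actual API is not reproduced here.

```python
def accessible_capabilities(user_roles, project_data, capabilities):
    """Return the capabilities a user may invoke in a project: a
    capability's data requirements must be covered by the project's
    available data, and the user must hold an allowed role
    (hypothetical data model, illustrating the gating rule only)."""
    return [
        cap["name"]
        for cap in capabilities
        if cap["requires"] <= project_data       # data prerequisites met
        and cap["allowed_roles"] & user_roles    # user holds an allowed role
    ]
```

Because every capability passes through the same check, adding a new component to such a framework only requires declaring its data requirements and allowed roles.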

  15. Future directions in high-pressure neutron diffraction

    NASA Astrophysics Data System (ADS)

    Guthrie, M.

    2015-04-01

    The ability to manipulate structure and properties using pressure has been well known for many centuries. Diffraction provides the unique ability to observe these structural changes in fine detail on lengthscales spanning atomic to nanometre dimensions. Amongst the broad suite of diffraction tools available today, neutrons provide unique capabilities of fundamental importance. However, to date, the growth of neutron diffraction under extremes of pressure has been limited by the weakness of available sources. In recent years, substantial government investments have led to the construction of a new generation of neutron sources while existing facilities have been revitalized by upgrades. The timely convergence of these bright facilities with new pressure-cell technologies suggests that the field of high-pressure (HP) neutron science is on the cusp of substantial growth. Here, the history of HP neutron research is examined with the hope of gleaning an accurate prediction of where some of these revolutionary capabilities will lead in the near future. In particular, a dramatic expansion of current pressure-temperature range is likely, with corresponding increased scope for extreme-conditions science with neutron diffraction. This increase in coverage will be matched with improvements in data quality. Furthermore, we can also expect broad new capabilities beyond diffraction, including in neutron imaging, small angle scattering and inelastic spectroscopy.

  16. Composite Materials NDE Using Enhanced Leaky Lamb Wave Dispersion Data Acquisition Method

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Mal, Ajit; Lih, Shyh-Shiuh; Chang, Zensheu

    1999-01-01

    The leaky Lamb wave (LLW) technique is approaching a maturity level that is making it an attractive quantitative NDE tool for composites and bonded joints. Since it was first observed in 1982, the phenomenon has been studied extensively, particularly in composite materials. The wave is induced by oblique insonification using a pitch-catch arrangement and the plate wave modes are detected by identifying minima in the reflected spectra to obtain the dispersion data. The wave behavior in multi-orientation laminates has been well documented and corroborated experimentally with high accuracy. The sensitivity of the wave to the elastic constants of the material and to the boundary conditions led to the capability to measure the elastic properties of bonded joints. Recently, the authors significantly enhanced the LLW method's capability by increasing the speed of the data acquisition, the number of modes that can be identified and the accuracy of the data inversion. In spite of the theoretical and experimental progress, methods that employ oblique insonification of composites are still not being applied as standard industrial NDE methods. The authors investigated the issues that are hampering the transition of the LLW to industrial applications and identified 4 key issues. The current capability of the method and the nature of these issues are described in this paper.

  17. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  18. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  19. Sight Application Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G.

    2014-09-17

    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  20. Demonstration of the Capabilities of the KINEROS2 – AGWA 3.0 Suite of Modeling Tools

    EPA Science Inventory

    This poster and computer demonstration illustrates a sampling of the wide range of applications that are possible using the KINEROS2 - AGWA suite of modeling tools. Applications include: 1) Incorporation of Low Impact Development (LID) features; 2) A real-time flash flood forecas...
