Sample records for remote visual testing

  1. Remote Visualization and Remote Collaboration On Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    A new technology has been developed for remote visualization that provides remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as fluid dynamics simulations or measurements). Based on this technology, some World Wide Web sites on the Internet are providing fluid dynamics data for educational or testing purposes. This technology is also being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics and wind tunnel testing. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit).

  2. Remote Infrared Thermography for In-Flight Flow Diagnostics

    NASA Technical Reports Server (NTRS)

    Shiu, H. J.; vanDam, C. P.

    1999-01-01

    The feasibility of remote in-flight boundary layer visualization via infrared in incompressible flow was established in earlier flight experiments. The past year's efforts focused on refining and determining the extent and accuracy of this technique of remote in-flight flow visualization via infrared. Investigations were made into flow separation visualization, visualization at transonic conditions, shock visualization, post-processing to mitigate banding noise in the NITE Hawk's thermograms, and a numeric model to predict surface temperature distributions. Although further flight tests are recommended, this technique continues to be promising.

  3. Earth orbital teleoperator visual system evaluation program

    NASA Technical Reports Server (NTRS)

    Shields, N. L., Jr.; Kirkpatrick, M., III; Frederick, P. N.; Malone, T. B.

    1975-01-01

    Empirical tests of range estimation accuracy and resolution, via television, under monoptic and stereoptic viewing conditions are discussed. Test data are used to derive man-machine interface requirements and make design decisions for an orbital remote manipulator system. Remote manipulator system visual tasks are given and the effects of system parameters on these tasks are evaluated.

  4. Adaptive strategies of remote systems operators exposed to perturbed camera-viewing conditions

    NASA Technical Reports Server (NTRS)

    Stuart, Mark A.; Manahan, Meera K.; Bierschwale, John M.; Sampaio, Carlos E.; Legendre, A. J.

    1991-01-01

    This report describes a preliminary investigation of the use of perturbed visual feedback during the performance of simulated space-based remote manipulation tasks. The primary objective of this NASA evaluation was to determine to what extent operators exhibit adaptive strategies which allow them to perform these specific types of remote manipulation tasks more efficiently while exposed to perturbed visual feedback. A secondary objective of this evaluation was to establish a set of preliminary guidelines for enhancing remote manipulation performance and reducing the adverse effects. These objectives were accomplished by studying the remote manipulator performance of test subjects exposed to various perturbed camera-viewing conditions while performing a simulated space-based remote manipulation task. Statistical analysis of performance and subjective data revealed that remote manipulation performance was adversely affected by the use of perturbed visual feedback and performance tended to improve with successive trials in most perturbed viewing conditions.

  5. High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL.

    PubMed

    Stone, John E; Messmer, Peter; Sisneros, Robert; Schulten, Klaus

    2016-05-01

    Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications.

  6. High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL

    PubMed Central

    Stone, John E.; Messmer, Peter; Sisneros, Robert; Schulten, Klaus

    2016-01-01

    Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications. PMID:27747137

  7. Visual-motor integration, visual perception, and fine motor coordination in a population of children with high levels of Fetal Alcohol Spectrum Disorder.

    PubMed

    Doney, Robyn; Lucas, Barbara R; Watkins, Rochelle E; Tsang, Tracey W; Sauer, Kay; Howat, Peter; Latimer, Jane; Fitzpatrick, James P; Oscar, June; Carter, Maureen; Elliott, Elizabeth J

    2016-08-01

    Visual-motor integration (VMI) skills are essential for successful academic performance, but to date no studies have assessed these skills in a population-based cohort of Australian Aboriginal children who, like many children in other remote, disadvantaged communities, consistently underperform academically. Furthermore, many children in remote areas of Australia have prenatal alcohol exposure (PAE) and Fetal Alcohol Spectrum Disorder (FASD), which are often associated with VMI deficits. VMI, visual perception, and fine motor coordination were assessed using The Beery-Buktenica Developmental Test of Visual-Motor Integration, including its associated subtests of Visual Perception and Fine Motor Coordination, in a cohort of predominantly Australian Aboriginal children (7.5-9.6 years, n=108) in remote Western Australia to explore whether PAE adversely affected test performance. Cohort results were reported, and comparisons made between children i) without PAE; ii) with PAE (no FASD); and iii) FASD. The prevalence of moderate (≤16th percentile) and severe (≤2nd percentile) impairment was established. Mean VMI scores were 'below average' (M=87.8±9.6), and visual perception scores were 'average' (M=97.6±12.5), with no differences between groups. Few children had severe VMI impairment (1.9%), but moderate impairment rates were high (47.2%). Children with FASD had significantly lower fine motor coordination scores and higher moderate impairment rates (M=87.9±12.5; 66.7%) than children without PAE (M=95.1±10.7; 23.3%) and PAE (no FASD) (M=96.1±10.9; 15.4%). Aboriginal children living in remote Western Australia have poor VMI skills regardless of PAE or FASD. Children with FASD additionally had fine motor coordination problems. VMI and fine motor coordination should be assessed in children with PAE, and included in FASD diagnostic assessments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The effects of spatially displaced visual feedback on remote manipulator performance

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Stuart, Mark A.

    1989-01-01

    The effects of spatially displaced visual feedback on the operation of a camera-viewed remote manipulation task are analyzed. A remote manipulation task is performed by operators exposed to the following different viewing conditions: direct view of the work site; normal camera view; reversed camera view; inverted/reversed camera view; and inverted camera view. The task completion performance times are statistically analyzed with a repeated measures analysis of variance, and a Newman-Keuls pairwise comparison test is administered to the data. The reversed camera view is ranked third out of four camera viewing conditions, while the normal viewing condition is found significantly slower than the direct viewing condition. It is shown that generalizations to remote manipulation applications based upon the results of direct manipulation studies are quite useful, but they should be made cautiously.

  9. Applications of Sentinel-2 data for agriculture and forest monitoring using the absolute difference (ZABUD) index derived from the AgroEye software (ESA)

    NASA Astrophysics Data System (ADS)

    de Kok, R.; Wężyk, P.; Papież, M.; Migo, L.

    2017-10-01

    To convince new users of the advantages of the Sentinel-2 sensor, a simplification of classic remote sensing tools makes it possible to create a platform of communication among domain specialists in agricultural analysis, visual image interpreters, and remote sensing programmers. An index value, known in the remote sensing user domain as "Zabud", was selected to represent, in color, the essentials of a time series analysis. The color index, used in a color atlas, offers a working platform for agricultural field control. This creates a database of test and training areas that enables rapid anomaly detection in the agricultural domain. The use cases and simplifications now function as an introduction to Sentinel-2 based remote sensing in an area that previously relied on VHR imagery and aerial data, serving mainly visual interpretation. The database extension with detected anomalies allows developers of open source software to design solutions for further agricultural control with remote sensing.
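
    The abstract names the index only as an absolute difference ("ZABUD") computed over a Sentinel-2 time series; its exact formulation is not given here. The following minimal numpy sketch therefore shows only the literal reading of a per-pixel absolute-difference index between two acquisitions of the same band; the band arrays and the anomaly threshold are hypothetical placeholders, not values from the study.

      import numpy as np

      # Hypothetical reflectance arrays for one Sentinel-2 band at two dates,
      # assumed co-registered on a common grid (random placeholders here).
      band_t1 = np.random.rand(512, 512).astype(np.float32)
      band_t2 = np.random.rand(512, 512).astype(np.float32)

      # Literal absolute-difference index between the two acquisitions.
      abs_diff = np.abs(band_t2 - band_t1)

      # Flag pixels whose change exceeds an assumed threshold as candidate anomalies
      # for field control; a real threshold would be calibrated on training areas.
      threshold = 0.2
      anomaly_mask = abs_diff > threshold
      print(f"{anomaly_mask.mean():.1%} of pixels flagged as candidate anomalies")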

  10. MEDSAT - A remote sensing satellite for malaria early warning and control

    NASA Technical Reports Server (NTRS)

    Vesecky, John; Slawski, James; Stottlemeyer, Bret; De La Sierra, Ruben; Daida, Jason; Wood, Byron; Lawless, James

    1992-01-01

    A remote sensing, medical satellite (MEDSAT) aids in the control of carrier (vector) borne disease. The prototype design is a light satellite to test for control of malaria. The design features a 340-kg satellite with visual/IR and SAR sensors in a low inclination orbit observing a number of worldwide test sites. The approach is to use four-band visual/IR and dual-polarized L-band SAR images obtained from MEDSAT in concert with in-situ data to estimate the temporal and spatial variations of malaria risk. This allows public health resources to focus on the most vulnerable areas at the appropriate time. It is concluded that a light-satellite design for a MEDSAT satellite with a Pegasus launch is feasible.

  11. Feasibility of a novel remote daily monitoring system for age-related macular degeneration using mobile handheld devices: results of a pilot study.

    PubMed

    Kaiser, Peter K; Wang, Yi-Zhong; He, Yu-Guang; Weisberger, Annemarie; Wolf, Stephane; Smith, Craig H

    2013-10-01

    This pilot study evaluated the feasibility of the Health Management Tool (HMT), a novel computing system using mobile handheld devices, to remotely monitor retinal visual function daily in patients with neovascular age-related macular degeneration treated with ranibizumab. Patients with neovascular age-related macular degeneration in at least 1 eye (newly diagnosed or successfully treated < 1 year) and eligible for ranibizumab therapy were enrolled in this 16-week, prospective, open-label, single-arm study. Patients performed a shape discrimination hyperacuity test (myVisionTrack [mVT]) daily on the HMT device (iPhone 3GS) remotely and at all clinic visits. Data entered into HMT devices were collected in the HMT database, which also sent reminders for patients to take mVT. Among 160 patients from 24 U.S. centers enrolled in the study (103 [64%] ≥ 75 years of age), 84.7% on average complied with daily mVT testing and ≈ 98.9% complied with at least weekly mVT testing. The HMT database successfully uploaded more than 17,000 mVT assessment values and sent more than 9,000 reminders. Elderly patients with neovascular age-related macular degeneration were willing and able to comply with daily self-testing of retinal visual function using mobile handheld devices in this novel system of remote vision monitoring.

  12. Design and Development of Functionally Operative and Visually Appealing Remote Firing Room Displays

    NASA Technical Reports Server (NTRS)

    Quaranto, Kristy

    2014-01-01

    This internship provided an opportunity for an intern to work with NASA's Ground Support Equipment (GSE) for the Spaceport Command and Control System (SCCS) at Kennedy Space Center as a remote display developer, under NASA mentor Kurt Leucht. The main focus was on creating remote displays for the hypergolic and high pressure helium subsystem team to help control the filling of the respective tanks. As a remote display developer for the GSE hypergolic and high pressure helium subsystem team, the intern was responsible for creating and testing graphical remote displays to be used in the Launch Control Center (LCC) on the Firing Room's computer monitors. To become more familiar with the subsystem, the individual attended multiple project meetings and acquired their specific requirements regarding what needed to be included in the remote displays. After receiving the requirements, the next step was to create a display that had both visual appeal and logical order using the Display Editor, on the Virtual Machine (VM). In doing so, all Compact Unique Identifiers (CUI), which are associated with specific components within the subsystem, needed to be included in each respective display for the system to run properly. Then, once the display was created, it needed to be tested to ensure that it ran as intended by using the Test Driver, also found on the VM. This Test Driver is a specific application that checks to make sure all the CUIs in the display are running properly and returning the correct form of information. After creating and locally testing the display, it needed to go through further testing and evaluation before being deemed suitable for actual use. By the end of the semester-long experience at NASA's Kennedy Space Center, the individual should have gained great knowledge and experience in various areas of display development and testing. They were able to demonstrate this new knowledge obtained by creating multiple successful remote displays that will one day be used by the hypergolic and high pressure helium subsystem team in one of the LCC's firing rooms to fill the new Orion spacecraft.

  13. Helicopter Visual Aid System

    NASA Technical Reports Server (NTRS)

    Baisley, R. L.

    1973-01-01

    The results of an evaluation of police helicopter effectiveness revealed a need for improved visual capability. A JPL program developed a method that would enhance visual observation capability for both day and night usage and demonstrated the feasibility of the adopted approach. This approach made use of remote pointable optics, a display screen, a slaved covert searchlight, and a coupled camera. The approach was proved feasible through field testing and by judgement against evaluation criteria.

  14. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets feature large volume, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the load. Recently, scientists have developed remote visualization techniques to address the computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built based on ParaView, which is one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support the deployment of the platform. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported in the platform: accessing remote datasets provided by OpenDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed at the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computation limitation of desktop-based visualization applications.
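
    Because the platform builds on ParaView's client/server model, the basic interaction pattern, a thin client driving data access and rendering on a remote pvserver, can be sketched with ParaView's Python scripting interface. This is a minimal illustration of that pattern only, not the paper's platform: the host name, port, and NetCDF path are assumptions, and a pvserver must already be running on the remote machine.

      # Minimal ParaView client script: connect to a remote pvserver and work there.
      # Run with ParaView's Python interpreter (pvpython); host/port/path are assumed.
      from paraview.simple import (Connect, OpenDataFile, Show, Render,
                                   GetActiveViewOrCreate, SaveScreenshot)

      Connect("viz-server.example.org", 11111)   # hypothetical remote server

      # Open a regular-grid NetCDF dataset that lives on the server side.
      reader = OpenDataFile("/data/climate/temperature_monthly.nc")

      view = GetActiveViewOrCreate("RenderView")
      Show(reader, view)   # filters run on the server; rendering location follows
      Render(view)         # ParaView's remote-render settings

      # Only the final image needs to travel back to the client.
      SaveScreenshot("temperature_preview.png", view)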

  15. Java-based browsing, visualization and processing of heterogeneous medical data from remote repositories.

    PubMed

    Masseroli, M; Bonacina, S; Pinciroli, F

    2004-01-01

    Current developments in distributed information technologies and Java programming enable their employment in the medical arena as well, to support the retrieval, integration, and evaluation of heterogeneous data and multimodal images in a web browser environment. With this aim, we used them to implement a client-server architecture based on software agents. The client side is a Java applet running in a web browser and providing a friendly medical user interface to browse and visualize different patient and medical test data, integrating them properly. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. Based on the Java Advanced Imaging API, processing and analysis tools were developed to support the evaluation of remotely retrieved bioimages through the quantification of their features in different regions of interest. The Java platform-independence allows the centralized management of the implemented prototype and its deployment to each site where an intranet or internet connection is available. By giving healthcare providers effective support for comprehensively browsing, visualizing and evaluating medical images and records located in different remote repositories, the developed prototype can represent an important aid in providing more efficient diagnoses and medical treatments.

  16. VIPER: Virtual Intelligent Planetary Exploration Rover

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Flueckiger, Lorenzo; Nguyen, Laurent; Washington, Richard

    2001-01-01

    Simulation and visualization of rover behavior are critical capabilities for scientists and rover operators to construct, test, and validate plans for commanding a remote rover. The VIPER system links these capabilities, using a high-fidelity virtual-reality (VR) environment, a kinematically accurate simulator, and a flexible plan executive to allow users to simulate and visualize possible execution outcomes of a plan under development. This work is part of a larger vision of a science-centered rover control environment, where a scientist may inspect and explore the environment via VR tools, specify science goals, and visualize the expected and actual behavior of the remote rover. The VIPER system is constructed from three generic systems, linked together via a minimal amount of customization into the integrated system. The complete system points out the power of combining plan execution, simulation, and visualization for envisioning rover behavior; it also demonstrates the utility of developing generic technologies, which can be combined in novel and useful ways.

  17. Bringing "Scientific Expeditions" Into the Schools

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as simulations or measurements of fluid dynamics). The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics (CFD) and wind tunnel testing. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: 1. The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. 2. The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). 3. A rich variety of guided expeditions through the data can be included easily. 4. A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. 5. The scenes can be viewed in 3D using stereo vision. 6. The network bandwidth used for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.)
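
    The idea shared by records 1 and 17, transmitting the raw grids, vectors, and scalars together with a small viewing script and regenerating the pixels on the viewer's workstation, can be illustrated with a short sketch. This is a generic illustration of the data-plus-script pattern, not FAST or the FASTexpedition format; the scalar field, the viewing-script keys, and the rendering choices are all assumptions.

      import io, json
      import numpy as np
      import matplotlib.pyplot as plt

      # --- "Sender" side: package raw data plus a small viewing script ---
      x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
      scalar_field = np.sin(4 * np.pi * x) * np.cos(4 * np.pi * y)   # placeholder data

      payload = io.BytesIO()
      np.savez_compressed(payload, scalars=scalar_field)             # raw data, not pixels
      viewing_script = json.dumps({"colormap": "viridis", "contours": 10,
                                   "title": "demo scalar field"})    # assumed script keys

      # The compressed raw data is typically far smaller than a stream of rendered frames.
      print(f"payload size: {payload.getbuffer().nbytes} bytes")

      # --- "Viewer" side: regenerate the pixels locally from data + script ---
      payload.seek(0)
      data = np.load(payload)
      script = json.loads(viewing_script)
      plt.contourf(data["scalars"], levels=script["contours"], cmap=script["colormap"])
      plt.title(script["title"])
      plt.savefig("local_render.png")   # pixels are produced on the viewer's machine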

  18. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features.

    PubMed

    Li, Linyi; Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. And a fuzzy classification method using visual attention features (FC-VAF) was developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the others according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images.

  19. Fuzzy Classification of High Resolution Remote Sensing Scenes Using Visual Attention Features

    PubMed Central

    Xu, Tingbao; Chen, Yun

    2017-01-01

    In recent years the spatial resolutions of remote sensing images have been improved greatly. However, a higher spatial resolution image does not always lead to a better result of automatic scene classification. Visual attention is an important characteristic of the human visual system, which can effectively help to classify remote sensing scenes. In this study, a novel visual attention feature extraction algorithm was proposed, which extracted visual attention features through a multiscale process. And a fuzzy classification method using visual attention features (FC-VAF) was developed to perform high resolution remote sensing scene classification. FC-VAF was evaluated by using remote sensing scenes from widely used high resolution remote sensing images, including IKONOS, QuickBird, and ZY-3 images. FC-VAF achieved more accurate classification results than the others according to the quantitative accuracy evaluation indices. We also discussed the role and impacts of different decomposition levels and different wavelets on the classification accuracy. FC-VAF improves the accuracy of high resolution scene classification and therefore advances the research of digital image analysis and the applications of high resolution remote sensing images. PMID:28761440

  20. The Effect of Retrieval Cues on Visual Preferences and Memory in Infancy: Evidence for a Four-Phase Attention Function.

    ERIC Educational Resources Information Center

    Bahrick, Lorraine E.; Hernandez-Reif, Maria; Pickens, Jeffrey N.

    1997-01-01

    Tested hypothesis from Bahrick and Pickens' infant attention model that retrieval cues increase memory accessibility and shift visual preferences toward greater novelty to resemble recent memories. Found that after retention intervals associated with remote or intermediate memory, previous familiarity preferences shifted to null or novelty…

  1. Accelerating Demand Paging for Local and Remote Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes a new algorithm that improves the performance of application-controlled demand paging for the out-of-core visualization of data sets that are on either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The new algorithm can be applied to many different visualization algorithms since application-controlled demand paging is not specific to any visualization algorithm. The paper includes measurements that show that the new multi-threaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by up to 60%. Visualization runs using data from remote disk ran about as fast as ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.
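
    The reported speedups rest on two ideas that are easy to prototype: overlap computation with page reads, and keep several reads in flight at once. The sketch below is a generic illustration of application-controlled demand paging with a prefetch pool, not the paper's implementation; the page size, reader count, lookahead window, and the page visitation order supplied by the visualization loop are assumptions.

      import numpy as np
      from concurrent.futures import ThreadPoolExecutor

      PAGE_SIZE = 1 << 20   # 1 MiB pages (assumed)
      N_READERS = 4         # parallel page reads (assumed)

      def read_page(path, page_id):
          """Read one fixed-size page from a local or remotely mounted file.
          Assumes the file length is a multiple of PAGE_SIZE."""
          with open(path, "rb") as f:
              f.seek(page_id * PAGE_SIZE)
              return np.frombuffer(f.read(PAGE_SIZE), dtype=np.float32)

      def visualize_out_of_core(path, page_order, lookahead=8):
          """Consume pages in the order the visualization needs them, prefetching up
          to `lookahead` future pages so disk/network reads overlap computation."""
          result = 0.0
          with ThreadPoolExecutor(max_workers=N_READERS) as pool:
              futures = {}
              for i, page_id in enumerate(page_order):
                  # Keep a window of outstanding reads ahead of the consumer.
                  for j in range(i, min(i + lookahead, len(page_order))):
                      pid = page_order[j]
                      if pid not in futures:
                          futures[pid] = pool.submit(read_page, path, pid)
                  page = futures.pop(page_id).result()   # blocks only if prefetch lags
                  result += float(page.sum())            # stand-in for the real rendering work
          return result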

  2. Running VisIt Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    VisIt features a robust remote visualization capability: VisIt can be started on a local machine and used to visualize data on a remote compute cluster. To enable remote visualization, the VisIt module must be loaded on the remote machine as part of this process.

  3. [Research on Three-dimensional Medical Image Reconstruction and Interaction Based on HTML5 and Visualization Toolkit].

    PubMed

    Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang

    2015-04-01

    By integrating the Visualization Toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and experimented on the feasibility of remote medical image reconstruction and interaction purely in the Web. We proposed a server-centric method that does not require downloading large medical datasets to the local client and avoids concerns about network transmission load and the three-dimensional (3D) rendering capability of client hardware. The method integrates remote medical image reconstruction and interaction into the Web seamlessly, making it applicable to lower-end computers and mobile devices. Finally, we tested this method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across Internet terminals and performance-limited devices, may be useful for remote medical assistance.
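
    The server-centric pattern described here, where reconstruction and rendering happen on the server and the browser only receives images and sends interaction events, can be sketched with VTK's offscreen rendering in Python. This is a hedged illustration of that pattern, not the paper's HTML5/VTK implementation: the cone source stands in for reconstructed medical geometry, and the in-memory PNG is what a WebSocket or HTTP handler would push to the browser.

      import vtk
      from vtk.util.numpy_support import vtk_to_numpy

      # Trivial scene; in the real system this would be the 3D reconstruction
      # (e.g., iso-surfaces or volume rendering of the medical dataset).
      source = vtk.vtkConeSource()
      mapper = vtk.vtkPolyDataMapper()
      mapper.SetInputConnection(source.GetOutputPort())
      actor = vtk.vtkActor()
      actor.SetMapper(mapper)

      renderer = vtk.vtkRenderer()
      renderer.AddActor(actor)

      render_window = vtk.vtkRenderWindow()
      render_window.SetOffScreenRendering(1)   # no windowing system needed on the server
      render_window.AddRenderer(renderer)
      render_window.SetSize(512, 512)
      render_window.Render()

      # Grab the framebuffer and encode it as PNG entirely in memory.
      grabber = vtk.vtkWindowToImageFilter()
      grabber.SetInput(render_window)
      grabber.Update()

      writer = vtk.vtkPNGWriter()
      writer.SetWriteToMemory(1)
      writer.SetInputConnection(grabber.GetOutputPort())
      writer.Write()

      png_bytes = vtk_to_numpy(writer.GetResult()).tobytes()   # bytes a web handler would send
      print(f"rendered frame: {len(png_bytes)} bytes of PNG")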

  4. UTM Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal

    2017-01-01

    Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in the low-altitude airspace. Use a build-a-little-test-a-little strategy, moving from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  5. UTM Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2016-01-01

    Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in the low-altitude airspace. Use a build-a-little-test-a-little strategy, moving from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  6. JPL Earth Science Center Visualization Multitouch Table

    NASA Astrophysics Data System (ADS)

    Kim, R.; Dodge, K.; Malhotra, S.; Chang, G.

    2014-12-01

    The JPL Earth Science Center Visualization Table is specialized software and hardware that allows multitouch, multiuser, and remote display control to create seamlessly integrated experiences for visualizing JPL missions and their remote sensing data. The software is fully GIS capable through time-aware OGC WMTS, using the Lunar Mapping and Modeling Portal as the GIS backend to continuously ingest and retrieve real-time remote sensing data and satellite location data. The 55-inch and 82-inch unlimited-finger-count multitouch displays allow multiple users to explore JPL Earth missions and visualize remote sensing data through a very intuitive and interactive touch graphical user interface. To improve the integrated experience, the Earth Science Center Visualization Table team developed network streaming, which allows the table software to stream data visualizations to a nearby remote display through a computer network. The purpose of this visualization/presentation tool is not only to support Earth science operations; it is specifically designed for education and public outreach and will contribute significantly to STEM. Our presentation will include an overview of our software and hardware, and a showcase of our system.

  7. VHP - An environment for the remote visualization of heuristic processes

    NASA Technical Reports Server (NTRS)

    Crawford, Stuart L.; Leiner, Barry M.

    1991-01-01

    A software system called VHP is introduced which permits the visualization of heuristic algorithms on both resident and remote hardware platforms. The VHP is based on the DCF tool for interprocess communication and is applicable to remote algorithms that can run on different types of hardware and in languages other than that of VHP. The VHP system is of particular interest for systems in which the visualization of remote processes is required, such as robotics for telescience applications.

  8. Development of systems and techniques for landing an aircraft using onboard television

    NASA Technical Reports Server (NTRS)

    Gee, S. W.; Carr, P. C.; Winter, W. R.; Manke, J. A.

    1978-01-01

    A flight program was conducted to develop a landing technique with which a pilot could consistently and safely land a remotely piloted research vehicle (RPRV) without outside visual reference except through television. Otherwise, instrumentation was standard. Such factors as the selection of video parameters, the pilot's understanding of the television presentation, the pilot's ground cockpit environment, and the operational procedures for landing were considered. About 30 landings were necessary for a pilot to become sufficiently familiar and competent with the test aircraft to make powered approaches and landings with outside visual references only through television. When steep approaches and landings were made by remote control, the pilot's workload was extremely high. The test aircraft was used as a simulator for the F-15 RPRV, and as such was considered to be essential to the success of landing the F-15 RPRV.

  9. Unmanned Aircraft Systems Traffic Management (UTM) Safely Enabling UAS Operations in Low-Altitude Airspace

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2017-01-01

    Conduct research, development, and testing to identify airspace operations requirements to enable large-scale visual and beyond-visual-line-of-sight UAS operations in the low-altitude airspace. Use a build-a-little-test-a-little strategy, moving from remote areas to urban areas. Low density: no traffic management required, but an understanding of airspace constraints. Cooperative traffic management: understanding of airspace constraints and other operations. Manned and unmanned traffic management: scalable and heterogeneous operations. The UTM construct is consistent with the FAA's risk-based strategy. The UTM research platform is used for simulations and tests. UTM offers a path towards scalability.

  10. Hiding the Disk and Network Latency of Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes an algorithm that improves the performance of application-controlled demand paging for out-of-core visualization by hiding the latency of reading data from either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The paper includes measurements that show that the new multithreaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by two thirds. Visualization runs using data from remote disk actually ran faster than ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.

  11. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 4 : web-based bridge information database--visualization analytics and distributed sensing.

    DOT National Transportation Integrated Search

    2012-03-01

    This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...

  12. Development and Testing of Functionally Operative and Visually Appealing Remote Firing Room Displays and Applications

    NASA Technical Reports Server (NTRS)

    Quaranto, Kristy

    2014-01-01

    This internship provided an opportunity for an intern to work with NASA's Ground Support Equipment (GSE) for the Spaceport Command and Control System (SCCS) at Kennedy Space Center as a remote display developer, under NASA technical mentor Kurt Leucht. The main focus was on creating remote displays and applications for the hypergolic and high pressure helium subsystem team to help control the filling of the respective tanks. As a remote display and application developer for the GSE hypergolic and high pressure helium subsystem team, the intern was responsible for creating and testing graphical remote displays and applications to be used in the Launch Control Center (LCC) on the Firing Room's computers. To become more familiar with the subsystem, the individual attended multiple project meetings and acquired their specific requirements regarding what needed to be included in the software. After receiving the requirements for the displays, the next step was to create displays that had both visual appeal and logical order using the Display Editor, on the Virtual Machine (VM). In doing so, all Compact Unique Identifiers (CUI), which are associated with specific components within the subsystem, needed to be included in each respective display for the system to run properly. Then, once the display was created, it was tested to ensure that it ran as intended by using the Test Driver, also found on the VM. This Test Driver is a specific application that checks to make sure all the CUIs in the display are running properly and returning the correct form of information. After creating and locally testing the display, it needed to go through further testing and evaluation before being deemed suitable for actual use. For the remote applications, the intern was responsible for creating a project that focused on channelizing each component included in each display. The core of the application code was created by setting up spreadsheets and having an auto test generator generate the complete code structure. This application code was then loaded and run in a testing environment to ensure the code ran as anticipated. By the end of the semester-long experience at NASA's Kennedy Space Center, the individual should have gained great knowledge and experience in various areas of both display and application development and testing. They were able to demonstrate this new knowledge obtained by creating multiple successful remote displays that will one day be used by the hypergolic and high pressure helium subsystem team in the LCC's firing rooms to service the new Orion spacecraft. The completed display channelization application will be used to receive verification from NASA quality engineers.

  13. High-resolution remotely sensed small target detection by imitating fly visual perception mechanism.

    PubMed

    Huang, Fengchen; Xu, Lizhong; Li, Min; Tang, Min

    2012-01-01

    The difficulties and limitations of small target detection methods for high-resolution remote sensing data have made this a recent research hot spot. Inspired by the information capture and processing theory of the fly visual system, this paper endeavors to construct a characterized model of information perception and to exploit its advantages for fast and accurate small target detection in complex, varied natural environments. The proposed model forms a theoretical basis of small target detection for high-resolution remote sensing data. After comparing the prevailing simulation mechanisms behind fly visual systems, we propose a fly-imitated visual system method of information processing for high-resolution remote sensing data. A small target detector and corresponding detection algorithm are designed by simulating the mechanisms of information acquisition, compression, and fusion of the fly visual system, the function of the pool cell, and the characteristic of nonlinear self-adaptation. Experiments verify the feasibility and rationality of the proposed small target detection model and the fly-imitated visual perception method.

  14. Twelfth Annual Conference on Manual Control

    NASA Technical Reports Server (NTRS)

    Wempe, T. E.

    1976-01-01

    Main topics discussed cover multi-task decision making, attention allocation and workload measurement, displays and controls, nonvisual displays, tracking and other psychomotor tasks, automobile driving, handling qualities and pilot ratings, remote manipulation, system identification, control models, and motion and visual cues. Sixty-five papers are included with presentations on results of analytical studies to develop and evaluate human operator models for a range of control task, vehicle dynamics and display situations; results of tests of physiological control systems and applications to medical problems; and on results of simulator and flight tests to determine display, control and dynamics effects on operator performance and workload for aircraft, automobile, and remote control systems.

  15. Accessing and Visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, Bruce G.; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids. These tools do one or more of the following tasks: visualize local data sets for local users, visualize local data sets for remote users, and access and visualize remote data sets. The tools are used for various types of data, including remotely sensed image data, digital elevation models, astronomical surveys, etc. The paper attempts to pull some common elements out of these tools that may be useful for others who have to work with similarly large data sets.

  16. Do reference surfaces influence exocentric pointing?

    PubMed

    Doumen, M J A; Kappers, A M L; Koenderink, J J

    2008-06-01

    All elements of the visual field are known to influence the perception of the egocentric distances of objects. Not only the ground surface of a scene, but also the surface at the back or other objects in the scene can affect an observer's egocentric distance estimation of an object. We tested whether this is also true for exocentric direction estimations. We used an exocentric pointing task to test whether the presence of poster-boards in the visual scene would influence the perception of the exocentric direction between two test-objects. In this task the observer has to direct a pointer, with a remote control, to a target. We placed the poster-boards at various positions in the visual field to test whether these boards would affect the settings of the observer. We found that they only affected the settings when they directly served as a reference for orienting the pointer to the target.

  17. One-year-old fear memories rapidly activate human fusiform gyrus

    PubMed Central

    Pizzagalli, Diego A.

    2016-01-01

    Fast threat detection is crucial for survival. In line with such evolutionary pressure, threat-signaling fear-conditioned faces have been found to rapidly (<80 ms) activate visual brain regions including the fusiform gyrus on the conditioning day. Whether remotely fear conditioned stimuli (CS) evoke similar early processing enhancements is unknown. Here, 16 participants who underwent a differential face fear-conditioning and extinction procedure on day 1 were presented the initial CS 24 h after conditioning (Recent Recall Test) as well as 9-17 months later (Remote Recall Test) while EEG was recorded. Using a data-driven segmentation procedure of CS evoked event-related potentials, five distinct microstates were identified for both the recent and the remote memory test. To probe intracranial activity, EEG activity within each microstate was localized using low resolution electromagnetic tomography analysis (LORETA). In both the recent (41–55 and 150–191 ms) and remote (45–90 ms) recall tests, fear conditioned faces potentiated rapid activation in proximity of fusiform gyrus, even in participants unaware of the contingencies. These findings suggest that rapid processing enhancements of conditioned faces persist over time. PMID:26416784

  18. Remote manual operator for space station intermodule ventilation valve

    NASA Technical Reports Server (NTRS)

    Guyaux, James R.

    1996-01-01

    The Remote Manual Operator (RMO) is a mechanism used for manual operation of the Space Station Intermodule Ventilation (IMV) valve and for visual indication of valve position. The IMV is a butterfly-type valve, located in the ventilation or air circulation ducts of the Space Station, and is used to interconnect or isolate the various compartments. The IMV valve is normally operated by an electric motor-driven actuator under computer or astronaut control, but it can also be operated manually with the RMO. The IMV valve RMO consists of a handle with a deployment linkage, a gear-driven flexible shaft, and a linkage to disengage the electric motor actuator during manual operation. It also provides visual indication of valve position. The IMV valve RMO is currently being prepared for qualification testing.

  19. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 3 : use of scanning LiDAR in structural evaluation of bridges.

    DOT National Transportation Integrated Search

    2009-12-01

    This volume introduces several applications of remote bridge inspection technologies studied in this Integrated Remote Sensing and Visualization (IRSV) study using ground-based LiDAR systems. In particular, the application of terrestrial LiDAR fo...

  20. D3: A Collaborative Infrastructure for Aerospace Design

    NASA Technical Reports Server (NTRS)

    Walton, Joan; Filman, Robert E.; Knight, Chris; Korsmeyer, David J.; Lee, Diana D.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    DARWIN is a NASA-developed, Internet-based system for enabling aerospace researchers to securely and remotely access and collaborate on the analysis of aerospace vehicle design data, primarily the results of wind-tunnel testing and numeric (e.g., computational fluid dynamics) model executions. DARWIN captures, stores, and indexes data, manages derived knowledge (such as visualizations across multiple data sets), and provides an environment for designers to collaborate in the analysis of the results of testing. DARWIN is an interesting application because it supports high volumes of data, integrates multiple modalities of data display (e.g., images and data visualizations), and provides non-trivial access control mechanisms. DARWIN enables collaboration by allowing users to share not only visualizations of data, but also commentary about and views of the data.

  1. KAMEDIN: a telemedicine system for computer supported cooperative work and remote image analysis in radiology.

    PubMed

    Handels, H; Busch, C; Encarnação, J; Hahn, C; Kühn, V; Miehe, J; Pöppl, S I; Rinast, E; Rossmanith, C; Seibert, F; Will, A

    1997-03-01

    The software system KAMEDIN (Kooperatives Arbeiten und MEdizinische Diagnostik auf Innovativen Netzen) is a multimedia telemedicine system for exchange, cooperative diagnostics, and remote analysis of digital medical image data. It provides components for visualisation, processing, and synchronised audio-visual discussion of medical images. Techniques of computer supported cooperative work (CSCW) synchronise user interactions during a teleconference. Visibility of both local and remote cursor on the conference workstations facilitates telepointing and reinforces the conference partner's telepresence. Audio communication during teleconferences is supported by an integrated audio component. Furthermore, brain tissue segmentation with artificial neural networks can be performed on an external supercomputer as a remote image analysis procedure. KAMEDIN is designed as a low-cost CSCW tool for ISDN-based telecommunication. However, it can be used on any TCP/IP supporting network. In a field test, KAMEDIN was installed in 15 clinics and medical departments to validate the system's usability. The telemedicine system KAMEDIN has been developed, tested, and evaluated within a research project sponsored by German Telekom.

  2. Remote Sensing Data Visualization, Fusion and Analysis via Giovanni

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Zubko, V.; Gopalan, A.; Khayat, M.

    2007-01-01

    We describe Giovanni, the NASA Goddard-developed online visualization and analysis tool that allows users to explore various phenomena without learning remote sensing data formats or downloading voluminous data. Using MODIS aerosol data as an example, we formulate an approach to data fusion for Giovanni to further enrich online multi-sensor remote sensing data comparison and analysis.

  3. Bayesian evaluation of clinical diagnostic test characteristics of visual observations and remote monitoring to diagnose bovine respiratory disease in beef calves.

    PubMed

    White, Brad J; Goehl, Dan R; Amrine, David E; Booker, Calvin; Wildman, Brian; Perrett, Tye

    2016-04-01

    Accurate diagnosis of bovine respiratory disease (BRD) in beef cattle is a critical facet of therapeutic programs through promotion of prompt treatment of diseased calves in concert with judicious use of antimicrobials. Despite the known inaccuracies, visual observation (VO) of clinical signs is the conventional diagnostic modality for BRD diagnosis. Objective methods of remotely monitoring cattle wellness could improve diagnostic accuracy; however, little information exists describing the accuracy of this method compared to traditional techniques. The objective of this research is to employ Bayesian methodology to elicit diagnostic characteristics of conventional VO compared to remote early disease identification (REDI) to diagnose BRD. Data from previous literature on the accuracy of VO were combined with trial data consisting of direct comparison between VO and REDI for BRD in two populations. No true gold standard diagnostic test exists for BRD; therefore, estimates of diagnostic characteristics of each test were generated using Bayesian latent class analysis. Results indicate a 90.0% probability that the sensitivity of REDI (median 81.3%; 95% probability interval [PI]: 55.5, 95.8) was higher than VO sensitivity (64.5%; PI: 57.9, 70.8). The specificity of REDI (median 92.9%; PI: 88.2, 96.9) was also higher compared to VO (median 69.1%; PI: 66.3, 71.8). The differences in sensitivity and specificity resulted in REDI exhibiting higher positive and negative predictive values in both high (41.3%) and low (2.6%) prevalence situations. This research illustrates the potential of remote cattle monitoring to augment conventional methods of BRD diagnosis resulting in more accurate identification of diseased cattle. Copyright © 2016 Elsevier B.V. All rights reserved.
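
    The methodological core here is estimating sensitivity and specificity of two imperfect tests (VO and REDI) across two populations with no gold standard, via Bayesian latent class analysis. A standard way to write this is the Hui-Walter two-test, two-population model under conditional independence; the PyMC sketch below shows that generic model with hypothetical cross-classified counts and priors, and is not the authors' exact model or data.

      import numpy as np
      import pymc as pm
      import arviz as az

      # Hypothetical counts of calves cross-classified by the two tests in two herds
      # (populations assumed to differ in BRD prevalence). Cell order per row:
      # [VO+ REDI+, VO+ REDI-, VO- REDI+, VO- REDI-]
      counts = np.array([[45, 20, 30, 205],
                         [ 6,  9,  4, 381]])

      with pm.Model() as hui_walter:
          # Weakly informative priors nudging Se/Sp above 0.5 help identifiability.
          se = pm.Beta("se", alpha=2, beta=1, shape=2)      # sensitivity of [VO, REDI]
          sp = pm.Beta("sp", alpha=2, beta=1, shape=2)      # specificity of [VO, REDI]
          prev = pm.Beta("prev", alpha=1, beta=1, shape=2)  # latent prevalence per herd

          for k in range(2):
              p = prev[k]
              # Cell probabilities assuming conditional independence of the tests
              # given the true (latent) disease status.
              cells = pm.math.stack([
                  p * se[0] * se[1]             + (1 - p) * (1 - sp[0]) * (1 - sp[1]),
                  p * se[0] * (1 - se[1])       + (1 - p) * (1 - sp[0]) * sp[1],
                  p * (1 - se[0]) * se[1]       + (1 - p) * sp[0] * (1 - sp[1]),
                  p * (1 - se[0]) * (1 - se[1]) + (1 - p) * sp[0] * sp[1],
              ])
              pm.Multinomial(f"y_{k}", n=int(counts[k].sum()), p=cells, observed=counts[k])

          idata = pm.sample(2000, tune=1000, target_accept=0.9)

      # Posterior summaries for Se/Sp of each test and prevalence in each herd.
      print(az.summary(idata, var_names=["se", "sp", "prev"]))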

  4. Remote surface inspection system

    NASA Astrophysics Data System (ADS)

    Hayati, S.; Balaram, J.; Seraji, H.; Kim, W. S.; Tso, K.; Prasad, V.

    1993-02-01

    This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.

  5. Remote surface inspection system

    NASA Technical Reports Server (NTRS)

    Hayati, S.; Balaram, J.; Seraji, H.; Kim, W. S.; Tso, K.; Prasad, V.

    1993-01-01

    This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.

  6. Desktop Cloud Visualization: the new technology to remote access 3D interactive applications in the Cloud.

    PubMed

    Torterolo, Livia; Ruffino, Francesco

    2012-01-01

    In the proposed demonstration we will present DCV (Desktop Cloud Visualization): a unique technology that allows users to remotely access 2D and 3D interactive applications over a standard network. This allows geographically dispersed doctors to work collaboratively, to acquire anatomical or pathological images, and to visualize them for further investigation.

  7. The visual properties of proximal and remote distractors differentially influence reaching planning times: evidence from pro- and antipointing tasks.

    PubMed

    Heath, Matthew; DeSimone, Jesse C

    2016-11-01

    The saccade literature has consistently reported that the presentation of a distractor remote to a target increases reaction time (i.e., the remote distractor effect: RDE). As well, some studies have shown that a proximal distractor facilitates saccade reaction time. The lateral inhibition hypothesis attributes the aforementioned findings to the inhibition/facilitation of target selection mechanisms operating in the intermediate layers of the superior colliculus (SC). Although the impact of remote and proximal distractors has been extensively examined in the saccade literature, a paucity of work has examined whether such findings generalize to reaching responses, and to our knowledge, no work has directly contrasted reaching RTs for remote and proximal distractors. To that end, the present investigation had participants complete reaches in target only trials (i.e., TO) and when distractors were presented at "remote" (i.e., the opposite visual field) and "proximal" (i.e., the same visual field) locations along the same horizontal meridian as the target. As well, participants reached to the target's veridical (i.e., propointing) and mirror-symmetrical (i.e., antipointing) location. The basis for contrasting pro- and antipointing was to determine whether the distractor's visual- or motor-related activity influence reaching RTs. Results demonstrated that remote and proximal distractors, respectively, increased and decreased reaching RTs and the effect was consistent for pro- and antipointing. Accordingly, results evince that the RDE and the facilitatory effects of a proximal distractor are effector independent and provide behavioral support for the contention that the SC serves as a general target selection mechanism. As well, the comparable distractor-related effects for pro- and antipointing trials indicate that the visual properties of remote and proximal distractors respectively inhibit and facilitate target selection.

  8. Engineering education using a remote laboratory through the Internet

    NASA Astrophysics Data System (ADS)

    Axaopoulos, Petros J.; Moutsopoulos, Konstantinos N.; Theodoridis, Michael P.

    2012-03-01

    An experiment using real hardware under real test conditions can be conducted remotely by engineering students and other interested individuals anywhere in the world via the Internet, with the capability of live video streaming from the test site. This innovative experiment concerns the determination of the current-voltage characteristic curve of a photovoltaic panel installed on the roof of a laboratory, facing south and with the ability to alter its tilt angle using a closed-loop servo motor mounted on the horizontal axis of the panel. The user has the sense of direct contact with the system, since they can intervene to alter the tilt of the panel and get live visual feedback alongside the remote instrumentation panel. The whole procedure takes a few seconds to complete, and the characteristic curve is displayed in a chart, giving the student and anyone else interested the chance to analyse the results and understand the respective theory; meanwhile, the test data are stored in a file for future use. This type of remote experiment could be used for distance education, training, part-time study and to help students with disabilities to participate in a laboratory environment.
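
    As a rough illustration of the current-voltage curve that students trace in this remote experiment, the sketch below evaluates a simplified single-diode photovoltaic model in Python; all parameter values (photocurrent, saturation current, ideality factor, cell count, temperature) are illustrative assumptions rather than values from the remote laboratory.

        import numpy as np

        def iv_curve(v, i_ph=5.0, i_0=8e-8, n=1.3, n_cells=36, t_kelvin=298.15):
            """Simplified single-diode PV model (series and shunt resistance neglected).

            v        : terminal voltage array [V]
            i_ph     : photogenerated current [A]   (illustrative value)
            i_0      : diode saturation current [A] (illustrative value)
            n        : diode ideality factor
            n_cells  : number of series-connected cells
            t_kelvin : cell temperature [K]
            """
            k, q = 1.380649e-23, 1.602176634e-19
            v_t = n_cells * n * k * t_kelvin / q           # thermal voltage of the cell string
            i = i_ph - i_0 * (np.exp(v / v_t) - 1.0)       # single-diode equation
            return np.clip(i, 0.0, None)                   # keep only the generating quadrant

        if __name__ == "__main__":
            v = np.linspace(0.0, 23.0, 500)                # sweep the terminal voltage
            i = iv_curve(v)
            p = v * i
            print(f"approx. short-circuit current: {i[0]:.2f} A")
            print(f"approx. maximum power point: {p.max():.1f} W at {v[p.argmax()]:.1f} V")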

  9. A telescopic cinema sound camera for observing high altitude aerospace vehicles

    NASA Astrophysics Data System (ADS)

    Slater, Dan

    2014-09-01

    Rockets and other high altitude aerospace vehicles produce interesting visual and aural phenomena that can be remotely observed from long distances. This paper describes a compact, passive and covert remote sensing system that can produce high resolution sound movies at >100 km viewing distances. The telescopic high resolution camera is capable of resolving and quantifying space launch vehicle dynamics including plume formation, staging events and payload fairing jettison. Flight vehicles produce sounds and vibrations that modulate the local electromagnetic environment. These audio frequency modulations can be remotely sensed by passive optical and radio wave detectors. Acousto-optic sensing methods were primarily used, but an experimental radioacoustic sensor using passive micro-Doppler radar techniques was also tested. The synchronized combination of high resolution flight vehicle imagery with the associated vehicle sounds produces a cinema-like experience that is useful in both an aerospace engineering and a Hollywood film production context. Examples of visual, aural and radar observations of the first SpaceX Falcon 9 v1.1 rocket launch are shown and discussed.

  10. Mental visualization of objects from cross-sectional images

    PubMed Central

    Wu, Bing; Klatzky, Roberta L.; Stetten, George D.

    2011-01-01

    We extended the classic anorthoscopic viewing procedure to test a model of visualization of 3D structures from 2D cross-sections. Four experiments were conducted to examine key processes described in the model, localizing cross-sections within a common frame of reference and spatiotemporal integration of cross sections into a hierarchical object representation. Participants used a hand-held device to reveal a hidden object as a sequence of cross-sectional images. The process of localization was manipulated by contrasting two displays, in-situ vs. ex-situ, which differed in whether cross sections were presented at their source locations or displaced to a remote screen. The process of integration was manipulated by varying the structural complexity of target objects and their components. Experiments 1 and 2 demonstrated visualization of 2D and 3D line-segment objects and verified predictions about display and complexity effects. In Experiments 3 and 4, the visualized forms were familiar letters and numbers. Errors and orientation effects showed that displacing cross-sectional images to a remote display (ex-situ viewing) impeded the ability to determine spatial relationships among pattern components, a failure of integration at the object level. PMID:22217386

  11. Analysis and Selection of a Remote Docking Simulation Visual Display System

    NASA Technical Reports Server (NTRS)

    Shields, N., Jr.; Fagg, M. F.

    1984-01-01

    The development of a remote docking simulation visual display system is examined. Video system and operator performance are discussed as well as operator command and control requirements and a design analysis of the reconfigurable work station.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denslow, Kayte M.; Bontha, Jagannadha R.; Adkins, Harold E.

    This document presents the visual and ultrasonic PulseEcho critical velocity test results obtained from the System Performance test campaign that was completed in September 2012 with the Remote Sampler Demonstration (RSD)/Waste Feed Flow Loop cold-test platform located at the Monarch test facility in Pasco, Washington. This report is intended to complement and accompany the report that will be developed by WRPS on the design of the System Performance simulant matrix, the analysis of the slurry test sample concentration and particle size distribution (PSD) data, and the design and construction of the RSD/Waste Feed Flow Loop cold-test platform.

  13. Utilisation of Wearable Computing for Space Programmes Test Activities Optimisation

    NASA Astrophysics Data System (ADS)

    Basso, V.; Lazzari, D.; Alemanni, M.

    2004-08-01

    New technologies are assuming increasing importance in the space business domain, including Assembly, Integration and Test (AIT) activities, where they allow process optimizations and capabilities that were unthinkable only a few years ago. This paper describes the experience gained by Alenia Spazio (ALS) with remote interaction techniques through collaborations established under European Community (EC) initiatives and with Alenia Aeronautica (ALA) and Politecnico di Torino (POLITO). Increased H/W and S/W performance and reduced costs, driven by mass-market home computing (especially the games business), together with networking possibilities (offered by the web as well as high-speed links and wireless communications), make it possible today to rethink traditional AIT process activities in terms of multimedia data exchange: graphics, voice, video and surely more in the future. The aerospace business confirms its vocation for innovation: in the 1980s it was the cradle of CAD systems, and today it is oriented towards 3D data visualization/interaction and remote, collaborative visualization/interaction on a much more user-friendly basis (i.e., not only for specialists). Fig. 1 collects the extended AIT scenario studied and adopted by ALS in these years. ALS experimented with two kinds of remote visualization/interaction devices for both 2D/3D visualization and interaction: portable screens [e.g., Personal Digital Assistants (PDA) and wearables, Fig. 2] and wall screens (e.g., VR-Lab). These can support many types of traditional company-internal AIT applications (mainly based on EGSE and PDM/CAD utilisation and reports): (1) design review support; (2) facility management; (3) storage management; (4) personnel training; (5) definition of integration sequences; (6) follow-up of assembly and test operations; (7) documentation review; as well as external access to AIT activities for remote operations (e.g., tele-testing).

  14. Remote surface inspection system. [of large space platforms

    NASA Technical Reports Server (NTRS)

    Hayati, Samad; Balaram, J.; Seraji, Homayoun; Kim, Won S.; Tso, Kam S.

    1993-01-01

    This paper reports on an on-going research and development effort in remote surface inspection of space platforms such as the Space Station Freedom (SSF). It describes the space environment and identifies the types of damage for which to search. This paper provides an overview of the Remote Surface Inspection System that was developed to conduct proof-of-concept demonstrations and to perform experiments in a laboratory environment. Specifically, the paper describes three technology areas: (1) manipulator control for sensor placement; (2) automated non-contact inspection to detect and classify flaws; and (3) an operator interface to command the system interactively and receive raw or processed sensor data. Initial findings for the automated and human visual inspection tests are reported.

  15. Use of a Remote Eye-Tracker for the Analysis of Gaze during Treadmill Walking and Visual Stimuli Exposition.

    PubMed

    Serchi, V; Peruzzi, A; Cereatti, A; Della Croce, U

    2016-01-01

    The knowledge of the visual strategies adopted while walking in cognitively engaging environments is extremely valuable. Analyzing gaze when a treadmill and a virtual reality environment are used as motor rehabilitation tools is therefore critical. Being completely unobtrusive, remote eye-trackers are the most appropriate way to measure the point of gaze. Still, point-of-gaze measurements are affected by experimental conditions such as head range of motion and visual stimuli. This study assesses the usability limits and measurement reliability of a remote eye-tracker during treadmill walking while visual stimuli are projected. During treadmill walking, the head remained within the remote eye-tracker workspace. Generally, the quality of the point-of-gaze measurements declined as the distance from the remote eye-tracker increased, and data loss occurred for large gaze angles. The stimulus location (a dot-target) did not influence point-of-gaze accuracy, precision, and trackability during either standing or walking. Similar results were obtained when the dot-target was replaced by a static or moving 2D target and "region of interest" analysis was applied. These findings support the feasibility of using a remote eye-tracker for the analysis of gaze during treadmill walking in virtual reality environments.

  16. Remote Advanced Payload Test Rig (RAPTR) Portable Payload Test System for the International Space Station

    NASA Technical Reports Server (NTRS)

    De La Cruz, Melinda; Henderson, Steve

    2016-01-01

    The RAPTR was developed to test ISS payloads for NASA. RAPTR is a simulation of the Command and Data Handling (C&DH) interfaces of the ISS (MIL-STD-1553B, Ethernet and TAXI) and is designed for rapid testing and deployment of payload experiments to the ISS. The ISS's goal is to reduce the amount of time it takes for a payload developer to build, test and fly a payload, including payload software. The RAPTR meets this need with its user-oriented, visually rich interface.

  17. Remote sensing image ship target detection method based on visual attention model

    NASA Astrophysics Data System (ADS)

    Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong

    2017-11-01

    Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. In view of this, a method for ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method can reduce the computational complexity while improving the detection accuracy, and improve the detection efficiency of ship targets in remote sensing images.
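
    The bottom-up attention stage summarised above can be illustrated with the spectral residual saliency method, a common frequency-domain saliency model; the abstract does not name the exact model used, so the Python sketch below is only a generic stand-in showing how a saliency map can restrict ship detection to a small fraction of the image.

        import numpy as np
        from scipy.ndimage import uniform_filter, gaussian_filter

        def spectral_residual_saliency(gray):
            """Bottom-up saliency map in the style of the spectral residual method (Hou & Zhang)."""
            f = np.fft.fft2(gray.astype(np.float64))
            log_amp = np.log(np.abs(f) + 1e-12)                    # log amplitude spectrum
            phase = np.angle(f)
            residual = log_amp - uniform_filter(log_amp, size=3)   # spectral residual
            sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
            sal = gaussian_filter(sal, sigma=3)                    # smooth the saliency map
            return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)

        def candidate_mask(gray, thresh=0.5):
            """Keep only the most salient pixels as ship-candidate regions."""
            sal = spectral_residual_saliency(gray)
            return sal > thresh * sal.max()

        if __name__ == "__main__":
            # toy sea scene: flat background with two small bright "ships"
            img = np.full((256, 256), 0.2)
            img[100:104, 50:60] = 1.0
            img[180:183, 200:212] = 0.9
            mask = candidate_mask(img)
            print("fraction of image flagged for detailed analysis:", mask.mean())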

  18. Web-based visualization of very large scientific astronomy imagery

    NASA Astrophysics Data System (ADS)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, as well as public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating-point data at terabyte scales, with the ability to precisely adjust image settings in real time. The proposed clients are light-weight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch and mouse-based devices. We put the system to the test, assess its performance, and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.

  19. REMOTE SENSING, VISUALIZATION AND DECISION SUPPORT FOR WATERSHED MANAGEMENT AND SUSTAINABLE AGRICULTURE

    EPA Science Inventory

    The integration of satellite and airborne remote sensing, scientific visualization and decision support tools is discussed within the context of management techniques for minimizing the non-point source pollution load of inland waterways and the sustainability of food crop produc...

  20. Disaster Emergency Rapid Assessment Based on Remote Sensing and Background Data

    NASA Astrophysics Data System (ADS)

    Han, X.; Wu, J.

    2018-04-01

    The period from the onset of a disaster to the stabilization of conditions is an important stage of disaster development. In addition to collecting and reporting information on the disaster situation, remote sensing images from satellites and drones and monitoring results from the disaster-stricken areas should be obtained. Fusing multi-source background data, such as population, geography and topography, with remote sensing monitoring information in geographic information system analysis makes it possible to assess the disaster quickly and objectively. According to the characteristics of different hazards, models and methods driven by rapid-assessment mission requirements are tested and screened. Based on remote sensing images, the features of exposed assets are used to quickly determine disaster-affected areas and intensity levels, to extract key disaster information on affected hospitals, schools, cultivated land and crops, and to support emergency-response decisions with visual assessment results.

  1. Immersive Molecular Visualization with Omnidirectional Stereoscopic Ray Tracing and Remote Rendering

    PubMed Central

    Stone, John E.; Sherman, William R.; Schulten, Klaus

    2016-01-01

    Immersive molecular visualization provides the viewer with intuitive perception of complex structures and spatial relationships that are of critical interest to structural biologists. The recent availability of commodity head mounted displays (HMDs) provides a compelling opportunity for widespread adoption of immersive visualization by molecular scientists, but HMDs pose additional challenges due to the need for low-latency, high-frame-rate rendering. State-of-the-art molecular dynamics simulations produce terabytes of data that can be impractical to transfer from remote supercomputers, necessitating routine use of remote visualization. Hardware-accelerated video encoding has profoundly increased frame rates and image resolution for remote visualization, however round-trip network latencies would cause simulator sickness when using HMDs. We present a novel two-phase rendering approach that overcomes network latencies with the combination of omnidirectional stereoscopic progressive ray tracing and high performance rasterization, and its implementation within VMD, a widely used molecular visualization and analysis tool. The new rendering approach enables immersive molecular visualization with rendering techniques such as shadows, ambient occlusion lighting, depth-of-field, and high quality transparency, that are particularly helpful for the study of large biomolecular complexes. We describe ray tracing algorithms that are used to optimize interactivity and quality, and we report key performance metrics of the system. The new techniques can also benefit many other application domains. PMID:27747138

  2. Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

    PubMed

    Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M

    2015-01-01

    This paper presents a haptic telepresence system that enables visually impaired users to explore visually rich locations, such as art galleries and museums, by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of these data as a tangible haptic experience has not been sufficiently explored, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience where they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.

  3. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 1 : summary report.

    DOT National Transportation Integrated Search

    2009-12-01

    The Integrated Remote Sensing and Visualization System (IRSV) is being designed to accommodate the needs of today's Bridge Engineers at the state and local level from the following aspects: Better understanding and enforcement of a complex ...

  4. Prototype crawling robotics system for remote visual inspection of high-mast light poles.

    DOT National Transportation Integrated Search

    1997-01-01

    This report presents the results of a project to develop a crawling robotics system for the remote visual inspection of high-mast light poles in Virginia. The first priority of this study was to develop a simple robotics application that would reduce...

  5. Learning Methods of Remote Sensing In the 2013 Curriculum of Secondary School

    NASA Astrophysics Data System (ADS)

    Lili Somantri, Nandi

    2016-11-01

    Remote sensing material was first included in the geography subject in the 1994 curriculum. For geography teachers who graduated in the 1990s or earlier and did not study remote sensing in college, teaching it is a difficult matter. Most teachers present only the theory and do not carry out practical work, citing the lack of computer laboratory facilities and infrastructure. This paper therefore examines methods of teaching remote sensing material in schools. Its purpose is (1) to explain the position of remote sensing material in the study of geography, (2) to analyze the parts of the 2013 geography curriculum related to remote sensing material, and (3) to describe methods of teaching remote sensing material in schools. The method used in this paper is a descriptive analytical study supported by the literature. The paper concludes that, within the study of geography, remote sensing is a method for obtaining spatial data on the earth's surface. In the 2013 curriculum, remote sensing material is applied to the study of land use and transportation. Remote sensing should be taught through practicum, starting from an introduction to the theory of remote sensing, followed by extraction of data from remote sensing imagery to produce maps both visually and digitally, field surveys, accuracy testing of the interpretation, and map improvement.

  6. Subscale Flight Testing for Aircraft Loss of Control: Accomplishments and Future Directions

    NASA Technical Reports Server (NTRS)

    Cox, David E.; Cunningham, Kevin; Jordan, Thomas L.

    2012-01-01

    Subscale flight-testing provides a means to validate both dynamic models and mitigation technologies in the high-risk flight conditions associated with aircraft loss of control. The Airborne Subscale Transport Aircraft Research (AirSTAR) facility was designed to be a flexible and efficient research facility to address this type of flight-testing. Over the last several years (2009-2011) it has been used to perform 58 research flights with an unmanned, remotely-piloted, dynamically-scaled airplane. This paper will present an overview of the facility and its architecture and summarize the experimental data collected. All flights to date have been conducted within visual range of a safety observer. Current plans for the facility include expanding the test volume to altitudes and distances well beyond visual range. The architecture and instrumentation changes associated with this upgrade will also be presented.

  7. Detection and Monitoring of Oil Spills Using Moderate/High-Resolution Remote Sensing Images.

    PubMed

    Li, Ying; Cui, Can; Liu, Zexi; Liu, Bingxin; Xu, Jin; Zhu, Xueyuan; Hou, Yongchao

    2017-07-01

    Current marine oil spill detection and monitoring methods using high-resolution remote sensing imagery are quite limited. This study presents a new bottom-up and top-down visual saliency model. We used Landsat 8, GF-1, MAMS, and HJ-1 oil spill imagery as the dataset. A simplified, graph-based visual saliency model was used to extract bottom-up saliency. It can identify regions containing objects of high visual saliency in the ocean. A spectral similarity match model was used to obtain top-down saliency. It can distinguish oil regions and exclude other salient interference by their spectra. The regions of interest containing oil spills were integrated using these complementary saliency detection steps. Then, a genetic neural network was used to complete the image classification. These steps increased the speed of analysis. For the test dataset, the average running time of the entire process to detect regions of interest was 204.56 s. During image segmentation, the oil spill was extracted using a genetic neural network. The classification results showed that the method had a low false-alarm rate (high accuracy of 91.42%) and was able to increase the speed of the detection process (fast runtime of 19.88 s). The test image dataset was composed of different types of features over large areas under complicated imaging conditions. The proposed model proved to be robust in complex sea conditions.
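
    The top-down stage above relies on a spectral-similarity match to separate oil from other salient objects. The exact similarity model is not given in the abstract; as one plausible stand-in, the sketch below scores each pixel with the spectral angle mapper (SAM) against a reference oil spectrum, with all band values invented for illustration.

        import numpy as np

        def spectral_angle(cube, reference):
            """Spectral angle (radians) between each pixel spectrum and a reference spectrum.

            cube      : array of shape (rows, cols, bands)
            reference : array of shape (bands,) - assumed oil-slick spectrum
            Smaller angles mean spectra more similar to the reference.
            """
            dot = np.tensordot(cube, reference, axes=([2], [0]))
            norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
            cos = np.clip(dot / (norms + 1e-12), -1.0, 1.0)
            return np.arccos(cos)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            cube = rng.uniform(0.1, 0.4, size=(100, 100, 4))      # synthetic 4-band scene
            oil_ref = np.array([0.05, 0.08, 0.12, 0.10])          # invented reference spectrum
            cube[40:60, 40:60] = oil_ref + rng.normal(0, 0.01, (20, 20, 4))  # embed a "slick"
            angles = spectral_angle(cube, oil_ref)
            oil_mask = angles < 0.15                              # threshold in radians
            print("pixels flagged as oil-like:", int(oil_mask.sum()))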

  8. Utility of remotely sensed data for identification of soil conservation practices

    NASA Technical Reports Server (NTRS)

    Pelletier, R. E.; Griffin, R. H.

    1986-01-01

    Discussed are a variety of remotely sensed data sources that may have utility in the identification of conservation practices and related linear features. Test sites were evaluated in Alabama, Kansas, Mississippi, and Oklahoma using one or more of a variety of remotely sensed data sources, including color infrared photography (CIR), LANDSAT Thematic Mapper (TM) data, and aircraft-acquired Thermal Infrared Multispectral Scanner (TIMS) data. Both visual examination and computer-implemented enhancement procedures were used to identify conservation practices and other linear features. For the Kansas, Mississippi, and Oklahoma test sites, photo interpretations of CIR identified up to 24 of the 109 conservation practices from a matrix derived from the SCS National Handbook of Conservation Practices. The conservation practice matrix was modified to predict the possibility of identifying the 109 practices at various photographic scales based on the observed results as well as photo interpreter experience. Some practices were successfully identified in TM data through visual identification, but a number of existing practices were of such size and shape that the resolution of the TM could not detect them accurately. A series of computer-automated decorrelation and filtering procedures served to enhance the conservation practices in TM data with only fair success. However, features such as field boundaries, roads, water bodies, and the Urban/Ag interface were easily differentiated. Similar enhancement techniques applied to 5 and 10 meter TIMS data proved much more useful in delineating terraces, grass waterways, and drainage ditches as well as the features mentioned above, due partly to improved resolution and partly to thermally influenced moisture conditions. Spatially oriented data such as those derived from remotely sensed data offer some promise in the inventory and monitoring of conservation practices as well as in supplying parameter data for a variety of computer-implemented agricultural models.

  9. Hybrid region merging method for segmentation of high-resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Xueliang; Xiao, Pengfeng; Feng, Xuezhi; Wang, Jiangeng; Wang, Zuo

    2014-12-01

    Image segmentation remains a challenging problem for object-based image analysis. In this paper, a hybrid region merging (HRM) method is proposed to segment high-resolution remote sensing images. HRM integrates the advantages of global-oriented and local-oriented region merging strategies into a unified framework. The globally most-similar pair of regions is used to determine the starting point of a growing region, which provides an elegant way to avoid the problem of starting point assignment and to enhance the optimization ability for local-oriented region merging. During the region growing procedure, the merging iterations are constrained within the local vicinity, so that the segmentation is accelerated and can reflect the local context, as compared with the global-oriented method. A set of high-resolution remote sensing images is used to test the effectiveness of the HRM method, and three region-based remote sensing image segmentation methods are adopted for comparison, including the hierarchical stepwise optimization (HSWO) method, the local-mutual best region merging (LMM) method, and the multiresolution segmentation (MRS) method embedded in eCognition Developer software. Both the supervised evaluation and visual assessment show that HRM performs better than HSWO and LMM by combining both their advantages. The segmentation results of HRM and MRS are visually comparable, but HRM can describe objects as single regions better than MRS, and the supervised and unsupervised evaluation results further prove the superiority of HRM.
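
    To make the merging strategy concrete, the sketch below implements a toy version of the idea: the globally most-similar adjacent pair of regions seeds a growing region, which then keeps absorbing its most similar neighbour while the local merge cost stays below a threshold. The cost function (a size-weighted squared difference of region means) and the threshold are illustrative assumptions, not the criteria used by HRM.

        import numpy as np
        from itertools import product

        def build_regions(labels, image):
            """Collect per-region statistics and the region adjacency graph."""
            stats = {}   # label -> [intensity sum, pixel count]
            adj = {}     # label -> set of neighbouring labels
            rows, cols = labels.shape
            for r, c in product(range(rows), range(cols)):
                lab = labels[r, c]
                s = stats.setdefault(lab, [0.0, 0])
                s[0] += image[r, c]
                s[1] += 1
                adj.setdefault(lab, set())
                for dr, dc in ((1, 0), (0, 1)):          # 4-neighbourhood (down, right)
                    rr, cc = r + dr, c + dc
                    if rr < rows and cc < cols and labels[rr, cc] != lab:
                        adj.setdefault(labels[rr, cc], set()).add(lab)
                        adj[lab].add(labels[rr, cc])
            return stats, adj

        def merge_cost(stats, a, b):
            """Size-weighted squared difference of region means (illustrative criterion)."""
            sa, na = stats[a]
            sb, nb = stats[b]
            return (na * nb / (na + nb)) * (sa / na - sb / nb) ** 2

        def merge(stats, adj, a, b):
            """Absorb region b into region a, updating statistics and adjacency."""
            stats[a][0] += stats[b][0]
            stats[a][1] += stats[b][1]
            for n in adj.pop(b):
                adj[n].discard(b)
                if n != a:
                    adj[n].add(a)
                    adj[a].add(n)
            adj[a].discard(b)
            del stats[b]

        def hybrid_region_merging(labels, image, threshold):
            """Toy hybrid merging: global best pair seeds a region, then local growth."""
            stats, adj = build_regions(labels, image)
            while True:
                pairs = [(merge_cost(stats, a, b), a, b)
                         for a in adj for b in adj[a] if a < b]
                if not pairs:
                    break
                cost, a, b = min(pairs)                  # globally most-similar pair
                if cost > threshold:
                    break
                merge(stats, adj, a, b)                  # seed the growing region
                while adj[a]:                            # grow locally around region a
                    local_cost, n = min((merge_cost(stats, a, n), n) for n in adj[a])
                    if local_cost > threshold:
                        break
                    merge(stats, adj, a, n)
            return stats

        if __name__ == "__main__":
            image = np.zeros((8, 8))
            image[:, 4:] = 10.0                          # two flat halves of different intensity
            labels = np.arange(64).reshape(8, 8)         # every pixel starts as its own region
            regions = hybrid_region_merging(labels, image, threshold=0.5)
            print("regions remaining:", len(regions))    # expect 2

    In HRM proper, the merging criterion and stopping rules are defined over richer spectral and shape features; the toy criterion above is only meant to show how global seeding and local growth interleave.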

  10. View compensated compression of volume rendered images for remote visualization.

    PubMed

    Lalgudi, Hariharan G; Marcellin, Michael W; Bilgin, Ali; Oh, Han; Nadar, Mariappan S

    2009-07-01

    Remote visualization of volumetric images has gained importance over the past few years in medical and industrial applications. Volume visualization is a computationally intensive process, often requiring hardware acceleration to achieve a real time viewing experience. One remote visualization model that can accomplish this would transmit rendered images from a server, based on viewpoint requests from a client. For constrained server-client bandwidth, an efficient compression scheme is vital for transmitting high quality rendered images. In this paper, we present a new view compensation scheme that utilizes the geometric relationship between viewpoints to exploit the correlation between successive rendered images. The proposed method obviates motion estimation between rendered images, enabling significant reduction to the complexity of a compressor. Additionally, the view compensation scheme, in conjunction with JPEG2000 performs better than AVC, the state of the art video compression standard.

  11. Final report: Prototyping a combustion corridor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.; Leach, Joshua

    2001-12-15

    The Combustion Corridor is a concept in which researchers in combustion and thermal sciences have unimpeded access to large volumes of remote computational results. This will enable remote, collaborative analysis and visualization of state-of-the-art combustion science results. The Engine Research Center (ERC) at the University of Wisconsin - Madison partnered with Lawrence Berkeley National Laboratory, Argonne National Laboratory, Sandia National Laboratory, and several other universities to build and test the first stages of a combustion corridor. The ERC served two important functions in this partnership. First, we work extensively with combustion simulations so we were able to provide real world research data sets for testing the Corridor concepts. Second, the ERC was part of an extension of the high bandwidth based DOE National Laboratory connections to universities.

  12. Static and Motion-Based Visual Features Used by Airport Tower Controllers: Some Implications for the Design of Remote or Virtual Towers

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Liston, Dorion B.

    2011-01-01

    Visual motion and other visual cues are used by tower controllers to provide important support for their control tasks at and near airports. These cues are particularly important for anticipated separation. Some of them, which we call visual features, have been identified from structured interviews and discussions with 24 active air traffic controllers or supervisors. The visual information that these features provide has been analyzed with respect to possible ways it could be presented at a remote tower that does not allow a direct view of the airport. Two types of remote towers are possible. One could be based on a plan-view, map-like computer-generated display of the airport and its immediate surroundings. An alternative would present a composite perspective view of the airport and its surroundings, possibly provided by an array of radially mounted cameras positioned at the airport in lieu of a tower. An initial, more detailed analysis of one of the specific landing cues identified by the controllers, landing deceleration, is provided as a basis for evaluating how controllers might detect and use it. Understanding other such cues will help identify the information that may be degraded or lost in a remote or virtual tower not located at the airport. Some initial suggestions as to how some of the lost visual information might be presented in displays are mentioned. Many of the cues considered involve visual motion, though some important static cues are also discussed.

  13. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 1 : outreach and commercialization of IRSV prototype.

    DOT National Transportation Integrated Search

    2012-03-01

    The Integrated Remote Sensing and Visualization System (IRSV) was developed in Phase One of this project in order to accommodate the needs of today's Bridge Engineers at the state and local level. Overall goals of this project are: Better u...

  14. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 2 : knowledge modeling and database development.

    DOT National Transportation Integrated Search

    2009-12-01

    The Integrated Remote Sensing and Visualization System (IRSV) is being designed to accommodate the needs of today's Bridge Engineers at the state and local level from several aspects that were documented in Volume One, Summary Report. The followi...

  15. Live Aircraft Encounter Visualization at FutureFlight Central

    NASA Technical Reports Server (NTRS)

    Murphy, James R.; Chinn, Fay; Monheim, Spencer; Otto, Neil; Kato, Kenji; Archdeacon, John

    2018-01-01

    Researchers at the National Aeronautics and Space Administration (NASA) have developed an aircraft data streaming capability that can be used to visualize live aircraft in near real-time. During a joint Federal Aviation Administration (FAA)/NASA Airborne Collision Avoidance System flight series, test sorties between unmanned aircraft and manned intruder aircraft were shown in real-time at NASA Ames' FutureFlight Central tower facility as a virtual representation of the encounter. This capability leveraged existing live surveillance, video, and audio data streams distributed through a Live, Virtual, Constructive test environment, then depicted the encounter from the point of view of any aircraft in the system, showing the proximity of the other aircraft. For the demonstration, position report data were sent to the ground from on-board sensors on the unmanned aircraft. The point of view can be changed dynamically, allowing encounters from all angles to be observed. Visualizing the encounters in real-time provides a safe and effective method for observation of live flight testing and a strong alternative to travel to the remote test range.

  16. Virtual reality at work

    NASA Technical Reports Server (NTRS)

    Brooks, Frederick P., Jr.

    1991-01-01

    The utility of virtual reality computer graphics in telepresence applications is not hard to grasp and promises to be great. When the virtual world is entirely synthetic, as opposed to real but remote, the utility is harder to establish. Vehicle simulators for aircraft, vessels, and motor vehicles are proving their worth every day. Entertainment applications such as Disney World's StarTours are technologically elegant, good fun, and economically viable. Nevertheless, some of us have no real desire to spend our lifework serving the entertainment craze of our sick culture; we want to see this exciting technology put to work in medicine and science. The topics covered include the following: testing a force display for scientific visualization -- molecular docking; and testing a head-mounted display for scientific and medical visualization.

  17. Development of sea ice monitoring with aerial remote sensing technology

    NASA Astrophysics Data System (ADS)

    Jiang, Xuhui; Han, Lei; Dong, Liang; Cui, Lulu; Bie, Jun; Fan, Xuewei

    2014-11-01

    In the north China Sea district, sea ice disasters are severe every winter, with many adverse effects on shipping, offshore oil exploitation, and coastal engineering. In recent years, along with global climate change, the sea ice situation has become critical. Sea ice monitoring plays a very important role in keeping human life and property safe and in supporting marine scientific research. The main methods of monitoring sea ice are: first, shore observation; second, icebreaker monitoring; third, satellite remote sensing; and finally, aerial remote sensing. In shore observation, marine station staff use dedicated equipment to monitor the sea ice. In icebreaker monitoring, workers test the properties of sea ice, such as density, salinity and mechanical properties. In satellite remote sensing, MODIS and NOAA data are processed to produce sea ice charts. In aerial remote sensing, artificial visual monitoring and airborne remote sensors are used to monitor sea ice. Aerial remote sensing is an important means of sea ice monitoring because of its strong maneuverability, wide viewing scale, and high resolution. In this paper, several methods of sea ice monitoring using aerial remote sensing technology are discussed.

  18. Legislated emergency locating transmitters and emergency position indicating radio beacons

    NASA Technical Reports Server (NTRS)

    Wade, William R. (Inventor)

    1988-01-01

    An emergency locating transmitter (ELT) system is disclosed which comprises a legislated ELT modified with an interface unit and connected by a multiwire cable to a remote control monitor (RCM), typically located at the pilot position. The RCM can remotely test the ELT by disabling the legislated swept tone and allowing transmission of a single tone, turn the ELT on for legislated ELT transmission, and reset the ELT to an armed condition. The RCM also provides visual and audio indications of transmitter operating condition as well as ELT battery condition. Removing the RCM or shorting or opening the interface input connections will not affect traditional ELT operation.

  19. Web-Altairis: An Internet-Enabled Ground System

    NASA Technical Reports Server (NTRS)

    Miller, Phil; Coleman, Jason; Gemoets, Darren; Hughes, Kevin

    2000-01-01

    This paper describes Web-Altairis, an Internet-enabled ground system software package funded by the Advanced Automation and Architectures Branch (Code 588) of NASA's Goddard Space Flight Center. Web-Altairis supports the trend towards "lights out" ground systems, where the control center is unattended and problems are resolved by remote operators. This client/server software runs on most popular platforms and provides for remote data visualization using the rich functionality of the VisAGE toolkit. Web-Altairis also supports satellite commanding over the Internet. This paper describes the structure of Web-Altairis and VisAGE, the underlying technologies, the provisions for security, and our experiences in developing and testing the software.

  20. 2011 NASA Lunabotics Mining Competition for Universities: Results and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Mueller, Robert P.; Murphy, Gloria A.

    2011-01-01

    Overview: Design, build and compete a remote-controlled robot (Lunabot). Excavate Black Point 1 (BP-1) lunar simulant. Deposit a minimum of 10 kg of BP-1 within 15 minutes. $5000, $2500, and $1000 scholarships for the most BP-1 excavated. May 23-28, 2011, Kennedy Space Center, FL. International teams allowed for the first time. What is a Lunabot? a) Robot controlled remotely or autonomously. b) Visual and auditory isolation from the operator. c) Excavates Black Point 1 (BP-1) simulant. d) Weight limit - 80 kg. e) Dimension limits - 1.5 m width x 0.75 m length x 2 m height. f) Designed, built and tested by university student teams.

  1. Learning by E-Learning for Visually Impaired Students: Opportunities or Again Marginalisation?

    ERIC Educational Resources Information Center

    Kharade, Kalpana; Peese, Hema

    2012-01-01

    In recent years, e-learning has become a valuable tool for an increasing number of visually impaired (VI) learners. The benefits of this technology include: (1) remote learning for VI students; (2) the possibility for teachers living far from schools or universities to provide remote instructional assistance to VI students; and (3) continuing…

  2. Techniques of remote sensing and GIS as tools for visualizing impact of climate change-induced flood in the southern African region.

    USDA-ARS?s Scientific Manuscript database

    This study employs remote sensing and Geographical Information Systems (GIS) data to visualize the impact of climate change caused by flooding in the Southern African region in order to assist decision makers’ plans for future occurrences. In pursuit of this objective, this study uses Digital Elevat...

  3. Remote infrared audible signage (RIAS) pilot program report.

    DOT National Transportation Integrated Search

    2011-07-01

    The Remote Infrared Audible Sign Model Accessibility Program (RIAS MAP) is a program funded by the Federal Transit Administration (FTA) to evaluate the effectiveness of remote infrared audible sign systems in enabling persons with visual and cognitiv...

  4. A Miniature Fiber-Optic Sensor for High-Resolution and High-Speed Temperature Sensing in Ocean Environment

    DTIC Science & Technology

    2015-11-05

    the SMF is superior when it comes to remote sensing in the far and deep ocean. As an initial test, the real-time temperature structure within the water ... 4 ℃. The high resolution guarantees the visualization of subtle variation in the local water. To test the response time of the proposed sensor, the ...

  5. Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web

    NASA Technical Reports Server (NTRS)

    Watson, Val; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG (Motion Picture Expert Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewers local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. (5) The scenes can be viewed in 3D using stereo vision. (6) The network bandwidth for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.) This talk will illustrate the use of these new technologies and present a proposal for using these technologies to improve science education.

  6. Remote Sensing of Martian Terrain Hazards via Visually Salient Feature Detection

    NASA Astrophysics Data System (ADS)

    Al-Milli, S.; Shaukat, A.; Spiteri, C.; Gao, Y.

    2014-04-01

    The main objective of the FASTER remote sensing system is the detection of rocks on planetary surfaces by employing models that can efficiently characterise rocks in terms of semantic descriptions. The proposed technique abates some of the algorithmic limitations of existing methods, with no training requirements, lower computational complexity and greater robustness towards visual tracking applications over long-distance planetary terrains. Visual saliency models inspired by biological systems help to identify important regions (such as rocks) in the visual scene. Surface rocks are therefore completely described in terms of their local or global conspicuity (pop-out) characteristics. These local and global pop-out cues include (but are not limited to) colour, depth, orientation, curvature, size, luminance intensity, shape, topology, etc. The currently applied methods follow a purely bottom-up strategy of visual attention for selection of conspicuous regions in the visual scene, without any top-down control. Furthermore, the models chosen (tested and evaluated) are relatively fast among the state-of-the-art and have very low computational load. Quantitative evaluation of these state-of-the-art models was carried out using benchmark datasets including the Surrey Space Centre Lab Testbed, Pangu-generated images, RAL Space SEEKER and CNES Mars Yard datasets. The analysis indicates that models based on visually salient information in the frequency domain (SRA, SDSR, PQFT) are the best-performing ones for detecting rocks in an extra-terrestrial setting. In particular, the SRA model appears to be the best of the lot, especially since it requires the least computational time while keeping errors competitively low. The salient objects extracted using these models can then be merged with the Digital Elevation Models (DEMs) generated from the same navigation cameras and fused into the navigation map, thus giving a clear indication of the rock locations.

  7. Using Avizo Software on the Peregrine System | High-Performance Computing |

    Science.gov Websites

    Avizo can be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then, from a terminal in that remote desktop, run % module load avizo followed by % vglrun avizo. Running locally: Avizo can ...

  8. Visual display angles of conventional and a remotely piloted aircraft.

    PubMed

    Kamine, Tovy Haber; Bendrick, Gregg A

    2009-04-01

    Instrument display separation and proximity are important human factor elements used in the design and grouping of aircraft instrument displays. To assess display proximity in practical operations, the viewing visual angles of various displays were examined in several conventional aircraft and in a remotely piloted vehicle. The horizontal and vertical instrument display visual angles from the pilot's eye position were measured in 12 different types of conventional aircraft, and in the ground control station (GCS) of a remotely piloted aircraft (RPA). A total of 18 categories of instrument display were measured and compared. In conventional aircraft almost all of the vertical and horizontal visual display angles lay within a "cone of easy eye movement" (CEEM). Mission-critical instruments particular to specific aircraft types sometimes displaced less important instruments outside the CEEM. For the RPA, all horizontal visual angles lay within the CEEM, but most vertical visual angles lay outside this cone. Most instrument displays in conventional aircraft were consistent with display proximity principles, but several RPA displays lay outside the CEEM in the vertical plane. Awareness of this fact by RPA operators may be helpful in minimizing information access cost, and in optimizing RPA operations.
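
    The visual angle subtended by an instrument display follows directly from its size and its distance from the pilot's eye point. The sketch below applies the standard formula to a hypothetical display; the display dimensions, viewing distance, and the 15-degree half-angle assumed for the cone of easy eye movement are illustrative values, not figures from the study.

        import math

        def visual_angle_deg(extent_m, distance_m):
            """Visual angle (degrees) subtended by a display of given extent at a given distance."""
            return math.degrees(2.0 * math.atan(extent_m / (2.0 * distance_m)))

        def offset_angle_deg(offset_m, distance_m):
            """Angular offset (degrees) of a display centre from the straight-ahead line of sight."""
            return math.degrees(math.atan(offset_m / distance_m))

        if __name__ == "__main__":
            # hypothetical values: 15 cm wide display, centred 20 cm to the right, 75 cm away
            width = visual_angle_deg(0.15, 0.75)
            offset = offset_angle_deg(0.20, 0.75)
            cone_half_angle = 15.0   # assumed half-angle of a "cone of easy eye movement"
            print(f"display subtends {width:.1f} deg, centre offset {offset:.1f} deg")
            print("inside assumed cone:", offset + width / 2 <= cone_half_angle)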

  9. Design Considerations for Remotely Operated Welding in Space: Task Definition and Visual Weld Monitoring Experiment

    DTIC Science & Technology

    1993-05-01

    [Extract from the report's list of figures and acronyms: Figure 4.14, Energy effectiveness comparison between EBW, GMAW, and PAW; Figure 5.2, The spectrum of control modes; Figure 5.3, Levels of control for GMAW. Acronyms: ... extravehicular activity; FTS, Flight Telerobotic Servicer; GMAW, gas metal arc welding; GTAW, gas tungsten arc welding; LEO, low-earth orbit; NDT, non-destructive test.]

  10. Fast Spectrometer Construction and Testing

    NASA Astrophysics Data System (ADS)

    Menke, John

    2012-05-01

    This paper describes the construction and operation of a medium resolution spectrometer used in the visual wavelength range. It is homebuilt, but has built-in guiding and calibration, is fully remote operable, and operates at a resolution R = 3000. It features a fast f/3.5 system, which allows it to be used with a fast telescope (18-inch f/3.5) with no Barlow or other optical matching devices.
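
    A resolving power of R = 3000 fixes the smallest resolvable wavelength interval as delta-lambda = lambda/R; the short sketch below evaluates this at a few visual-band wavelengths purely as a worked example (the wavelengths chosen are illustrative).

        def resolvable_interval_nm(wavelength_nm, resolving_power=3000):
            """Smallest resolvable wavelength interval for a spectrometer of given resolving power."""
            return wavelength_nm / resolving_power

        if __name__ == "__main__":
            for wl in (450.0, 550.0, 656.3):   # blue, green, H-alpha (illustrative wavelengths)
                print(f"{wl:.1f} nm -> delta-lambda ~ {resolvable_interval_nm(wl):.2f} nm")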

  11. Exploring NASA and ESA Atmospheric Data Using GIOVANNI, the Online Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory

    2007-01-01

    Giovanni, the NASA Goddard online visualization and analysis tool (http://giovanni.gsfc.nasa.gov), allows users to explore various atmospheric phenomena without learning remote sensing data formats and downloading voluminous data. Using NASA MODIS (Terra and Aqua) and ESA MERIS (ENVISAT) aerosol data as an example, we demonstrate Giovanni usage for online multi-sensor remote sensing data comparison and analysis.

  12. Modeling of Explorative Procedures for Remote Object Identification

    DTIC Science & Technology

    1991-09-01

    haptic sensory system and the simulated foveal component of the visual system. Eventually it will allow multiple applications in remote sensing and ... superposition of sensory channels. The use of a force-reflecting telemanipulator and a computer-simulated visual foveal component are the tools which ... representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and the simulated foveal component of the

  13. Visually Lossless JPEG 2000 for Remote Image Browsing

    PubMed Central

    Oh, Han; Bilgin, Ali; Marcellin, Michael

    2017-01-01

    Image sizes have increased exponentially in recent years. The resulting high-resolution images are often viewed via remote image browsing. Zooming and panning are desirable features in this context, which result in disparate spatial regions of an image being displayed at a variety of (spatial) resolutions. When an image is displayed at a reduced resolution, the quantization step sizes needed for visually lossless quality generally increase. This paper investigates the quantization step sizes needed for visually lossless display as a function of resolution, and proposes a method that effectively incorporates the resulting (multiple) quantization step sizes into a single JPEG2000 codestream. This codestream is JPEG2000 Part 1 compliant and allows for visually lossless decoding at all resolutions natively supported by the wavelet transform as well as arbitrary intermediate resolutions, using only a fraction of the full-resolution codestream. When images are browsed remotely using the JPEG2000 Interactive Protocol (JPIP), the required bandwidth is significantly reduced, as demonstrated by extensive experimental results. PMID:28748112

  14. Plugin free remote visualization in the browser

    NASA Astrophysics Data System (ADS)

    Tamm, Georg; Slusallek, Philipp

    2015-01-01

    Today, users access information and rich media from anywhere using the web browser on their desktop computers, tablets or smartphones. But the web evolves beyond media delivery. Interactive graphics applications like visualization or gaming become feasible as browsers advance in the functionality they provide. However, to deliver large-scale visualization to thin clients like mobile devices, a dedicated server component is necessary. Ideally, the client runs directly within the browser the user is accustomed to, requiring no installation of a plugin or native application. In this paper, we present the state-of-the-art of technologies which enable plugin free remote rendering in the browser. Further, we describe a remote visualization system unifying these technologies. The system transfers rendering results to the client as images or as a video stream. We utilize the upcoming World Wide Web Consortium (W3C) conform Web Real-Time Communication (WebRTC) standard, and the Native Client (NaCl) technology built into Chrome, to deliver video with low latency.

  15. Remote visual analysis of large turbulence databases at multiple scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.

  16. Remote visual analysis of large turbulence databases at multiple scales

    DOE PAGES

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...

    2018-06-15

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
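
    The framework described above is built around wavelet-based compression and multi-resolution analysis. As a minimal illustration of the underlying operation, the sketch below uses PyWavelets to decompose a 2D slice of a synthetic field into resolution levels and retain only the largest coefficients; the wavelet, decomposition depth, and retention fraction are illustrative choices, not those of the database toolset.

        import numpy as np
        import pywt   # PyWavelets

        def compress_slice(field, wavelet="db4", levels=3, keep=0.05):
            """Wavelet-compress a 2D field by keeping only the largest-magnitude coefficients."""
            coeffs = pywt.wavedec2(field, wavelet, level=levels)        # multi-resolution decomposition
            arr, slices = pywt.coeffs_to_array(coeffs)                  # flatten the coefficients
            threshold = np.quantile(np.abs(arr), 1.0 - keep)            # keep the top `keep` fraction
            arr_thresh = np.where(np.abs(arr) >= threshold, arr, 0.0)
            coeffs_thresh = pywt.array_to_coeffs(arr_thresh, slices, output_format="wavedec2")
            return pywt.waverec2(coeffs_thresh, wavelet)                # reconstruct from kept coefficients

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            x = np.linspace(0, 2 * np.pi, 256)
            field = np.sin(4 * x)[:, None] * np.cos(3 * x)[None, :] + 0.1 * rng.standard_normal((256, 256))
            recon = compress_slice(field)
            err = np.linalg.norm(recon - field) / np.linalg.norm(field)
            print(f"relative reconstruction error with 5% of coefficients: {err:.3f}")

    Keeping only the largest-magnitude coefficients is what makes a wavelet representation compressible: for fields dominated by smooth structure plus localized fine-scale detail, most of the energy concentrates in a small fraction of coefficients, so the rest can be zeroed (and entropy-coded in a real system) with little loss.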

  17. Remote network control plasma diagnostic system for Tokamak T-10

    NASA Astrophysics Data System (ADS)

    Troynov, V. I.; Zimin, A. M.; Krupin, V. A.; Notkin, G. E.; Nurgaliev, M. R.

    2016-09-01

    The parameters of molecular plasma in a closed magnetic trap are studied in this paper. Using the system of molecular diagnostics designed by the authors for the «Tokamak T-10» facility, the radiation of hydrogen isotopes at the plasma edge is investigated. The scheme for registering optical radiation within the visible spectrum is described. For visualization, identification and processing of the registered molecular spectra, new software was developed in the MatLab environment. The software also includes an electronic atlas of electronic-vibrational-rotational transitions for protium and deuterium molecules. To register radiation from the limiter cross-section, a network control system was designed using Internet/Intranet technologies. The remote control system diagram and methods are given. Examples of web interfaces for composing equipment control scenarios and viewing results are provided. After test runs on the Intranet, the remote diagnostic system will be accessible through the Internet.

  18. A comparative interregional analysis of selected data from LANDSAT-1 and EREP for the inventory and monitoring of natural ecosystems

    NASA Technical Reports Server (NTRS)

    Poulton, C. E.

    1975-01-01

    Comparative statistics were presented on the capability of LANDSAT-1 and three of the Skylab remote sensing systems (S-190A, S-190B, S-192) for the recognition and inventory of analogous natural vegetations and landscape features important in resource allocation and management. Two analogous regions presenting vegetational zonation from salt desert to alpine conditions above the timberline were observed, emphasizing the visual interpretation mode in the investigation. An hierarchical legend system was used as the basic classification of all land surface features. Comparative tests were run on image identifiability with the different sensor systems, and mapping and interpretation tests were made both in monocular and stereo interpretation with all systems except the S-192. Significant advantage was found in the use of stereo from space when image analysis is by visual or visual-machine-aided interactive systems. Some cost factors in mapping from space are identified. The various image types are compared and an operational system is postulated.

  19. The effects of spatially displaced visual feedback on remote manipulator performance

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Stuart, Mark A.

    1993-01-01

    The results of this evaluation have important implications for the arrangement of remote manipulation worksites and the design of workstations for telerobot operations. This study clearly illustrates the deleterious effects that can accompany the performance of remote manipulator tasks when viewing conditions are less than optimal. Future evaluations should emphasize telerobot camera locations and the use of image/graphical enhancement techniques in an attempt to lessen the adverse effects of displaced visual feedback. An important finding in this evaluation is the extent to which results from previously performed direct manipulation studies can be generalized to remote manipulation studies. Even though the results obtained were very similar to those of the direct manipulation evaluations, there were differences as well. This evaluation has demonstrated that generalizations to remote manipulation applications based upon the results of direct manipulation studies are quite useful, but they should be made cautiously.

  20. JackIn Head: Immersive Visual Telepresence System with Omnidirectional Wearable Camera.

    PubMed

    Kasahara, Shunichi; Nagai, Shohei; Rekimoto, Jun

    2017-03-01

    Sharing one's own immersive experience over the Internet is one of the ultimate goals of telepresence technology. In this paper, we present JackIn Head, a visual telepresence system featuring an omnidirectional wearable camera with image motion stabilization. Spherical omnidirectional video footage taken around the head of a local user is stabilized and then broadcast to others, allowing remote users to explore the immersive visual environment independently of the local user's head direction. We describe the system design of JackIn Head and report the evaluation results of real-time image stabilization and alleviation of cybersickness. Then, through an exploratory observation study, we investigate how individuals can remotely interact, communicate with, and assist each other with our system. We report our observation and analysis of inter-personal communication, demonstrating the effectiveness of our system in augmenting remote collaboration.
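
    A toy example of one ingredient of such stabilization, under a strong simplifying assumption: a pure yaw rotation of an omnidirectional camera corresponds to a horizontal, wrap-around shift of the equirectangular frame. The JackIn Head system also compensates pitch and roll and operates on video; the sketch below handles a single frame and yaw only.

        # Toy yaw-only stabilization of a single equirectangular frame: a pure yaw
        # rotation of the camera maps to a horizontal, wrap-around shift of the image.
        # (The real system also compensates pitch and roll and works on video.)
        import numpy as np

        def stabilize_yaw(equirect, yaw_deg):
            """Shift an equirectangular image (H x W x C) to cancel a yaw rotation."""
            h, w = equirect.shape[:2]
            shift = int(round(-yaw_deg / 360.0 * w))   # cancel the measured yaw
            return np.roll(equirect, shift, axis=1)    # wrap around the 360-degree seam

        frame = np.zeros((512, 1024, 3), dtype=np.uint8)
        frame[:, 500:524] = 255                         # a vertical stripe as a marker
        stabilized = stabilize_yaw(frame, yaw_deg=90.0)
        print(stabilized[:, 244:268].max())             # stripe now appears 256 px to the left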

  1. Fast Spectrometer Construction and Testing (Abstract)

    NASA Astrophysics Data System (ADS)

    Menke, J.

    2012-12-01

    This paper describes the construction and operation of a medium resolution spectrometer used in the visual wavelength range. It is homebuilt, but has built in guiding and calibration, is fully remote-operable, and operates at a resolution R = 3000. It features a fast f/3.5 system, which allows it to be used with a fast telescope (18-inch f/3.5) with no Barlow or other optical matching devices.

  2. Integration of centrifuge testing in undergraduate geotechnical engineering education at remote campuses

    NASA Astrophysics Data System (ADS)

    El Shamy, Usama; Abdoun, Tarek; McMartin, Flora; Pando, Miguel A.

    2013-06-01

    We report the results of a pilot study aimed at developing, implementing, and assessing an educational module that integrates remote major research instrumentation into undergraduate classes. Specifically, this study employs Internet Web-based technologies to allow for real-time video monitoring and execution of cutting-edge experiments. The students' activities within the module are centred on building a model of a shallow foundation on a sand deposit utilising a centrifuge facility and using this model for: (1) visual observation of the response of soil-foundation systems, (2) learning the use of instrumentation, (3) interpretation of acquired data, and (4) comparing experimental results to theoretical predictions. Testing a soil-foundation system helped the students identify the lab experiments needed to analyse and design the system. A survey was used to gauge students' perceptions of learning as a result of introducing the module, which were found to be positive.

  3. Understanding heterogeneity in metropolitan India: The added value of remote sensing data for analyzing sub-standard residential areas

    NASA Astrophysics Data System (ADS)

    Baud, Isa; Kuffer, Monika; Pfeffer, Karin; Sliuzas, Richard; Karuppannan, Sadasivam

    2010-10-01

    Analyzing the heterogeneity in metropolitan areas of India utilizing remote sensing data can help to identify more precise patterns of sub-standard residential areas. Earlier work analyzing inequalities in Indian cities employed a constructed index of multiple deprivations (IMD) utilizing data from the Census of India 2001 (http://censusindia.gov.in). While that index, described in an earlier paper, provided a first approach to identifying heterogeneity at the citywide scale, it provided information neither on spatial variations within the geographical boundaries of the Census database nor on physical characteristics, such as green spaces and the variation in housing density and quality. In this article, we analyze whether different types of sub-standard residential areas can be identified through remote sensing data, combined, where relevant, with ground-truthing and local knowledge. The specific questions address: (1) the extent to which types of sub-standard residential areas can be drawn from remote sensing data, based on patterns of green space, structure of layout, density of built-up areas, size of buildings and other site characteristics; (2) the spatial diversity of these residential types for selected electoral wards; and (3) the correlation between different types of sub-standard residential areas and the results of the index of multiple deprivations utilized at electoral ward level found previously. The results for a limited number of test wards in Delhi showed that it was possible to extract different residential types matching existing settlement categories using the physical indicators of layout structure, built-up density, building size and other site characteristics. However, the indicator 'amount of green space' was not useful for identifying informal areas. The analysis of heterogeneity showed that wards with higher IMD scores displayed more or less the full range of residential types, implying that visual image interpretation is able to zoom in on clusters of deprivation of varying size. Finally, the visual interpretation of the diversity of residential types matched the results of the IMD analysis quite well, although the limited number of test wards would need to be expanded to strengthen this statement. Visual image analysis strengthens the robustness of the IMD and, in addition, gives a better idea of the degree of heterogeneity in deprivations within a ward.

  4. The primary visual cortex in the neural circuit for visual orienting

    NASA Astrophysics Data System (ADS)

    Zhaoping, Li

    The primary visual cortex (V1) is traditionally viewed as remote from influencing the brain's motor outputs. However, V1 provides the most abundant cortical inputs directly to the sensory layers of the superior colliculus (SC), a midbrain structure that commands visual orienting such as shifting gaze and turning the head. I will show physiological, anatomical, and behavioral data suggesting that V1 transforms visual input into a saliency map to guide a class of visual orienting that is reflexive or involuntary. In particular, V1 receives a retinotopic map of visual features, such as the orientation, color, and motion direction of local visual inputs; local interactions between V1 neurons perform a local-to-global computation to arrive at a saliency map that highlights conspicuous visual locations through higher V1 responses. The conspicuous locations are usually, but not always, where the visual input statistics change. The population of V1 outputs to SC, which is also retinotopic, enables SC to locate, by lateral inhibition between SC neurons, the most salient location as the saccadic target. Experimental tests of this hypothesis will be shown. Variations of the neural circuit for visual orienting across animal species, with more or less V1 involvement, will be discussed. Supported by the Gatsby Charitable Foundation.
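
    A toy numerical sketch (not from the talk) of the computation being described: feature maps are combined by a maximum operation into a saliency map, and a winner-take-all step, standing in for lateral inhibition in SC, selects the most salient location as the orienting target.

        # Toy illustration: feature maps combined by a maximum operation into a
        # saliency map, then a winner-take-all step (standing in for SC lateral
        # inhibition) selects the most salient location as the target.
        import numpy as np

        rng = np.random.default_rng(1)
        orientation_map = 0.2 * rng.random((32, 32))
        color_map = 0.2 * rng.random((32, 32))
        motion_map = 0.2 * rng.random((32, 32))
        orientation_map[12, 20] = 1.0        # one conspicuous odd-one-out location

        saliency = np.max(np.stack([orientation_map, color_map, motion_map]), axis=0)
        target = np.unravel_index(np.argmax(saliency), saliency.shape)
        print("saccade target (row, col):", target)      # -> (12, 20)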

  5. Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground

    NASA Astrophysics Data System (ADS)

    Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.

    2011-11-01

    U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.

  6. Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground

    NASA Astrophysics Data System (ADS)

    Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.

    2012-05-01

    U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.

  7. A Hybrid Synthetic Vision System for the Tele-operation of Unmanned Vehicles

    NASA Technical Reports Server (NTRS)

    Delgado, Frank; Abernathy, Mike

    2004-01-01

    A system called SmartCam3D (SC3D) has been developed to provide enhanced situational awareness for operators of a remotely piloted vehicle. SC3D is a Hybrid Synthetic Vision System (HSVS) that combines live sensor data with information from a Synthetic Vision System (SVS). By combining the dual information sources, the operators are afforded the advantages of each approach. The live sensor system provides real-time information for the region of interest. The SVS provides information rich visuals that will function under all weather and visibility conditions. Additionally, the combination of technologies allows the system to circumvent some of the limitations from each approach. Video sensor systems are not very useful when visibility conditions are hampered by rain, snow, sand, fog, and smoke, while a SVS can suffer from data freshness problems. Typically, an aircraft or satellite flying overhead collects the data used to create the SVS visuals. The SVS data could have been collected weeks, months, or even years ago. To that extent, the information from an SVS visual could be outdated and possibly inaccurate. SC3D was used in the remote cockpit during flight tests of the X-38 132 and 131R vehicles at the NASA Dryden Flight Research Center. SC3D was also used during the operation of military Unmanned Aerial Vehicles. This presentation will provide an overview of the system, the evolution of the system, the results of flight tests, and future plans. Furthermore, the safety benefits of the SC3D over traditional and pure synthetic vision systems will be discussed.
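
    The hybrid idea, weighting the live sensor image against the synthetic rendering according to how much the camera can currently see, can be sketched as a simple per-frame blend. This is an illustration of the concept, not the SC3D implementation, and the visibility factor is assumed to come from elsewhere.

        # Conceptual blend of a live sensor frame with a synthetic-vision rendering;
        # the visibility estimate (0 = no visibility, 1 = clear air) is assumed to
        # come from elsewhere. This is not the SC3D implementation.
        import numpy as np

        def hybrid_frame(live, synthetic, visibility):
            w = float(np.clip(visibility, 0.0, 1.0))
            blended = w * live.astype(np.float32) + (1.0 - w) * synthetic.astype(np.float32)
            return blended.astype(np.uint8)

        live = np.full((480, 640, 3), 40, dtype=np.uint8)        # dim, fog-degraded camera view
        synthetic = np.full((480, 640, 3), 180, dtype=np.uint8)  # terrain rendering
        print(hybrid_frame(live, synthetic, visibility=0.25)[0, 0])   # mostly synthetic content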

  8. Introduction and Testing of a Monitoring and Colony-Mapping Method for Waterbird Populations That Uses High-Speed and Ultra-Detailed Aerial Remote Sensing

    PubMed Central

    Bakó, Gábor; Tolnai, Márton; Takács, Ádám

    2014-01-01

    Remote sensing is a method that collects data of the Earth's surface without causing disturbances. Thus, it is worthwhile to use remote sensing methods to survey endangered ecosystems, as the studied species will behave naturally while undisturbed. The latest passive optical remote sensing solutions permit surveys from long distances. State-of-the-art highly sensitive sensor systems allow high spatial resolution image acquisition at high altitudes and at high flying speeds, even in low-visibility conditions. As the aerial imagery captured by an airplane covers the entire study area, all the animals present in that area can be recorded. A population assessment is conducted by visual interpretations of an ortho image map. The basic objective of this study is to determine whether small- and medium-sized bird species are recognizable in the ortho images by using high spatial resolution aerial cameras. The spatial resolution needed for identifying the bird species in the ortho image map was studied. The survey was adjusted to determine the number of birds in a colony at a given time. PMID:25046012

  9. A Spatial-Spectral Approach for Visualization of Vegetation Stress Resulting from Pipeline Leakage.

    PubMed

    Van der Werff, Harald; Van der Meijde, Mark; Jansma, Fokke; Van der Meer, Freek; Groothuis, Gert Jan

    2008-06-04

    Hydrocarbon leakage into the environment has large economic and environmental impact. Traditional methods for investigating seepages and their resulting pollution, such as drilling, are destructive, time-consuming and expensive. Remote sensing is an efficient tool that offers a non-destructive investigation method. Optical remote sensing has been extensively tested for exploration of onshore hydrocarbon reservoirs and detection of hydrocarbons at the Earth's surface. In this research, we investigate indirect manifestations of pipeline leakage by way of visualizing vegetation anomalies in airborne hyperspectral imagery. Agricultural land-use causes a heterogeneous land cover; variation in red edge position between fields was much larger than in-field red edge position variation that could be related to hydrocarbon pollution. A moving and growing kernel procedure was developed to normalize red edge values relative to values of neighbouring pixels to enhance pollution-related anomalies in the image. Comparison of the spatial distribution of anomalies with geochemical data obtained by drilling showed that 8 out of 10 polluted sites were predicted correctly while 2 out of 30 sites that were predicted clean were actually polluted.
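
    A much-simplified sketch of the normalization step described above: each pixel's red edge position is compared with the mean of a local window so that within-field anomalies stand out. The paper's kernel also grows adaptively; here the window size is fixed and the red edge image is synthetic.

        # Simplified anomaly enhancement: subtract the local mean red edge position
        # (REP) so within-field deviations stand out. The paper's kernel also grows
        # adaptively; here the window is fixed and the REP image is synthetic.
        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_rep_anomaly(rep, window=15):
            """Return REP minus its local mean; negative values suggest stress."""
            return rep - uniform_filter(rep.astype(np.float64), size=window)

        rng = np.random.default_rng(2)
        rep = 720.0 + rng.normal(0.0, 0.5, size=(200, 200))   # synthetic REP image (nm)
        rep[90:110, 90:110] -= 3.0                            # a locally stressed patch
        anomaly = local_rep_anomaly(rep)
        print("strongest negative anomaly at:",
              np.unravel_index(np.argmin(anomaly), anomaly.shape))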

  10. A Spatial-Spectral Approach for Visualization of Vegetation Stress Resulting from Pipeline Leakage

    PubMed Central

    van der Werff, Harald; van der Meijde, Mark; Jansma, Fokke; van der Meer, Freek; Groothuis, Gert Jan

    2008-01-01

    Hydrocarbon leakage into the environment has large economic and environmental impact. Traditional methods for investigating seepages and their resulting pollution, such as drilling, are destructive, time-consuming and expensive. Remote sensing is an efficient tool that offers a non-destructive investigation method. Optical remote sensing has been extensively tested for exploration of onshore hydrocarbon reservoirs and detection of hydrocarbons at the Earth's surface. In this research, we investigate indirect manifestations of pipeline leakage by way of visualizing vegetation anomalies in airborne hyperspectral imagery. Agricultural land-use causes a heterogeneous land cover; variation in red edge position between fields was much larger than in-field red edge position variation that could be related to hydrocarbon pollution. A moving and growing kernel procedure was developed to normalize red edge values relative to values of neighbouring pixels to enhance pollution-related anomalies in the image. Comparison of the spatial distribution of anomalies with geochemical data obtained by drilling showed that 8 out of 10 polluted sites were predicted correctly while 2 out of 30 sites that were predicted clean were actually polluted. PMID:27879905

  11. Is This Real Life? Is This Just Fantasy?: Realism and Representations in Learning with Technology

    NASA Astrophysics Data System (ADS)

    Sauter, Megan Patrice

    Students often engage in hands-on activities during science learning; however, financial and practical constraints often limit the availability of these activities. Recent advances in technology have led to increases in the use of simulations and remote labs, which attempt to recreate hands-on science learning via computer. Remote labs and simulations are interesting from a cognitive perspective because they allow for different relations between representations and their referents. Remote labs are unique in that they provide a yoked representation, meaning that the representation of the lab on the computer screen is actually linked to that which it represents: a real scientific device. Simulations merely represent the lab and are not connected to any real scientific devices. However, the type of visual representations used in the lab may modify the effects of the lab technology. The purpose of this dissertation is to examine the relation between representation and technology and its effects on students' psychological experiences using online science labs. Undergraduates participated in two studies that investigated the relation between technology and representation. In the first study, participants performed either a remote lab or a simulation incorporating one of two visual representations, either a static image or a video of the equipment. Although participants in both lab conditions learned, participants in the remote lab condition had more authentic experiences. However, effects were moderated by the realism of the visual representation. Participants who saw a video were more invested and felt the experience was more authentic. In a second study, participants performed a remote lab and either saw the same video as in the first study, an animation, or the video and an animation. Most participants had an authentic experience because both representations evoked strong feelings of presence. However, participants who saw the video were more likely to believe the remote technology was real. Overall, the findings suggest that participants' experiences with technology were shaped by representation. Students had more authentic experiences using the remote lab than the simulation. However, incorporating visual representations that enhance presence made these experiences even more authentic and meaningful than afforded by the technology alone.

  12. Panoramic-image-based rendering solutions for visualizing remote locations via the web

    NASA Astrophysics Data System (ADS)

    Obeysekare, Upul R.; Egts, David; Bethmann, John

    2000-05-01

    With advances in panoramic image-based rendering techniques and the rapid expansion of web advertising, new techniques are emerging for visualizing remote locations on the WWW. Success of these techniques depends on how easy and inexpensive it is to develop a new type of web content that provides pseudo 3D visualization at home, 24-hours a day. Furthermore, the acceptance of this new visualization medium depends on the effectiveness of the familiarization tools by a segment of the population that was never exposed to this type of visualization. This paper addresses various hardware and software solutions available to collect, produce, and view panoramic content. While cost and effectiveness of building the content is being addressed using a few commercial hardware solutions, effectiveness of familiarization tools is evaluated using a few sample data sets.

  13. An optimized web-based approach for collaborative stereoscopic medical visualization

    PubMed Central

    Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C

    2013-01-01

    Objective: Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods: Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results: We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion: The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions: Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008
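
    The general pattern, server-side rendering delivered to an unmodified browser, can be sketched with a minimal motion-JPEG stream; this illustrates the approach, not CoWebViz's implementation, and it assumes Flask, NumPy, and Pillow are available.

        # Minimal motion-JPEG stream of server-rendered frames to a plain browser;
        # an illustration of the general pattern, not CoWebViz itself.
        import io
        import numpy as np
        from PIL import Image
        from flask import Flask, Response

        app = Flask(__name__)

        def render_frame(t):
            """Stand-in for a server-side visualization renderer (240 x 320, 8-bit)."""
            x = np.linspace(0, 2 * np.pi, 320)
            row = (127 + 127 * np.sin(x + 0.1 * t)).astype(np.uint8)
            return np.tile(row, (240, 1))

        def mjpeg():
            t = 0
            while True:
                buf = io.BytesIO()
                Image.fromarray(render_frame(t)).save(buf, format="JPEG")
                yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
                       + buf.getvalue() + b"\r\n")
                t += 1

        @app.route("/stream")
        def stream():
            return Response(mjpeg(), mimetype="multipart/x-mixed-replace; boundary=frame")

        if __name__ == "__main__":
            app.run(port=8080)   # open http://localhost:8080/stream in any browser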

  14. Demonstrating NaradaBrokering as a Middleware Fabric for Grid-based Remote Visualization Services

    NASA Astrophysics Data System (ADS)

    Pallickara, S.; Erlebacher, G.; Yuen, D.; Fox, G.; Pierce, M.

    2003-12-01

    Remote Visualization Services (RVS) have tended to rely on approaches based on the client server paradigm. Here we demonstrate our approach - based on a distributed brokering infrastructure, NaradaBrokering [1] - that relies on distributed, asynchronous and loosely coupled interactions to meet the requirements and constraints of RVS. In our approach to RVS, services advertise their capabilities to the broker network that manages these service advertisements. Among the services considered within our system are those that perform graphic transformations, mediate access to specialized datasets and finally those that manage the execution of specified tasks. There could be multiple instances of each of these services and the system ensures that load for a given service is distributed efficiently over these service instances. We will demonstrate implementation of concepts that we outlined in the oral presentation. This would involve two or more visualization servers interacting asynchronously with multiple clients through NaradaBrokering. The communicating entities may exchange SOAP [2] (Simple Object Access Protocol) messages. SOAP is a lightweight protocol for exchange of information in a decentralized, distributed environment. It is an XML based protocol that consists of three parts: an envelope that describes what is in a message and how to process it, rules for expressing instances of application-defined data types, and a convention for representing remote invocation related operations. Furthermore, we will also demonstrate how clients can retrieve their results after prolonged disconnects or after any failures that might have taken place. The entities, services and clients alike, are not limited by the geographical distances that separate them. We are planning to test this system in the context of trans-Atlantic links separating interacting entities. [1] The NaradaBrokering Project: http://www.naradabrokering.org [2] Newcomer, E., 2002, Understanding web services: XML, WSDL, SOAP, and UDDI, Addison Wesley Professional.
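
    For readers unfamiliar with SOAP's three parts (envelope, typed body, RPC convention), the fragment below shows a hand-rolled request posted over HTTP. The operation name and endpoint URL are hypothetical and are not part of NaradaBrokering's interface.

        # Illustrative SOAP 1.1 request showing envelope, typed body, and an
        # RPC-style operation. The operation name and endpoint URL are hypothetical.
        import urllib.request

        ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
          <soap:Body>
            <RenderRequest xmlns="urn:example:rvs">
              <datasetId>demo-volume-01</datasetId>
              <isosurfaceLevel>0.35</isosurfaceLevel>
            </RenderRequest>
          </soap:Body>
        </soap:Envelope>"""

        def send_soap(url="http://localhost:8080/rvs"):   # hypothetical endpoint
            req = urllib.request.Request(
                url,
                data=ENVELOPE.encode("utf-8"),
                headers={"Content-Type": "text/xml; charset=utf-8",
                         "SOAPAction": "urn:example:rvs#RenderRequest"},
                method="POST")
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.read().decode("utf-8")

        # send_soap() would POST the request to a live service and return its reply.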

  15. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-11-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
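
    The polar view that such packages produce can be reproduced generically with matplotlib: view zenith as radius, relative azimuth as angle, reflectance as color. The sketch below uses synthetic reflectance values rather than CAR data, and its function is not part of PolarBRDF's API.

        # Generic polar BRDF view (view zenith = radius, relative azimuth = angle,
        # reflectance = color) with synthetic values; not PolarBRDF's API or CAR data.
        import numpy as np
        import matplotlib.pyplot as plt

        def polar_brdf_plot(zenith_deg, azimuth_deg, reflectance):
            fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
            sc = ax.scatter(np.deg2rad(azimuth_deg), zenith_deg,
                            c=reflectance, s=12, cmap="viridis")
            ax.set_theta_zero_location("N")          # principal plane pointing up
            fig.colorbar(sc, ax=ax, label="reflectance")
            return fig

        rng = np.random.default_rng(3)
        zen = rng.uniform(0, 70, 2000)                       # view zenith angle (deg)
        azi = rng.uniform(0, 360, 2000)                      # relative azimuth (deg)
        refl = 0.25 + 0.15 * np.cos(np.deg2rad(azi)) * (zen / 70.0)   # synthetic pattern
        polar_brdf_plot(zen, azi, refl)
        plt.show()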

  16. PolarBRDF: A general purpose Python package for visualization and quantitative analysis of multi-angular remote sensing measurements

    NASA Astrophysics Data System (ADS)

    Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.

    2016-12-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR; http://car.gsfc.nasa.gov/). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  17. PolarBRDF: A General Purpose Python Package for Visualization and Quantitative Analysis of Multi-Angular Remote Sensing Measurements

    NASA Technical Reports Server (NTRS)

    Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh

    2016-01-01

    The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.

  18. Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays

    NASA Astrophysics Data System (ADS)

    Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko

    The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real time with, and was guided by, researchers in the remote visual analytics laboratory, who swiftly sifted through incoming crowdsourced data to identify viable target locations for archaeological sites.
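
    One common way to tame such noisy inputs before visual analysis, not necessarily the authors' pipeline, is a simple spatial consensus filter: bin volunteer markers into a geographic grid and keep only cells tagged by several independent contributors. The coordinates below are synthetic.

        # Simple spatial consensus filter (not the authors' pipeline): bin crowd
        # markers into a grid and keep cells tagged by several independent
        # contributors. Coordinates below are synthetic.
        import numpy as np

        def consensus_cells(lons, lats, cell_deg=0.001, min_votes=5):
            lon_edges = np.arange(lons.min(), lons.max() + cell_deg, cell_deg)
            lat_edges = np.arange(lats.min(), lats.max() + cell_deg, cell_deg)
            counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
            ij = np.argwhere(counts >= min_votes)
            return [(lon_edges[i] + cell_deg / 2, lat_edges[j] + cell_deg / 2)
                    for i, j in ij]                     # cell centres worth inspecting

        rng = np.random.default_rng(4)
        noise_lon = rng.uniform(103.0, 103.1, 500)          # scattered, unreliable tags
        noise_lat = rng.uniform(47.0, 47.1, 500)
        hit_lon = rng.normal(103.05, 1e-4, 30)              # many tags on one feature
        hit_lat = rng.normal(47.05, 1e-4, 30)
        print(consensus_cells(np.r_[noise_lon, hit_lon], np.r_[noise_lat, hit_lat]))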

  19. Nomad rover field experiment, Atacama Desert, Chile 1. Science results overview

    NASA Astrophysics Data System (ADS)

    Cabrol, N. A.; Thomas, G.; Witzke, B.

    2001-04-01

    Nomad was deployed for a 45-day traverse in the Atacama Desert, Chile, during the summer of 1997. During this traverse, 1 week was devoted to science experiments. The goal of the science experiments was to test different planetary surface exploration strategies that included (1) a Mars mission simulation; (2) a science-on-the-fly experiment, in which the rover was kept moving 75% of the operation time, intended to determine whether successful interpretation of the environment is related to the time spent on a target and to assess the role of mobility in aiding that interpretation; (3) a meteorite search using visual and instrumental methods to remotely identify meteorites in extreme environments; and (4) a time-delay experiment with and without using the panospheric camera. The results were as follows: the remote science team positively identified the main characteristics of the test site geological environment. The science-on-the-fly experiment showed that the selection of appropriate targets might be even more critical than the time spent on a study area to reconstruct the history of a site. During the same operation, the science team members identified and sampled a rock from a Jurassic outcrop that they proposed to be a fossil. The presence of paleolife indicators in this rock was confirmed later by laboratory analysis. Both visual and instrumental modes demonstrated the feasibility, in at least some conditions, of carrying out a field search for meteorites by using remote-controlled vehicles. Finally, metrics collected from the observation of the science team operations, and the use team members made of mission data, provided critical information on what operation sequences could be automated on board rovers in future planetary surface explorations.

  20. Association of Visual Impairment and All-Cause 10-Year Mortality Among Indigenous Australian Individuals Within Central Australia: The Central Australian Ocular Health Study.

    PubMed

    Ng, Soo Khai; Kahawita, Shyalle; Andrew, Nicholas Howard; Henderson, Tim; Craig, Jamie Evan; Landers, John

    2018-05-01

    It is well established from different population-based studies that visual impairment is associated with increased mortality rate. However, to our knowledge, the association of visual impairment with increased mortality rate has not been reported among indigenous Australian individuals. To assess the association between visual impairment and 10-year mortality risk among the remote indigenous Australian population. Prospective cohort study recruiting indigenous Australian individuals from 30 remote communities located within the central Australian statistical local area over a 36-month period between July 2005 and June 2008. The data were analyzed in January 2017. Visual acuity, slitlamp biomicroscopy, and fundus examination were performed on all patients at recruitment. Visual impairment was defined as a visual acuity of less than 6/12 in the better eye. Mortality rate and mortality cause were obtained at 10 years, and statistical analyses were performed. Hazard ratios for 10-year mortality with 95% confidence intervals are presented. One thousand three hundred forty-seven patients were recruited from a total target population of 2014. The mean (SD) age was 56 (11) years, and 62% were women. The total all-cause mortality was found to be 29.3% at 10 years. This varied from 21.1% among those without visual impairment to 48.5% among those with visual impairment. After adjustment for age, sex, and the presence of diabetes and hypertension, those with visual impairment were 40% more likely to die (hazard ratio, 1.40; 95% CI, 1.16-1.70; P = .001) during the 10-year follow-up period compared with those with normal vision. Bilateral visual impairment among remote indigenous Australian individuals was associated with 40% higher 10-year mortality risk compared with those who were not visually impaired. Resource allocation toward improving visual acuity may therefore aid in closing the gap in mortality outcomes between indigenous and nonindigenous Australian individuals.

  1. Real-time, interactive, visually updated simulator system for telepresence

    NASA Technical Reports Server (NTRS)

    Schebor, Frederick S.; Turney, Jerry L.; Marzwell, Neville I.

    1991-01-01

    Time delays and limited sensory feedback of remote telerobotic systems tend to disorient teleoperators and dramatically decrease the operator's performance. To remove the effects of time delays, key components were designed and developed of a prototype forward simulation subsystem, the Global-Local Environment Telerobotic Simulator (GLETS) that buffers the operator from the remote task. GLETS totally immerses an operator in a real-time, interactive, simulated, visually updated artificial environment of the remote telerobotic site. Using GLETS, the operator will, in effect, enter into a telerobotic virtual reality and can easily form a gestalt of the virtual 'local site' that matches the operator's normal interactions with the remote site. In addition to use in space based telerobotics, GLETS, due to its extendable architecture, can also be used in other teleoperational environments such as toxic material handling, construction, and undersea exploration.

  2. A Fresh Look at Spatio-Temporal Remote Sensing Data: Data Formats, Processing Flow, and Visualization

    NASA Astrophysics Data System (ADS)

    Gens, R.

    2017-12-01

    With an increasing number of experimental and operational satellites in orbit, remote sensing based mapping and monitoring of the dynamic Earth has entered the realm of 'big data'. The Landsat series of satellites alone provides a near-continuous archive of 45 years of data. The availability of such spatio-temporal datasets has created opportunities for long-term monitoring of diverse features and processes operating in the Earth's terrestrial and aquatic systems. Processes such as erosion, deposition, subsidence, uplift, evapotranspiration, urbanization, and land-cover regime shifts can not only be monitored, but the associated change can also be quantified, using time-series data analysis. This unique opportunity comes with new challenges in management, analysis, and visualization of spatio-temporal datasets. Data need to be stored in a user-friendly format, and relevant metadata need to be recorded, to allow maximum flexibility for data exchange and use. Specific data processing workflows need to be defined to support time-series analysis for specific applications. Value-added data products need to be generated keeping in mind the needs of the end-users, and using best practices in complex data visualization. This presentation systematically highlights the various steps for preparing spatio-temporal remote sensing data for time-series analysis. It showcases a prototype workflow for remote-sensing-based change detection that can be generically applied while preserving the application-specific fidelity of the datasets. The prototype includes strategies for visualizing change over time. This has been exemplified using a time series of optical and SAR images for visualizing the changing glacial, coastal, and wetland landscapes in parts of Alaska.
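
    One concrete building block of such a workflow is per-pixel trend estimation over a stack of co-registered images; the sketch below fits a linear trend per pixel with NumPy and reports the slope as change per year, using a synthetic stack in place of real composites.

        # Per-pixel linear trend over a stack of co-registered images: the slope map
        # quantifies "change per year". The stack here is synthetic.
        import numpy as np

        def per_pixel_trend(stack, years):
            """stack: (T, H, W) array; returns an (H, W) slope map in units per year."""
            t, h, w = stack.shape
            slopes, _ = np.polyfit(np.asarray(years, dtype=float),
                                   stack.reshape(t, h * w), 1)
            return slopes.reshape(h, w)

        years = np.arange(2000, 2010)
        rng = np.random.default_rng(5)
        stack = rng.normal(0.3, 0.02, size=(10, 50, 50))                  # e.g. NDVI composites
        stack[:, 10:20, 10:20] += 0.01 * (years - 2000)[:, None, None]    # a "greening" patch
        print(round(per_pixel_trend(stack, years)[15, 15], 3))            # close to 0.01 per year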

  3. Assessing disease stress and modeling yield losses in alfalfa

    NASA Astrophysics Data System (ADS)

    Guan, Jie

    Alfalfa is the most important forage crop in the U.S. and worldwide. Fungal foliar diseases are believed to cause significant yield losses in alfalfa, yet little quantitative information exists regarding the amount of crop loss. Different fungicides and application frequencies were used as tools to generate a range of foliar disease intensities in Ames and Nashua, IA. Visual disease assessments (disease incidence, disease severity, and percentage defoliation) were obtained weekly for each alfalfa growth cycle (two to three growing cycles per season). Remote sensing assessments were performed using a hand-held, multispectral radiometer to measure the amount and quality of sunlight reflected from alfalfa canopies. Factors such as incident radiation, sun angle, sensor height, and leaf wetness were all found to significantly affect the percentage reflectance of sunlight reflected from alfalfa canopies. The precision of visual and remote sensing assessment methods was quantified. Precision was defined as the intra-rater repeatability and inter-rater reliability of assessment methods. F-tests, slopes, intercepts, and coefficients of determination (R2) were used to compare assessment methods for precision. Results showed that among the three visual disease assessment methods (disease incidence, disease severity, and percentage defoliation), percentage defoliation had the highest intra-rater repeatability and inter-rater reliability. The remote sensing assessment method had better precision than the percentage defoliation assessment method, based upon higher intra-rater repeatability and inter-rater reliability. Significant linear relationships between canopy reflectance (810 nm), percentage defoliation and yield were detected using linear regression, and percentage reflectance (810 nm) assessments were found to have a stronger relationship with yield than percentage defoliation assessments. There were also significant linear relationships between percentage defoliation, dry weight, percentage reflectance (810 nm), and green leaf area index (GLAI). Percentage reflectance (810 nm) assessments had a stronger relationship with dry weight and green leaf area index than percentage defoliation assessments. This research conclusively demonstrates that percentage reflectance measurements can be used to nondestructively assess green leaf area index, which is a direct measure of plant health and an indirect measure of productivity, and that remote sensing is superior to visual assessment for assessing alfalfa stress and for modeling yield and GLAI in the alfalfa foliar disease pathosystem.
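
    The reported relationships are of the simple linear-regression kind; the snippet below shows how slope, intercept, and R2 of yield against 810 nm reflectance would be computed, using fabricated numbers rather than the study's data.

        # Illustrative regression of yield on 810 nm reflectance with synthetic
        # numbers (not the study's data): report slope, intercept, and R^2.
        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(6)
        reflect_810 = rng.uniform(20, 45, 60)                             # % reflectance at 810 nm
        yield_t_ha = 1.0 + 0.08 * reflect_810 + rng.normal(0, 0.25, 60)   # fabricated yields

        fit = linregress(reflect_810, yield_t_ha)
        print(f"slope={fit.slope:.3f}  intercept={fit.intercept:.3f}  R2={fit.rvalue ** 2:.2f}")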

  4. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1991-01-01

    A program of research embracing teleoperator and automatic navigational control of freely flying satellite robots is presented. Current research goals include: (1) developing visual operator interfaces for improved vehicle teleoperation; (2) determining the effects of different visual interface system designs on operator performance; and (3) achieving autonomous vision-based vehicle navigation and control. This research program combines virtual-environment teleoperation studies and neutral-buoyancy experiments using a space-robot simulator vehicle currently under development. Visual-interface design options under investigation include monoscopic versus stereoscopic displays and cameras, helmet-mounted versus panel-mounted display monitors, head-tracking versus fixed or manually steerable remote cameras, and the provision of vehicle-fixed visual cues, or markers, in the remote scene for improved sensing of vehicle position, orientation, and motion.

  5. Integrating Time-Synchronized Video with Other Geospatial and Temporal Data for Remote Science Operations

    NASA Technical Reports Server (NTRS)

    Cohen, Tamar E.; Lees, David S.; Deans, Matthew C.; Lim, Darlene S. S.; Lee, Yeon Jin Grace

    2018-01-01

    Exploration Ground Data Systems (xGDS) supports rapid scientific decision making by synchronizing video in context with map, instrument data visualization, geo-located notes and any other collected data. xGDS is an open source web-based software suite developed at NASA Ames Research Center to support remote science operations in analog missions and prototype solutions for remote planetary exploration. (See Appendix B) Typical video systems are designed to play or stream video only, independent of other data collected in the context of the video. Providing customizable displays for monitoring live video and data as well as replaying recorded video and data helps end users build up a rich situational awareness. xGDS was designed to support remote field exploration with unreliable networks. Commercial digital recording systems operate under the assumption that there is a stable and reliable network between the source of the video and the recording system. In many field deployments and space exploration scenarios, this is not the case - there are both anticipated and unexpected network losses. xGDS' Video Module handles these interruptions, storing the available video, organizing and characterizing the dropouts, and presenting the video for streaming or replay to the end user including visualization of the dropouts. Scientific instruments often require custom or expensive software to analyze and visualize collected data. This limits the speed at which the data can be visualized and limits access to the data to those users with the software. xGDS' Instrument Module integrates with instruments that collect and broadcast data in a single snapshot or that continually collect and broadcast a stream of data. While seeing a visualization of collected instrument data is informative, showing the context for the collected data, other data collected nearby along with events indicating current status helps remote science teams build a better understanding of the environment. Further, sharing geo-located, tagged notes recorded by the scientists and others on the team spurs deeper analysis of the data.

  6. A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment.

    PubMed

    Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M

    2016-01-26

    Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
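
    A stripped-down velocity-threshold classifier conveys the flavor of the event detection described above. It omits the paper's geometric vergence correction and assumes gaze is already expressed as visual angles sampled at a fixed rate; the 30 deg/s threshold is a conventional choice, not the authors' value.

        # Stripped-down velocity-threshold (I-VT style) classifier for gaze samples.
        # It omits the paper's geometric vergence correction and assumes gaze is
        # already expressed as visual angles sampled at a fixed rate.
        import numpy as np

        def classify_gaze(angles_deg, fs_hz, saccade_thresh_deg_s=30.0):
            """angles_deg: (N, 2) horizontal/vertical gaze angles; returns N labels."""
            velocity = np.linalg.norm(np.diff(angles_deg, axis=0), axis=1) * fs_hz
            labels = np.where(velocity > saccade_thresh_deg_s, "saccade", "fixation")
            return np.append(labels, labels[-1])       # align labels with samples

        fs = 500.0                                      # sampling rate (Hz)
        fixation = np.tile([5.0, 0.0], (250, 1))
        saccade = np.column_stack([np.linspace(5.0, 15.0, 25), np.zeros(25)])  # 10 deg in 50 ms
        gaze = np.vstack([fixation, saccade, np.tile([15.0, 0.0], (250, 1))])
        labels = classify_gaze(gaze, fs)
        print(int((labels == "saccade").sum()), "samples labeled as saccade")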

  7. A miniature cable-driven robot for crawling on the heart.

    PubMed

    Patronik, N A; Zenati, M A; Riviere, C N

    2005-01-01

    This document describes the design and preliminary testing of a cable-driven robot for the purpose of traveling on the surface of the beating heart to administer therapy. This methodology obviates mechanical stabilization and lung deflation, which are typically required during minimally invasive cardiac surgery. Previous versions of the robot have been remotely actuated through push-pull wires, while visual feedback was provided by fiber optic transmission. Although these early models were able to perform locomotion in vivo on porcine hearts, the stiffness of the wire-driven transmission and fiber optic camera limited the mobility of the robots. The new prototype described in this document is actuated by two antagonistic cable pairs, and contains a color CCD camera located in the front section of the device. These modifications have resulted in superior mobility and visual feedback. The cable-driven prototype has successfully demonstrated prehension, locomotion, and tissue dye injection during in vitro testing with a poultry model.

  8. A New Definition for Ground Control

    NASA Technical Reports Server (NTRS)

    2002-01-01

    LandForm(R) VisualFlight(R) blends the power of a geographic information system with the speed of a flight simulator to transform a user's desktop computer into a "virtual cockpit." The software product, which is fully compatible with all Microsoft(R) Windows(R) operating systems, provides distributed, real-time three-dimensional flight visualization over a host of networks. From a desktop, a user can immediately obtain a cockpit view, a chase-plane view, or an airborne tracker view. A customizable display also allows the user to overlay various flight parameters, including latitude, longitude, altitude, pitch, roll, and heading information. Rapid Imaging Software sought assistance from NASA, and the VisualFlight technology came to fruition under a Phase II SBIR contract with Johnson Space Center in 1998. Three years later, on December 13, 2001, Ken Ham successfully flew NASA's X-38 spacecraft from a remote, ground-based cockpit using LandForm VisualFlight as part of his primary situation awareness display in a flight test at Edwards Air Force Base, California.

  9. Image fusion based on Bandelet and sparse representation

    NASA Astrophysics Data System (ADS)

    Zhang, Jiuxing; Zhang, Wei; Li, Xuzhi

    2018-04-01

    The Bandelet transform can capture geometrically regular directions and geometric flow, and sparse representation can represent signals with as few atoms as possible on an over-complete dictionary; both can be used for image fusion. Therefore, a new fusion method based on the Bandelet transform and sparse representation is proposed to fuse the Bandelet coefficients of multi-source images and obtain high-quality fusion results. Tests were performed on remote sensing images and simulated multi-focus images; experimental results show that the performance of the new method is better than that of the comparison methods according to objective evaluation indexes and subjective visual effects.
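
    Bandelet transforms and trained sparse dictionaries are not available in common Python libraries, so the sketch below substitutes an ordinary wavelet decomposition with the usual "average the approximations, keep the maximum-magnitude details" rule; it illustrates transform-domain coefficient fusion in general, not the proposed method.

        # Simplified stand-in for the proposed method: fuse two co-registered images
        # in an ordinary wavelet domain (average approximations, keep the
        # maximum-magnitude detail coefficients).
        import numpy as np
        import pywt

        def wavelet_fuse(img_a, img_b, wavelet="db2", level=2):
            ca = pywt.wavedec2(img_a.astype(float), wavelet, level=level)
            cb = pywt.wavedec2(img_b.astype(float), wavelet, level=level)
            fused = [(ca[0] + cb[0]) / 2.0]                       # approximation: average
            for det_a, det_b in zip(ca[1:], cb[1:]):              # details: max magnitude
                fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                                   for a, b in zip(det_a, det_b)))
            return pywt.waverec2(fused, wavelet)

        rng = np.random.default_rng(7)
        left_sharp = rng.random((128, 128));  left_sharp[:, 64:] *= 0.2   # detail on the left
        right_sharp = rng.random((128, 128)); right_sharp[:, :64] *= 0.2  # detail on the right
        print(wavelet_fuse(left_sharp, right_sharp).shape)                # fused image shape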

  10. Flow visualization study of the HiMAT RPRV

    NASA Technical Reports Server (NTRS)

    Lorincz, D. J.

    1980-01-01

    Water tunnel studies were performed to qualitatively define the flow field of the highly maneuverable aircraft technology remotely piloted research vehicle (HiMAT RPRV). Particular emphasis was placed on defining the vortex flows generated at high angles of attack. The flow visualization tests were conducted in the Northrop water tunnel using a 1/15 scale model of the HiMAT RPRV. Flow visualization photographs were obtained for angles of attack up to 40 deg and sideslip angles up to 5 deg. The HiMAT model was investigated in detail to determine the canard and wing vortex flow field development, vortex paths, and vortex breakdown characteristics as a function of angle of attack and sideslip. The presence of the canard caused the wing vortex to form further outboard and delayed the breakdown of the wing vortex to higher angles of attack. An increase in leading edge camber of the maneuver configuration delayed both the formation and the breakdown of the wing and canard vortices. Additional tests showed that the canard vortex was sensitive to variations in inlet mass flow ratio and canard flap deflection angle.

  11. 3D visualization of optical ray aberration and its broadcasting to smartphones by ray aberration generator

    NASA Astrophysics Data System (ADS)

    Hellman, Brandon; Bosset, Erica; Ender, Luke; Jafari, Naveed; McCann, Phillip; Nguyen, Chris; Summitt, Chris; Wang, Sunglin; Takashima, Yuzuru

    2017-11-01

    The ray formalism is critical to understanding light propagation, yet current pedagogy relies on inadequate 2D representations. We present a system in which real light rays are visualized through an optical system by using a collimated laser bundle of light and a fog chamber. Implementation for remote and immersive access is enabled by leveraging a commercially available 3D viewer and gesture-based remote controlling of the tool via bi-directional communication over the Internet.

  12. Effects of heavy ions on visual function and electrophysiology of rodents: the ALTEA-MICE project

    NASA Technical Reports Server (NTRS)

    Sannita, W. G.; Acquaviva, M.; Ball, S. L.; Belli, F.; Bisti, S.; Bidoli, V.; Carozzo, S.; Casolino, M.; Cucinotta, F.; De Pascale, M. P.

    2004-01-01

    ALTEA-MICE will supplement the ALTEA project on astronauts and provide information on the functional visual impairment possibly induced by heavy ions during prolonged operations in microgravity. Goals of ALTEA-MICE are: (1) to investigate the effects of heavy ions on the visual system of normal and mutant mice with retinal defects; (2) to define reliable experimental conditions for space research; and (3) to develop animal models to study the physiological consequences of space travel on humans. A remotely controlled mouse setup, applied electrophysiological recording methods, remote particle monitoring, and experimental procedures were developed and tested. The project has proved feasible under laboratory-controlled conditions comparable in important aspects to those of astronauts' exposure to particles in space. Experiments are performed at the Brookhaven National Laboratories [BNL] (Upton, NY, USA) and the Gesellschaft für Schwerionenforschung mbH [GSI]/Biophysik (Darmstadt, FRG) to identify possible electrophysiological changes and/or activation of protective mechanisms in response to pulsed radiation. Offline data analyses are in progress and observations are still anecdotal. Electrophysiological changes after pulsed radiation are within the limits of spontaneous variability under anesthesia, with only indirect evidence of possible retinal/cortical responses. Immunostaining showed changes (e.g. increased expression of FGF2 protein in the outer nuclear layer) suggesting a retinal stress reaction to high-energy particles of potential relevance in space. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  13. 2005 AG20/20 Annual Review

    NASA Technical Reports Server (NTRS)

    Ross, Kenton W.; McKellip, Rodney D.

    2005-01-01

    Topics covered include: Implementation and Validation of Sensor-Based Site-Specific Crop Management; Enhanced Management of Agricultural Perennial Systems (EMAPS) Using GIS and Remote Sensing; Validation and Application of Geospatial Information for Early Identification of Stress in Wheat; Adapting and Validating Precision Technologies for Cotton Production in the Mid-Southern United States - 2004 Progress Report; Development of a System to Automatically Geo-Rectify Images; Economics of Precision Agriculture Technologies in Cotton Production-AG 2020 Prescription Farming Automation Algorithms; Field Testing a Sensor-Based Applicator for Nitrogen and Phosphorus Application; Early Detection of Citrus Diseases Using Machine Vision and DGPS; Remote Sensing of Citrus Tree Stress Levels and Factors; Spectral-based Nitrogen Sensing for Citrus; Characterization of Tree Canopies; In-field Sensing of Shallow Water Tables and Hydromorphic Soils with an Electromagnetic Induction Profiler; Maintaining the Competitiveness of Tree Fruit Production Through Precision Agriculture; Modeling and Visualizing Terrain and Remote Sensing Data for Research and Education in Precision Agriculture; Thematic Soil Mapping and Crop-Based Strategies for Site-Specific Management; and Crop-Based Strategies for Site-Specific Management.

  14. [Investigation on remote measurement of air pollution by a method of infrared passive scanning imaging].

    PubMed

    Jiao, Yang; Xu, Liang; Gao, Min-Guang; Feng, Ming-Chun; Jin, Ling; Tong, Jing-Jing; Li, Sheng

    2012-07-01

    Passive remote sensing by Fourier-transform infrared (FTIR) spectrometry allows detection of air pollution. However, for the localization of a leak and a complete assessment of the situation in the case of the release of a hazardous cloud, information about the position and the distribution of the cloud is essential. Therefore, an imaging passive remote sensing system comprising an interferometer, data acquisition and processing software, a scanning system, a video system, and a personal computer has been developed. Remote sensing of SF6 was performed. The column densities of all directions in which a target compound has been identified may be retrieved by a nonlinear least squares fitting algorithm and a radiative transfer algorithm, and a false color image is displayed. The results were visualized by a video image overlaid with a false color concentration distribution image. The system has high selectivity and allows visualization and quantification of pollutant clouds.
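
    A minimal illustrative sketch of the retrieval step mentioned above, assuming a toy Beer-Lambert forward model with an invented absorption band, cross-section, and noise level; the real system fits measured infrared spectra with a full radiative transfer algorithm, which is not reproduced here.

      # Illustrative only: a toy forward model fitted by nonlinear least squares
      # to recover a column density from a simulated spectrum. The band shape,
      # cross-section and noise level are invented numbers, not the calibration
      # of the imaging FTIR system described above.
      import numpy as np
      from scipy.optimize import least_squares

      wavenumber = np.linspace(940, 956, 200)                          # cm^-1, near an SF6 band
      sigma = 2e-18 * np.exp(-0.5 * ((wavenumber - 948) / 1.5) ** 2)   # cm^2/molecule (toy)
      background = np.ones_like(wavenumber)

      def forward_model(params):
          column_1e16, offset = params                 # column density in 1e16 molecules/cm^2
          return background * np.exp(-sigma * column_1e16 * 1e16) + offset

      rng = np.random.default_rng(0)
      measured = forward_model([5.0, 0.02]) + rng.normal(0, 0.002, wavenumber.size)

      fit = least_squares(lambda p: forward_model(p) - measured, x0=[1.0, 0.0])
      print("retrieved column density: %.2e molecules/cm^2" % (fit.x[0] * 1e16))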

  15. Visual information mining in remote sensing image archives

    NASA Astrophysics Data System (ADS)

    Pelizzari, Andrea; Descargues, Vincent; Datcu, Mihai P.

    2002-01-01

    The present article focuses on the development of interactive exploratory tools for visually mining the image content in large remote sensing archives. Two aspects are treated: the iconic visualization of the global information in the archive and the progressive visualization of the image details. The proposed methods are integrated in the Image Information Mining (I2M) system. The images and image structure in the I2M system are indexed based on a probabilistic approach. The resulting links are managed by a relational data base. Both the intrinsic complexity of the observed images and the diversity of user requests result in a great number of associations in the data base. Thus new tools have been designed to visualize, in iconic representation, the relationships created during a query or information mining operation: the visualization of the query results positioned on the geographical map, a quick-looks gallery, visualization of the measure of goodness of the query, and visualization of the image space for statistical evaluation purposes. Additionally the I2M system is enhanced with progressive detail visualization in order to allow better access for operator inspection. I2M is a three-tier Java architecture and is optimized for the Internet.

  16. VERDEX: A virtual environment demonstrator for remote driving applications

    NASA Technical Reports Server (NTRS)

    Stone, Robert J.

    1991-01-01

    One of the key areas of the National Advanced Robotics Centre's enabling technologies research program is that of the human system interface, phase 1 of which started in July 1989 and is currently addressing the potential of virtual environments to permit intuitive and natural interactions between a human operator and a remote robotic vehicle. The aim of the first 12 months of this program (to September, 1990) is to develop a virtual human-interface demonstrator for use later as a test bed for human factors experimentation. This presentation will describe the current state of development of the test bed, and will outline some human factors issues and problems for more general discussion. In brief, the virtual telepresence system for remote driving has been designed to take the following form. The human operator will be provided with a helmet-mounted stereo display assembly, facilities for speech recognition and synthesis (using the Marconi Macrospeak system), and a VPL DataGlove Model 2 unit. The vehicle to be used for the purposes of remote driving is a Cybermotion Navmaster K2A system, which will be equipped with a stereo camera and microphone pair, mounted on a motorized high-speed pan-and-tilt head incorporating a closed-loop laser ranging sensor for camera convergence control (currently under contractual development). It will be possible to relay information to and from the vehicle and sensory system via an umbilical or RF link. The aim is to develop an interactive audio-visual display system capable of presenting combined stereo TV pictures and virtual graphics windows, the latter featuring control representations appropriate for vehicle driving and interaction using a graphical 'hand,' slaved to the flex and tracking sensors of the DataGlove and an additional helmet-mounted Polhemus IsoTrack sensor. Developments planned for the virtual environment test bed include transfer of operator control between remote driving and remote manipulation, dexterous end effector integration, virtual force and tactile sensing (also the focus of a current ARRL contract, initially employing a 14-pneumatic bladder glove attachment), and sensor-driven world modeling for total virtual environment generation and operator-assistance in remote scene interrogation.

  17. Research on optimal path planning algorithm of task-oriented optical remote sensing satellites

    NASA Astrophysics Data System (ADS)

    Liu, Yunhe; Xu, Shengli; Liu, Fengjing; Yuan, Jingpeng

    2015-08-01

    The GEO task-oriented optical remote sensing satellite is well suited to long-term continuous monitoring and rapid imaging access. With the development of high-resolution optical payload technology and satellite attitude control technology, GEO optical remote sensing satellites will become an important development trend in aerospace remote sensing in the near future. In this paper, we focus on the plane-array staring imaging characteristics of GEO optical remote sensing satellites and on real-time, mission-led Earth observation modes. Aiming to satisfy user needs at minimum maneuver cost, we put forward an optimal path planning algorithm centered on the transformation from geographic coordinate space to the payload's field-of-view plane, thereby reducing the burden on the control system. In this algorithm, a bounded irregular closed area on the ground is transformed, using coordinate transformation relations, into the reference plane of the satellite payload's field of view; the branch and bound method is then used to search for feasible solutions, cutting off infeasible solutions in the solution space with a pruning strategy, and finally trimming suboptimal feasible solutions according to the optimization index until the globally optimal feasible solution is obtained. Simulation and visualization software testing results verified the feasibility and effectiveness of the strategy.
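
    As a hedged illustration of the branch-and-bound pattern named above (not the paper's algorithm; its cost model and field-of-view transformation are not reproduced), the sketch below orders a few hypothetical field-of-view cells for minimum total slew cost, pruning any partial ordering whose accumulated cost already exceeds the best complete solution found.

      # Generic branch-and-bound sketch: visit a handful of hypothetical
      # field-of-view cells with minimum total slew cost, pruning any partial
      # ordering whose accumulated cost already exceeds the best known bound.
      import math

      cells = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0), (1.5, 1.5), (0.5, 2.0)]   # invented cell centres

      def slew_cost(a, b):
          return math.dist(cells[a], cells[b])

      best = {"cost": math.inf, "order": None}

      def branch(order, cost):
          if cost >= best["cost"]:                  # prune: bound already exceeded
              return
          if len(order) == len(cells):              # complete feasible solution
              best["cost"], best["order"] = cost, order
              return
          for nxt in range(len(cells)):
              if nxt not in order:
                  step = slew_cost(order[-1], nxt) if order else 0.0
                  branch(order + (nxt,), cost + step)

      branch((), 0.0)
      print("visit order:", best["order"], "total slew cost: %.3f" % best["cost"])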

  18. Smarter Instruments, Smarter Archives: Machine Learning for Tactical Science

    NASA Astrophysics Data System (ADS)

    Thompson, D. R.; Kiran, R.; Allwood, A.; Altinok, A.; Estlin, T.; Flannery, D.

    2014-12-01

    There has been a growing interest by Earth and Planetary Sciences in machine learning, visualization and cyberinfrastructure to interpret ever-increasing volumes of instrument data. Such tools are commonly used to analyze archival datasets, but they can also play a valuable real-time role during missions. Here we discuss ways that machine learning can benefit tactical science decisions during Earth and Planetary Exploration. Machine learning's potential begins at the instrument itself. Smart instruments endowed with pattern recognition can immediately recognize science features of interest. This allows robotic explorers to optimize their limited communications bandwidth, triaging science products and prioritizing the most relevant data. Smart instruments can also target their data collection on the fly, using principles of experimental design to reduce redundancy and generally improve sampling efficiency for time-limited operations. Moreover, smart instruments can respond immediately to transient or unexpected phenomena. Examples include detections of cometary plumes, terrestrial floods, or volcanism. We show recent examples of smart instruments from 2014 tests including: aircraft and spacecraft remote sensing instruments that recognize cloud contamination, field tests of a "smart camera" for robotic surface geology, and adaptive data collection by X-Ray fluorescence spectrometers. Machine learning can also assist human operators when tactical decision making is required. Terrestrial scenarios include airborne remote sensing, where the decision to re-fly a transect must be made immediately. Planetary scenarios include deep space encounters or planetary surface exploration, where the number of command cycles is limited and operators make rapid daily decisions about where next to collect measurements. Visualization and modeling can reveal trends, clusters, and outliers in new data. This can help operators recognize instrument artifacts or spot anomalies in real time. We show recent examples from science data pipelines deployed onboard aircraft as well as tactical visualizations for non-image instrument data.
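
    The onboard triage idea in the abstract above can be pictured with a small greedy sketch: rank candidate science products by a relevance score per megabyte and fill a fixed downlink budget. The product names, scores, sizes, and budget are invented for illustration and are not mission values or the authors' actual prioritization scheme.

      # Sketch of "triaging science products": fill a fixed downlink budget
      # greedily with the products offering the highest (assumed) relevance
      # score per megabyte.
      products = [
          {"name": "spectrum_017", "size_mb": 4.0,  "score": 0.91},
          {"name": "image_112",    "size_mb": 12.0, "score": 0.35},
          {"name": "image_113",    "size_mb": 12.0, "score": 0.88},
          {"name": "spectrum_018", "size_mb": 4.0,  "score": 0.10},
      ]

      def triage(products, budget_mb):
          chosen, used = [], 0.0
          for p in sorted(products, key=lambda p: p["score"] / p["size_mb"], reverse=True):
              if used + p["size_mb"] <= budget_mb:
                  chosen.append(p["name"])
                  used += p["size_mb"]
          return chosen

      print(triage(products, budget_mb=20.0))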

  19. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia

    NASA Astrophysics Data System (ADS)

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
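
    A hedged sketch of the kind of comparison described above: an alternative method's cover estimates are scored against the line-transect benchmark both as a mean bias and as agreement on the ">30% cover" category. The numbers below are invented for illustration and are not survey data.

      # Compare one alternative method (e.g. respondent estimates) with the
      # line-transect benchmark: mean bias and categorical (>30%) agreement.
      import numpy as np

      benchmark  = np.array([12, 45, 33, 8, 60, 27])   # % cover, line-transect (invented)
      respondent = np.array([10, 30, 20, 5, 40, 15])   # % cover, interviewee estimate (invented)

      bias = np.mean(respondent - benchmark)                    # negative -> underestimation
      agree = np.mean((respondent > 30) == (benchmark > 30))    # categorical agreement rate
      print(f"mean bias: {bias:+.1f} points, >30% agreement: {agree:.0%}")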

  20. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia.

    PubMed

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods is compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.

  1. Learning and Prediction of Slip from Visual Information

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Matthies, Larry; Helmick, Daniel; Perona, Pietro

    2007-01-01

    This paper presents an approach for slip prediction from a distance for wheeled ground robots using visual information as input. Large amounts of slippage which can occur on certain surfaces, such as sandy slopes, will negatively affect rover mobility. Therefore, obtaining information about slip before entering such terrain can be very useful for better planning and avoiding these areas. To address this problem, terrain appearance and geometry information about map cells are correlated to the slip measured by the rover while traversing each cell. This relationship is learned from previous experience, so slip can be predicted remotely from visual information only. The proposed method consists of terrain type recognition and nonlinear regression modeling. The method has been implemented and tested offline on several off-road terrains including: soil, sand, gravel, and woodchips. The final slip prediction error is about 20%. The system is intended for improved navigation on steep slopes and rough terrain for Mars rovers.
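
    A simplified sketch of the two-stage pipeline described above, with a nearest-neighbour stand-in for the terrain recognizer and invented per-terrain slip curves; the features, labels, and fitted models are assumptions for illustration, not the rover's learned models.

      # Stage 1: recognize terrain type from appearance features.
      # Stage 2: apply a per-terrain nonlinear slip-vs-slope model.
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      # Toy appearance features (e.g. colour/texture statistics) with labels.
      features = np.array([[0.8, 0.1], [0.7, 0.2], [0.2, 0.9], [0.3, 0.8]])
      labels = ["sand", "sand", "gravel", "gravel"]
      terrain_clf = KNeighborsClassifier(n_neighbors=1).fit(features, labels)

      # Invented per-terrain regressions: slip grows nonlinearly with slope.
      slip_models = {
          "sand":   lambda slope_deg: 0.04 * slope_deg ** 1.6,
          "gravel": lambda slope_deg: 0.01 * slope_deg ** 1.3,
      }

      cell_feature, cell_slope = np.array([[0.75, 0.15]]), 12.0
      terrain = terrain_clf.predict(cell_feature)[0]
      print(terrain, "predicted slip: %.1f%%" % slip_models[terrain](cell_slope))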

  2. Dynamic changes in ear temperature in relation to separation distress in dogs.

    PubMed

    Riemer, Stefanie; Assis, Luciana; Pike, Thomas W; Mills, Daniel S

    2016-12-01

    Infrared thermography can visualize changes in body surface temperature that result from stress-induced physiological changes and alterations of blood flow patterns. Here we explored its use for remote stress monitoring (i.e. removing need for human presence) in a sample of six pet dogs. Dogs were tested in a brief separation test involving contact with their owner, a stranger, and social isolation for two one-minute-periods. Tests were filmed using a thermographic camera set up in a corner of the room, around 7m from where the subjects spent most of the time. Temperature was measured from selected regions of both ear pinnae simultaneously. Temperatures of both ear pinnae showed a pattern of decrease during separation and increase when a person (either the owner or a stranger) was present, with no lateralized temperature differences between the two ears. Long distance thermographic measurement is a promising technique for non-invasive remote stress assessment, although there are some limitations related to dogs' hair structure over the ears, making it unsuitable for some subjects. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Monitoring Architectural Heritage by Wireless Sensors Networks: San Gimignano — A Case Study

    PubMed Central

    Mecocci, Alessandro; Abrardo, Andrea

    2014-01-01

    This paper describes a wireless sensor network (WSN) used to monitor the health state of architectural heritage in real-time. The WSN has been deployed and tested on the “Rognosa” tower in the medieval village of San Gimignano, Tuscany, Italy. This technology, being non-invasive, mimetic, and long lasting, is particularly well suited for long term monitoring and on-line diagnosis of the conservation state of heritage buildings. The proposed monitoring system comprises radio-equipped nodes linked to suitable sensors capable of monitoring crucial parameters like temperature, humidity, masonry cracks, pouring rain, and visible light. The access to data is granted by a user interface for remote control. The WSN can autonomously send remote alarms when predefined thresholds are reached. PMID:24394600

  4. An automatic calibration procedure for remote eye-gaze tracking systems.

    PubMed

    Model, Dmitri; Guestrin, Elias D; Eizenman, Moshe

    2009-01-01

    Remote gaze estimation systems use calibration procedures to estimate subject-specific parameters that are needed for the calculation of the point-of-gaze. In these procedures, subjects are required to fixate on a specific point or points at specific time instances. Advanced remote gaze estimation systems can estimate the optical axis of the eye without any personal calibration procedure, but use a single calibration point to estimate the angle between the optical axis and the visual axis (line-of-sight). This paper presents a novel automatic calibration procedure that does not require active user participation. To estimate the angles between the optical and visual axes of each eye, this procedure minimizes the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display (e.g., watching a video clip). Simulation results demonstrate that the performance of the algorithm improves as the range of viewing angles increases. For a subject sitting 75 cm in front of an 80 cm x 60 cm display (40" TV) the standard deviation of the error in the estimation of the angles between the optical and visual axes is 0.5 degrees.
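
    A heavily simplified sketch of the calibration idea above, assuming the angular offsets can be approximated as fixed shifts in the display plane (in this flat toy version only the relative left-right offset is identifiable); the geometry, offsets, and noise values are invented and do not reproduce the paper's eye model.

      # Minimize the mean distance between the two eyes' corrected screen-plane
      # intersection points while the subject looks "naturally" at the display.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      true_left, true_right = np.array([1.5, -0.8]), np.array([-1.2, 0.6])      # cm (invented)
      gaze = rng.uniform([-30, -20], [30, 20], size=(200, 2))                   # natural viewing, cm

      # "Optical axis" intersections = true gaze minus the unknown per-eye offset, plus noise.
      left_optical  = gaze - true_left  + rng.normal(0, 0.2, (200, 2))
      right_optical = gaze - true_right + rng.normal(0, 0.2, (200, 2))

      def mean_disparity(params):
          left_corr, right_corr = params[:2], params[2:]
          return np.mean(np.linalg.norm((left_optical + left_corr)
                                        - (right_optical + right_corr), axis=1))

      result = minimize(mean_disparity, x0=np.zeros(4), method="Powell")
      # True relative offset is about [2.7, -1.4] cm in this simulation.
      print("estimated relative offset:", np.round(result.x[:2] - result.x[2:], 2))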

  5. A multi-mode manipulator display system for controlling remote robotic systems

    NASA Technical Reports Server (NTRS)

    Massimino, Michael J.; Meschler, Michael F.; Rodriguez, Alberto A.

    1994-01-01

    The objective and contribution of the research presented in this paper is to provide a Multi-Mode Manipulator Display System (MMDS) to assist a human operator with the control of remote manipulator systems. Such systems include space based manipulators such as the space shuttle remote manipulator system (SRMS) and future ground controlled teleoperated and telescience space systems. The MMDS contains a number of display modes and submodes which display position control cues and position data in graphical formats, based primarily on manipulator position and joint angle data. Therefore the MMDS is not dependent on visual information for input and can assist the operator especially when visual feedback is inadequate. This paper provides descriptions of the new modes and experiment results to date.

  6. A client–server framework for 3D remote visualization of radiotherapy treatment space

    PubMed Central

    Santhanam, Anand P.; Min, Yugang; Dou, Tai H.; Kupelian, Patrick; Low, Daniel A.

    2013-01-01

    Radiotherapy is safely employed for treating a wide variety of cancers. The radiotherapy workflow includes a precise positioning of the patient in the intended treatment position. While trained radiation therapists conduct patient positioning, consultation is occasionally required from other experts, including the radiation oncologist, dosimetrist, or medical physicist. In many circumstances, including rural clinics and developing countries, this expertise is not immediately available, so the patient positioning concerns of the treating therapists may not get addressed. In this paper, we present a framework to enable remotely located experts to virtually collaborate and be present inside the 3D treatment room when necessary. A multi-3D camera framework was used for acquiring the 3D treatment space. A client–server framework enabled the acquired 3D treatment room to be visualized in real-time. The computational tasks that would normally occur on the client side were offloaded to the server side to enable hardware flexibility on the client side. On the server side, a client-specific real-time stereo rendering of the 3D treatment room was employed using a scalable multi-graphics processing unit (GPU) system. The rendered 3D images were then encoded using GPU-based H.264 encoding for streaming. Results showed that for a stereo image size of 1280 × 960 pixels, experts with high-speed gigabit Ethernet connectivity were able to visualize the treatment space at approximately 81 frames per second. For experts remotely located and using a 100 Mbps network, the treatment space visualization occurred at 8–40 frames per second depending upon the network bandwidth. This work demonstrated the feasibility of remote real-time stereoscopic patient setup visualization, enabling expansion of high quality radiation therapy into challenging environments. PMID:23440605
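
    As a back-of-envelope illustration of why the reported frame rate scales with bandwidth, the sketch below divides usable link throughput by an assumed mean encoded stereo-frame size; the per-frame size and utilisation factor are invented, and on fast links the renderer and encoder, not the network, set the ceiling.

      # Rough estimate: achievable fps ~= usable throughput / mean encoded frame size.
      def achievable_fps(link_mbps, mean_frame_kbits, utilisation=0.7):
          usable_kbps = link_mbps * 1000 * utilisation
          return usable_kbps / mean_frame_kbits

      for link in (1000, 100, 20):   # gigabit LAN, 100 Mbps, congested link
          print(f"{link:>5} Mbps -> ~{achievable_fps(link, mean_frame_kbits=2500):.0f} fps (network limit only)")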

  7. Seasat-A and the commercial ocean community

    NASA Technical Reports Server (NTRS)

    Montgomery, D. R.; Wolff, P.

    1977-01-01

    The Seasat-A program has been initiated as a 'proof-of-concept' mission to evaluate the effectiveness of remotely sensing oceanology and related meteorological phenomena from a satellite platform in space utilizing sensors developed on previous space and aircraft test programs. The sensors include three active microwave sensors: a radar altimeter, a windfield scatterometer, and a synthetic aperture radar. A passive scanning multifrequency microwave radiometer and a visible and infrared radiometer are also included. All-weather, day-night measurements of sea surface temperature, surface wind speed/direction, sea state and directional wave spectra will be made. Two key programs are planned for data utilization with users during the mission. Foremost is a program with the commercial ocean community to test the utility of Seasat-A data and to begin the transfer of ocean remote sensing technology to the civil sector. A second program is a solicitation of investigations, led by NOAA, to involve the ocean science community in a series of scientific investigations.

  8. Managing Construction Operations Visually: 3-D Techniques for Complex Topography and Restricted Visibility

    ERIC Educational Resources Information Center

    Rodriguez, Walter; Opdenbosh, Augusto; Santamaria, Juan Carlos

    2006-01-01

    Visual information is vital in planning and managing construction operations, particularly where there is complex terrain topography and salvage operations with limited accessibility and visibility. From visually assessing site operations and preventing equipment collisions to simulating material handling activities to supervising remote sites…

  9. REMOTE SENSING APPLICATIONS FOR SUSTAINABLE WATERSHED MANAGEMENT AND FOOD SECURITY

    EPA Science Inventory

    The integration of IKONOS satellite data, airborne color infrared remote sensing, visualization, and decision support tools is discussed, within the contexts of management techniques for minimizing non-point source pollution in inland waterways, such as riparian buffer restoration...

  10. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    NASA Astrophysics Data System (ADS)

    Wang, X.

    2018-04-01

    Tourism geological resources are of high value for scenic appreciation, scientific research and public education, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional interpretation methods, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing imagery to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images with a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that this method yields highly recognizable features for remote sensing interpretation, making the interpretation more accurate and comprehensive.

  11. Trachoma, cataracts and uncorrected refractive error are still important contributors to visual morbidity in two remote indigenous communities of the Northern Territory, Australia.

    PubMed

    Wright, Heathcote R; Keeffe, Jill E; Taylor, Hugh R

    2009-08-01

    To assess the contribution of trachoma, cataract and refractive error to visual morbidity among Indigenous adults living in two remote communities of the Northern Territory. Cross-sectional survey of all adults aged 40 and over within a desert and coastal community. Visual acuity, clinical signs of trachoma using the simplified WHO grading system and assessment of cataract through a non-dilated pupil. Two hundred and sixty individuals over the age of 40 years participated in the study. The prevalence of visual impairment (<6/12) was 17%. The prevalence of blindness (<3/60) was 2%, 40-fold higher than seen in an urban Australian population when adjusted for age. In total, 78% of adults who grew up in a desert community had trachomatous scarring compared with 26% of those who grew up in a coastal community (P ≤ 0.001). In the desert community the prevalence of trachomatous trichiasis was 10% and corneal opacity was 6%. No trachomatous trichiasis or corneal opacity was seen in the coastal community. Trachoma, cataract and uncorrected refractive error remain significant contributors to visual morbidity in at least two remote indigenous communities. A wider survey is required to determine if these findings represent a more widespread pattern and existing eye care services may need to be re-assessed to determine the cause of this unmet need.

  12. [Remote Slit Lamp Microscope Consultation System Based on Web].

    PubMed

    Chen, Junfa; Zhuo, Yong; Liu, Zuguo; Chen, Yanping

    2015-11-01

    To realize remote operation of the slit lamp microscope for ophthalmology department consultation, and to visually display the real-time status of the remote slit lamp microscope, a remote slit lamp microscope consultation system based on a B/S (browser/server) structure is designed and implemented. By integrating the slit lamp microscope into the website system, real-time acquisition and transmission of remote control and image data are realized. A three-dimensional model of the slit lamp microscope is established and rendered on the web using WebGL technology. Practical application results demonstrate the real-time interactivity of the remote consultation system.

  13. Low-cost, smartphone based frequency doubling technology visual field testing using virtual reality (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Alawa, Karam A.; Sayed, Mohamed; Arboleda, Alejandro; Durkee, Heather A.; Aguilar, Mariela C.; Lee, Richard K.

    2017-02-01

    Glaucoma is the leading cause of irreversible blindness worldwide. Due to its wide prevalence, effective screening tools are necessary. The purpose of this project is to design and evaluate a system that enables portable, cost effective, smartphone based visual field screening based on frequency doubling technology. The system is comprised of an Android smartphone to display frequency doubling stimuli and handle processing, a Bluetooth remote for user input, and a virtual reality headset to simulate the exam. The LG Nexus 5 smartphone and BoboVR Z3 virtual reality headset were used for their screen size and lens configuration, respectively. The system is capable of running the C-20, N-30, 24-2, and 30-2 testing patterns. Unlike the existing system, the smartphone FDT tests both eyes concurrently by showing the same background to both eyes but only displaying the stimulus to one eye at a time. Both the Humphrey Zeiss FDT and the smartphone FDT were tested on five subjects without a history of ocular disease with the C-20 testing pattern. The smartphone FDT successfully produced frequency doubling stimuli at the correct spatial and temporal frequency. Subjects could not tell which eye was being tested. All five subjects preferred the smartphone FDT to the Humphrey Zeiss FDT due to comfort and ease of use. The smartphone FDT is a low-cost, portable visual field screening device that can be used as a screening tool for glaucoma.

  14. The design of PC/MISI, a PC-based common user interface to remote information storage and retrieval systems. Presentation visuals. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, The Design of PC/MISI, a PC-Based Common User Interface to Remote Information Storage and Retrieval Systems, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-15. The paper discusses the following: problem definition; the PC solution; the goals of system design; the design description; future considerations; the research environment; and conclusions.

  15. A Dedicated Environmental Remote Sensing Facility for the Columbia Earth Institute

    NASA Technical Reports Server (NTRS)

    Weissel, Jeffrey K.; Small, Christopher

    1999-01-01

    This paper presents a final technical report on a dedicated environmental remote sensing facility for the Columbia Earth Institute. The above-referenced award enabled the Lamont-Doherty Earth Observatory to establish a state-of-the-art remote sensing image analysis and data visualization facility to serve the research and educational needs of students and staff at Lamont and the Columbia Earth Institute.

  16. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 4 : use of knowledge integrated visual analytics system in supporting bridge management.

    DOT National Transportation Integrated Search

    2009-12-01

    The goals of integration should be to support domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on providing interactive data exploration for bridge management. ...

  17. Fault-Tolerant Control For A Robotic Inspection System

    NASA Technical Reports Server (NTRS)

    Tso, Kam Sing

    1995-01-01

    Report describes the first phase of a continuing program of research on the fault-tolerant control subsystem of a telerobotic visual-inspection system. The goal of the program is to develop a robotic system for remotely controlled visual inspection of structures in outer space.

  18. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior not only allows researchers to determine where a pilot is looking and when, but also holds implications for specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, especially the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
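
    One common reduction from raw gaze samples to a visual-behavior metric is dwell percentage per area of interest (AOI); the sketch below uses made-up AOI rectangles and gaze samples, not the Langley flight deck layout or the paper's actual reduction logic.

      # Assign each gaze sample to an AOI and report dwell percentage per AOI.
      from collections import Counter

      aois = {  # name: (x_min, y_min, x_max, y_max) in normalized screen coordinates (invented)
          "HUD": (0.35, 0.55, 0.65, 0.95),
          "PFD": (0.10, 0.05, 0.40, 0.45),
          "NAV": (0.60, 0.05, 0.90, 0.45),
      }

      def classify(sample):
          x, y = sample
          for name, (x0, y0, x1, y1) in aois.items():
              if x0 <= x <= x1 and y0 <= y <= y1:
                  return name
          return "other"

      gaze_samples = [(0.5, 0.8), (0.52, 0.78), (0.2, 0.2), (0.7, 0.3), (0.51, 0.82)]
      counts = Counter(classify(s) for s in gaze_samples)
      total = sum(counts.values())
      for name, n in counts.most_common():
          print(f"{name}: {100 * n / total:.0f}% of samples")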

  19. A far-field-viewing sensor for making analytical measurements in remote locations.

    PubMed

    Michael, K L; Taylor, L C; Walt, D R

    1999-07-15

    We demonstrate a far-field-viewing GRINscope sensor for making analytical measurements in remote locations. The GRINscope was fabricated by permanently affixing a micro-gradient-index (GRIN) lens on the distal face of a 350-micron-diameter optical imaging fiber. The GRINscope can obtain both chemical and visual information. In one application, a thin, pH-sensitive polymer layer was immobilized on the distal end of the GRINscope. The ability of the GRINscope to visually image its far-field surroundings and concurrently detect pH changes in a flowing stream was demonstrated. In a different application, the GRINscope was used to image pH- and O2-sensitive particles on a remote substrate and simultaneously measure their fluorescence intensity in response to pH or pO2 changes.

  20. A computer simulation experiment of supervisory control of remote manipulation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Mccandlish, S. G.

    1966-01-01

    A computer simulation of a remote manipulation task and a rate-controlled manipulator is described. Some low-level automatic decision making ability which could be used at the operator's discretion to augment his direct continuous control was built into the manipulator. Experiments were made on the effect of transmission delay, dynamic lag, and intermittent vision on human manipulative ability. Delay does not make remote manipulation impossible. Intermittent visual feedback, and the absence of rate information in the display presented to the operator do not seem to impair the operator's performance. A small-capacity visual feedback channel may be sufficient for remote manipulation tasks, or one channel might be time-shared between several operators. In other experiments the operator called in sequence various on-site automatic control programs of the machine, and thereby acted as a supervisor. The supervisory mode of operation has some advantages when the task to be performed is difficult for a human controlling directly.

  1. Sensible Success

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Commercial remote sensing uses satellite imagery to provide valuable information about the planet's features. By capturing light reflected from the Earth's surface with cameras or sensor systems, usually mounted on an orbiting satellite, data is obtained for business enterprises with an interest in land feature distribution. Remote sensing is practical when applied to large-area coverage, such as agricultural monitoring, regional mapping, environmental assessment, and infrastructure planning. For example, cellular service providers use satellite imagery to select the most ideal location for a communication tower. Crowsey Incorporated has the ability to use remote sensing capabilities to conduct spatial geographic visualizations and other remote-sensing services. Presently, the company has found a demand for these services in the area of litigation support. By using spatial information and analyses, Crowsey helps litigators understand and visualize complex issues and then to communicate a clear argument, with complete indisputable evidence. Crowsey Incorporated is a proud partner in NASA's Mississippi Space Commerce Initiative, with research offices at the John C. Stennis Space Center.

  2. Distribution of man-machine controls in space teleoperation

    NASA Technical Reports Server (NTRS)

    Bejczy, A. K.

    1982-01-01

    The distribution of control between man and machine is dependent on the tasks, available technology, human performance characteristics and control goals. This dependency has very specific projections on systems designed for teleoperation in space. This paper gives a brief outline of the space-related issues and presents the results of advanced teleoperator research and development at the Jet Propulsion Laboratory (JPL). The research and development work includes smart sensors, flexible computer controls and intelligent man-machine interface devices in the area of visual displays and kinesthetic man-machine coupling in remote control of manipulators. Some of the development results have been tested at the Johnson Space Center (JSC) using the simulated full-scale Shuttle Remote Manipulator System (RMS). The research and development work for advanced space teleoperation is far from complete and poses many interdisciplinary challenges.

  3. The Art in Visualizing Natural Landscapes from Space

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Shipman, J. S.; Adams, T.

    2017-12-01

    Satellite remote sensing data can capture the changing Earth at cm resolution, across hundreds of spectral channels, and multiple times per hour. There is an art in combining these datasets together to fully capture the beauty of our planet. The resulting artistic piece can be further transformed by building in an accompanying musical score, allowing for a deeper emotional connection with the public. We make use of visible, near, middle and long wave infrared and radar data as well as different remote sensing techniques to uniquely capture our changing landscape in the spaceborne data. We will generate visually compelling imagery and videos that represent hazardous events from dust storms to landslides and from volcanic eruptions to forest fires. We will demonstrate how specific features of the Earth's landscape can be emphasized through the use of different datasets and color combinations and how, by adding a musical score, we can directly connect with the viewer and heighten their experience. We will also discuss our process to integrate the different aspects of our project together and how it could be developed to capture the beauty of other planets across the solar system using spaceborne imagery and data. Bringing together experts in art installations, composing musical scores, and remote sensing image visualization can lead to new and exciting artistic representations of geoscience data. The resulting product demonstrates there is an art to visualizing remote sensing data to capture the beauty of our planet and that incorporating a musical score can take us all to new places and emotions to enhance our experience.

  4. A streaming-based solution for remote visualization of 3D graphics on mobile devices.

    PubMed

    Lamberti, Fabrizio; Sanna, Andrea

    2007-01-01

    Mobile devices such as Personal Digital Assistants, Tablet PCs, and cellular phones have greatly enhanced user capability to connect to remote resources. Although a large set of applications are now available bridging the gap between desktop and mobile devices, visualization of complex 3D models is still a task hard to accomplish without specialized hardware. This paper proposes a system where a cluster of PCs, equipped with accelerated graphics cards managed by the Chromium software, is able to handle remote visualization sessions based on MPEG video streaming involving complex 3D models. The proposed framework allows mobile devices such as smart phones, Personal Digital Assistants (PDAs), and Tablet PCs to visualize objects consisting of millions of textured polygons and voxels at a frame rate of 30 fps or more depending on hardware resources at the server side and on multimedia capabilities at the client side. The server is able to concurrently manage multiple clients, computing a video stream for each one; resolution and quality of each stream are tailored according to screen resolution and bandwidth of the client. The paper investigates in depth issues related to latency time, bit rate and quality of the generated stream, screen resolutions, as well as frames per second displayed.

  5. A medical application integrating remote 3D visualization tools to access picture archiving and communication system on mobile devices.

    PubMed

    He, Longjun; Ming, Xing; Liu, Qian

    2014-04-01

    With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. However, for direct interactive 3D visualization, which plays an important role in radiological diagnosis, the mobile device cannot provide a satisfactory quality of experience for radiologists. This paper presents a medical system that can retrieve medical images from the picture archiving and communication system (PACS) on the mobile device over a wireless network. In the proposed application, the mobile device obtains patient information and medical images through a proxy server connected to the PACS server. Meanwhile, the proxy server integrates a range of 3D visualization techniques, including maximum intensity projection, multi-planar reconstruction and direct volume rendering, to provide shape, brightness, depth and location information generated from the original sectional images for radiologists. Furthermore, an algorithm that changes remote render parameters automatically to adapt to the network status was employed to improve the quality of experience. Finally, performance issues regarding the remote 3D visualization of the medical images over the wireless network of the proposed application were also discussed. The results demonstrated that this proposed medical application could provide a smooth interactive experience over WLAN and 3G networks.
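
    A minimal sketch of the adaptation idea described above, assuming a small table of quality levels and throughput thresholds chosen for illustration; the paper's actual render parameters and control policy are not reproduced here.

      # Pick a render quality level from the most recently measured throughput,
      # stepping down when the network degrades and back up when it recovers.
      QUALITY_LEVELS = [                        # (label, target bitrate in kbps) - invented values
          ("high: 512 px, 30 fps", 3000),
          ("medium: 384 px, 20 fps", 1200),
          ("low: 256 px, 10 fps", 400),
      ]

      def pick_quality(measured_kbps, headroom=0.8):
          for label, target in QUALITY_LEVELS:
              if target <= measured_kbps * headroom:
                  return label
          return QUALITY_LEVELS[-1][0]          # worst case: lowest level

      for throughput in (5000, 1800, 600, 250):   # kbps samples over time
          print(f"{throughput:>5} kbps -> {pick_quality(throughput)}")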

  6. Region of interest extraction based on multiscale visual saliency analysis for remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Yinggang; Zhang, Libao; Yu, Xianchuan

    2015-01-01

    Region of interest (ROI) extraction is an important component of remote sensing image processing. However, traditional ROI extraction methods are usually prior knowledge-based and depend on classification, segmentation, and a global searching solution, which are time-consuming and computationally complex. We propose a more efficient ROI extraction model for remote sensing images based on multiscale visual saliency analysis (MVS), implemented in the CIE L*a*b* color space, which is similar to visual perception of the human eye. We first extract the intensity, orientation, and color features of the image using different methods: the visual attention mechanism is used to extract the intensity feature using a difference of Gaussian template; the integer wavelet transform is used to extract the orientation feature; and color information content analysis is used to obtain the color feature. Then, a new feature-competition method is proposed that addresses the different contributions of each feature map to calculate the weight of each feature image for combining them into the final saliency map. Qualitative and quantitative experimental results of the MVS model as compared with those of other models show that it is more effective and provides more accurate ROI extraction results with fewer holes inside the ROI.
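
    A simplified sketch of weighting and combining feature maps into a saliency map and thresholding it into an ROI mask; the difference-of-Gaussians contrast and color-deviation maps below stand in for the paper's wavelet orientation and information-content features, and the weights and threshold are invented.

      # Combine normalized feature maps with weights, then threshold into an ROI mask.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(1)
      image = rng.random((128, 128, 3))
      image[40:70, 50:90] += 0.8                      # a bright, colorful "region of interest"
      image = np.clip(image, 0.0, 1.0)

      gray = image.mean(axis=2)
      intensity_map = np.abs(gaussian_filter(gray, 2) - gaussian_filter(gray, 8))  # DoG contrast
      color_map = np.linalg.norm(image - image.mean(axis=(0, 1)), axis=2)          # color deviation

      def normalize(m):
          return (m - m.min()) / (m.max() - m.min() + 1e-9)

      saliency = 0.6 * normalize(intensity_map) + 0.4 * normalize(color_map)
      roi_mask = saliency > saliency.mean() + saliency.std()
      print("pixels flagged as ROI:", int(roi_mask.sum()))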

  7. Memory dysfunction and autonomic neuropathy in non-insulin-dependent (type 2) diabetic patients.

    PubMed

    Zaslavsky, L M; Gross, J L; Chaves, M L; Machado, R

    1995-11-01

    Considering the nervous system as a unit, it might be expected that diabetic patients with autonomic neuropathy could have a central abnormality expressed as cognitive dysfunction. To determine whether autonomic neuropathy is independently associated with cognitive dysfunction, we studied a cross-section of 20 non-insulin-dependent diabetic patients with autonomic neuropathy (14 males and six females; age (mean) = 60 ± 1 years); 29 non-insulin-dependent diabetic patients without autonomic neuropathy (14 males and 15 females; age = 59 ± 1 years) and 34 non-diabetic patients (10 males and 24 females; age = 58 ± 1 years), matched by age, education and duration of disease. Cognitive function was evaluated by tests of immediate, recent and remote memory: verbal (digit span; word span) and visual (recognition of towers and famous faces). Diabetic patients with autonomic neuropathy scored (median) lower in visual memory tests than diabetic patients without autonomic neuropathy and controls (towers immediate = 5 versus 7 and 6; towers recent = 4 versus 6 and 6; faces = 16 versus 18 and 18; respectively; Kruskal-Wallis; P < 0.05). There was no difference in verbal memory performance (Kruskal-Wallis; P > 0.05). Entering age, education, duration of disease and fasting plasma glucose in a stepwise multiple regression, the performance in these tests remained associated with autonomic neuropathy (towers immediate, P = 0.0054, partial r2 = 0.166; towers recent, P = 0.0076, partial r2 = 0.163). Scores in visual tests correlated negatively with the number of abnormal cardiovascular tests (faces, r = -0.25; towers recent, r = -0.24; Spearman; P < 0.05). Decreased visual cognitive function in non-insulin-dependent diabetic patients is associated with the presence and degree of autonomic neuropathy.

  8. Employing Omnidirectional Visual Control for Mobile Robotics.

    ERIC Educational Resources Information Center

    Wright, J. R., Jr.; Jung, S.; Steplight, S.; Wright, J. R., Sr.; Das, A.

    2000-01-01

    Describes projects using conventional technologies--incorporation of relatively inexpensive visual control with mobile robots using a simple remote control vehicle platform, a camera, a mirror, and a computer. Explains how technology teachers can apply them in the classroom. (JOW)

  9. Development of a geographic visualization and communications systems (GVCS) for monitoring remote vehicles

    DOT National Transportation Integrated Search

    1998-03-30

    The purpose of this project is to integrate a variety of geographic information systems capabilities and telecommunication technologies for potential use in geographic network and visualization applications. The specific technical goals of the pr...

  10. The Influence of Loss of Visual Cues on Pilot Performance During the Final Approach and Landing Phase of a Remotely Piloted Vehicle Mission

    NASA Technical Reports Server (NTRS)

    Howard, James C.

    1976-01-01

    Remotely piloted research vehicles (RPRVs) are currently being flown from fixed-base control centers, and visual information is supplied to the remote pilot by a TV camera mounted in the vehicle. In these circumstances, the possibility of a TV failure or an interruption in the downlink to the pilot must be considered. To determine the influence of loss of TV information on pilot performance during the final approach and landing phase of a mission, an experiment was conducted in which pilots were asked to fly a fixed-base simulation of a Piper PA-30 aircraft with loss of TV information occurring at altitudes of 15.24, 30.48, and 45.72 m (50, 100, and 150 ft). For this experiment, a specially designed display configuration was presented to four pilots in accordance with a Latin square design. Initial results indicate that pilots could not ensure successful landings from altitudes exceeding 15.24 m (50 ft) without the visual cues supplied by the TV picture.

  11. A virtual reality-based method of decreasing transmission time of visual feedback for a tele-operative robotic catheter operating system.

    PubMed

    Guo, Jin; Guo, Shuxiang; Tamiya, Takashi; Hirata, Hideyuki; Ishihara, Hidenori

    2016-03-01

    An Internet-based tele-operative robotic catheter operating system was designed for vascular interventional surgery, to afford unskilled surgeons the opportunity to learn basic catheter/guidewire skills, while allowing experienced physicians to perform surgeries cooperatively. Remote surgical procedures, limited by variable transmission times for visual feedback, have been associated with deterioration in operability and vascular wall damage during surgery. At the patient's location, the catheter shape/position was detected in real time and converted into three-dimensional coordinates in a world coordinate system. At the operation location, the catheter shape was reconstructed in a virtual-reality environment, based on the coordinates received. The data volume reduction significantly reduced visual feedback transmission times. Remote transmission experiments, conducted over inter-country distances, demonstrated the improved performance of the proposed prototype. The maximum error for the catheter shape reconstruction was 0.93 mm and the transmission time was reduced considerably. The results were positive and demonstrate the feasibility of remote surgery using conventional network infrastructures. Copyright © 2015 John Wiley & Sons, Ltd.
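
    To make the data-volume argument above concrete, the sketch below packs a few hundred hypothetical 3D catheter points into a binary payload and compares it with one uncompressed camera frame; the point count, precision, packing format, and frame size are illustrative choices, not the system's actual encoding.

      # A few hundred 3D points serialize to kilobytes; a raw video frame is megabytes.
      import struct

      num_points = 200                                                   # sampled points along the catheter
      points = [(0.1 * i, 0.02 * i, 0.0) for i in range(num_points)]     # x, y, z in mm (invented shape)

      payload = b"".join(struct.pack("<3f", *p) for p in points)         # 32-bit floats, little-endian
      frame_bytes = 640 * 480 * 3                                        # one raw RGB frame

      print(f"coordinate payload: {len(payload)} bytes "
            f"({frame_bytes / len(payload):.0f}x smaller than one {frame_bytes}-byte frame)")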

  12. The use of multimedia and programmed teaching machines for remote sensing education

    NASA Technical Reports Server (NTRS)

    Ulliman, J. J.

    1980-01-01

    The advantages, limitations, and uses of various audiovisual equipment and techniques used in various universities for individualized and group instruction in the interpretation and classification of remotely sensed data are considered, as well as systems for programmed and computer-assisted instruction.

  13. Human operator performance of remotely controlled tasks: Teleoperator research conducted at NASA's George C. Marshall Space Flight Center. Executive summary

    NASA Technical Reports Server (NTRS)

    Shields, N., Jr.; Piccione, F.; Kirkpatrick, M., III; Malone, T. B.

    1982-01-01

    The combination of human and machine capabilities into an integrated engineering system, a complex and interactive interdisciplinary undertaking, is discussed. Human-controlled remote systems, referred to as teleoperators, are reviewed. The human factors requirements for remotely manned systems are identified. The data were developed in three principal teleoperator laboratories; the visual, manipulator, and mobility laboratories are described. Three major sections are identified: (1) remote system components; (2) human operator considerations; and (3) teleoperator system simulation and concept verification.

  14. Forecasting and visualization of wildfires in a 3D geographical information system

    NASA Astrophysics Data System (ADS)

    Castrillón, M.; Jorge, P. A.; López, I. J.; Macías, A.; Martín, D.; Nebot, R. J.; Sabbagh, I.; Quintana, F. M.; Sánchez, J.; Sánchez, A. J.; Suárez, J. P.; Trujillo, A.

    2011-03-01

    This paper describes a wildfire forecasting application based on a 3D virtual environment and a fire simulation engine. A novel open-source framework is presented for the development of 3D graphics applications over large geographic areas, offering high performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community. The application includes a remote module that allows simultaneous connections of several users for monitoring a real wildfire event. The system is able to make a realistic composition of what is really happening in the area of the wildfire with dynamic 3D objects and the location of human and material resources in real time, providing a new perspective to analyze the wildfire information. The user is enabled to simulate and visualize the propagation of a fire on the terrain, integrating at the same time spatial information on topography and vegetation types with weather and wind data. The application communicates with a remote web service that is in charge of the simulation task. The user may specify several parameters through a friendly interface before the application sends the information to the remote server responsible for carrying out the wildfire forecasting using the FARSITE simulation model. During the process, the server connects to different external resources to obtain up-to-date meteorological data. The client application implements a realistic 3D visualization of the fire evolution on the landscape. A Level Of Detail (LOD) strategy contributes to improve the performance of the visualization system.

  15. Rapid Change Detection Algorithm for Disaster Management

    NASA Astrophysics Data System (ADS)

    Michel, U.; Thunig, H.; Ehlers, M.; Reinartz, P.

    2012-07-01

    This paper focuses on change detection applications in areas where catastrophic events took place and resulted in rapid destruction, especially of manmade objects. Standard methods for automated change detection prove not to be sufficient; therefore a new method was developed and tested. The presented method allows a fast detection and visualization of change in areas of crisis or catastrophe. While new remote sensing methods are often developed without user-oriented aspects in mind, organizations and authorities are frequently unable to use these methods because of an absence of remote sensing know-how. Therefore a semi-automated procedure was developed. Within a transferable framework, the developed algorithm can be implemented for a set of remote sensing data across different investigation areas. Several case studies form the basis for the retrieved results. Through a coarse division into statistical parts and segmentation into meaningful objects, the framework is able to deal with different types of change. By means of an elaborated Temporal Change Index (TCI), only panchromatic datasets are used to extract areas which are destroyed, areas which were not affected, and areas where rebuilding has already started.
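
    The paper's Temporal Change Index is not reproduced here; as a hedged illustration of the general idea, the sketch below computes a normalized per-pixel difference on an invented pre/post panchromatic pair and thresholds it into changed versus unaffected pixels.

      # Generic proxy for a temporal change index on a pre/post panchromatic pair.
      import numpy as np

      rng = np.random.default_rng(2)
      pre = rng.uniform(80, 120, (100, 100))           # pre-event panchromatic values (invented)
      post = pre.copy()
      post[30:60, 40:80] *= 0.4                        # simulated destruction: darkened block

      index = (post - pre) / (post + pre + 1e-9)       # normalized temporal difference
      changed = np.abs(index) > 0.2                    # crude decision threshold (assumed)
      print(f"flagged as changed: {100 * changed.mean():.1f}% of pixels")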

  16. Development of a High Level Waste Tank Inspection System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, D.K.; Loibl, M.W.; Meese, D.C.

    1995-03-21

    The Westinghouse Savannah River Technology Center was requested by its sister site, West Valley Nuclear Service (WVNS), to develop a remote inspection system to gather wall thickness readings of their High Level Waste Tanks. WVNS management chose to take a proactive approach to gain current information on two tanks that had been in service since the early 70's. The tanks contain high level waste, are buried underground, and have only two access ports to an annular space between the tank and the secondary concrete vault. A specialized remote system was proposed to provide both visual surveillance and ultrasonic thickness measurements of the tank walls. A magnetic wheeled crawler was the basis for the remote delivery system integrated with an off-the-shelf Ultrasonic Data Acquisition System. A development program was initiated for Savannah River Technology Center (SRTC) to design, fabricate, and test a remote system based on the Crawler. The system was completed and involved three crawlers to perform the needed tasks: an Ultrasonic Crawler, a Camera Crawler, and a Surface Prep Crawler. The crawlers were computer controlled so that their operation could be done remotely and their position on the wall could be tracked. The Ultrasonic Crawler controls were interfaced with ABB Amdata's I-PC Ultrasonic Data Acquisition System so that thickness mapping of the wall could be obtained. A second system was requested by Westinghouse Savannah River Company (WSRC) to perform just ultrasonic mapping on their similar Waste Storage Tanks; however, the system needed to be interfaced with the P-scan Ultrasonic Data Acquisition System. Both remote inspection systems were completed 9/94. Qualification tests were conducted by WVNS prior to implementation on the actual tank, and tank deployment was achieved 10/94. The second inspection system was deployed at WSRC 11/94 with success, and the system is now in continuous service inspecting the remaining high level waste tanks at WSRC.

  17. An operator interface design for a telerobotic inspection system

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Tso, Kam S.; Hayati, Samad

    1993-01-01

    The operator interface has recently emerged as an important element for efficient and safe interaction between human operators and telerobotic systems. Advances in graphical user interface and graphics technologies enable very efficient operator interface designs. This paper describes an efficient graphical operator interface newly developed for remote surface inspection at NASA-JPL. The interface, designed so that remote surface inspection can be performed by a single operator with integrated robot control and image inspection capability, supports three inspection strategies: teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.

  18. OpenGL Visualization Tool and Library Version: 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2010-06-22

    GLVis is an OpenGL tool for visualization of finite element meshes and functions. When started without any options, GLVis starts a server, which waits for socket connections and visualizes any received data. This way the results of simulations on a remote (parallel) machine can be visualized on the local user desktop. GLVis can also be used to visualize a mesh with or without a finite element function (solution). It can run a batch sequence of commands (GLVis scripts), or display previously saved socket streams.
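
    GLVis accepts data over a plain socket. The following Python sketch streams a saved mesh (and optionally a solution/grid function) to a running GLVis server; the default port (19916) and the "mesh"/"solution" stream keywords are assumptions based on common MFEM/GLVis usage rather than details taken from this record, and the file paths are hypothetical.

        # Minimal sketch of pushing a saved mesh (and optional solution) to a running
        # GLVis server over a socket. The default port (19916) and the "mesh"/"solution"
        # stream keywords are assumptions, not details from the record above.
        import socket

        def send_to_glvis(mesh_path, solution_path=None,
                          host="localhost", port=19916):
            with socket.create_connection((host, port)) as sock:
                if solution_path is None:
                    payload = "mesh\n" + open(mesh_path).read()
                else:
                    payload = ("solution\n" + open(mesh_path).read()
                               + open(solution_path).read())
                sock.sendall(payload.encode())

        # Example (paths are hypothetical):
        # send_to_glvis("beam.mesh", "displacement.gf")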

  19. 46 CFR 154.1365 - Audible and visual alarms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... it to be turned off after sounding. For remote group alarms this arrangement must not interrupt the..., except for remote group alarms, the location of each fault that actuates it. (d) Each vessel must have...

  20. Method of interpretation of remotely sensed data and applications to land use

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Dossantos, A. P.; Foresti, C.; Demoraesnovo, E. M. L.; Niero, M.; Lombardo, M. A.

    1981-01-01

    Instructional material describing a methodology of remote sensing data interpretation and examples of applications to land use survey are presented. The image interpretation elements are discussed for different types of sensor systems: aerial photographs, radar, and MSS/LANDSAT. Visual and automatic LANDSAT image interpretation is emphasized.

  1. 46 CFR 154.1335 - Pressure and vacuum protection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... audible and visual alarm at the cargo control station, and a remote group alarm in the wheelhouse. (c) If... SAFETY STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and...) Has remote readouts at the cargo control station. (2) If vacuum protection is required under § 154.804...

  2. 46 CFR 154.1335 - Pressure and vacuum protection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... audible and visual alarm at the cargo control station, and a remote group alarm in the wheelhouse. (c) If... SAFETY STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and...) Has remote readouts at the cargo control station. (2) If vacuum protection is required under § 154.804...

  3. 46 CFR 154.1335 - Pressure and vacuum protection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... audible and visual alarm at the cargo control station, and a remote group alarm in the wheelhouse. (c) If... SAFETY STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and...) Has remote readouts at the cargo control station. (2) If vacuum protection is required under § 154.804...

  4. 46 CFR 154.1335 - Pressure and vacuum protection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... audible and visual alarm at the cargo control station, and a remote group alarm in the wheelhouse. (c) If... SAFETY STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and...) Has remote readouts at the cargo control station. (2) If vacuum protection is required under § 154.804...

  5. Audiographics for Distance Education: An Alternative Technology.

    ERIC Educational Resources Information Center

    Fredrickson, Scott

    Audiographics is the merging of microcomputer graphics, telephone communications systems, and teaching strategies into a cost effective method of delivering distance education classes. The teacher creates visual images that are sent to and stored on computers at the remote sites. At the appropriate time the teacher and the remote site assistants…

  6. Hot gas ingestion testing of an advanced STOVL concept in the NASA Lewis 9- by 15-foot low speed wind tunnel with flow visualization

    NASA Technical Reports Server (NTRS)

    Johns, Albert L.; Flood, Joseph D.; Strock, Thomas W.; Amuedo, Kurt C.

    1988-01-01

    Advanced Short Takeoff/Vertical Landing (STOVL) aircraft capable of operating from remote sites, damaged runways, and small air-capable ships are being pursued for deployment around the turn of the century. To achieve this goal, it is important that the technologies critical to this unique class of aircraft be developed. Recognizing this need, NASA Lewis Research Center, McDonnell Douglas Aircraft, and DARPA defined a cooperative program for testing in the NASA Lewis 9- by 15-Foot Low Speed Wind Tunnel (LSWT) to establish a database for hot gas ingestion, one of the technologies critical to STOVL. Results from the test program are presented along with a discussion of the facility modifications allowing this type of testing at model scale. These modifications to the tunnel include a novel ground plane, an elaborate model support with 4 degrees of freedom, heated high-pressure air for nozzle flow, a suction system exhaust for inlet flow, and tunnel sidewall modifications. Several flow visualization techniques were employed, including water mist in the nozzle flows and tufts on the ground plane. Headwind (free-stream) velocity was varied from 8 to 23 knots.

  7. Air STAR Beyond Visual Range UAS Description and Preliminary Test Results

    NASA Technical Reports Server (NTRS)

    Cunningham, Kevin; Cox, David E.; Foster, John V.; Riddick, Stephen E.; Laughter, Sean A.

    2016-01-01

    The NASA Airborne Subscale Transport Aircraft Research Unmanned Aerial System project's capabilities were expanded by updating the system design and concept of operations. The new remotely piloted airplane system design was flight tested to assess the integrity and operational readiness of the design for flight research. The purpose of the system design is to improve aviation safety by providing a capability to validate, in high-risk conditions, technologies to prevent airplane loss of control. Two principal design requirements were a high degree of reliability and a significant increase in test volume relative to operations using the previous design. The motivation for increased test volume is to improve test efficiency and allow new test capabilities that were not possible with the previous design and concept of operations. Three successful test flights were conducted from runway 4-22 at NASA Goddard Space Flight Center's Wallops Flight Facility.

  8. Reprogramming of orientation columns in visual cortex: a domino effect

    PubMed Central

    Bachatene, Lyes; Bharmauria, Vishal; Cattan, Sarah; Rouat, Jean; Molotchnikoff, Stéphane

    2015-01-01

    Cortical organization rests upon the fundamental principle that neurons sharing similar properties are co-located. In the visual cortex, neurons are organized into orientation columns. In a column, most neurons respond optimally to the same axis of an oriented edge, that is, the preferred orientation. This orientation selectivity is believed to be absolute in adulthood. However, in a fully mature brain, it has been established that neurons change their selectivity following sensory experience or visual adaptation. Here, we show that after applying an adapter away from the tested cells, neurons whose receptive fields were located remotely from the adapted site also exhibit a novel selectivity in spite of the fact that they were not adapted. These results indicate a robust reconfiguration and remapping of the orientation domains with respect to each other thus removing the possibility of an orientation hole in the new hypercolumn. These data suggest that orientation columns transcend anatomy, and are almost strictly functionally dynamic. PMID:25801392

  9. High-Performance 3D Articulated Robot Display

    NASA Technical Reports Server (NTRS)

    Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Kurien, James A.; Abramyan, Lucy

    2011-01-01

    In the domain of telerobotic operations, the primary challenge facing the operator is to understand the state of the robotic platform. One key aspect of understanding the state is to visualize the physical location and configuration of the platform. As there is a wide variety of mobile robots, the requirements for visualizing their configurations vary diversely across different platforms. There can also be diversity in the mechanical mobility, such as wheeled, tracked, or legged mobility over surfaces. Adaptable 3D articulated robot visualization software can accommodate a wide variety of robotic platforms and environments. The visualization has been used for surface, aerial, space, and water robotic vehicle visualization during field testing. It has been used to enable operations of wheeled and legged surface vehicles, and can be readily adapted to facilitate other mechanical mobility solutions. The 3D visualization can render an articulated 3D model of a robotic platform for any environment. Given the model, the software receives real-time telemetry from the avionics system onboard the vehicle and animates the robot visualization to reflect the telemetered physical state. This is used to track the position and attitude in real time to monitor the progress of the vehicle as it traverses its environment. It is also used to monitor the state of any or all articulated elements of the vehicle, such as arms, legs, or control surfaces. The visualization can also render other sorts of telemetered states visually, such as stress or strains that are measured by the avionics. Such data can be used to color or annotate the virtual vehicle to indicate nominal or off-nominal states during operation. The visualization is also able to render the simulated environment where the vehicle is operating. For surface and aerial vehicles, it can render the terrain under the vehicle as the avionics sends it location information (GPS, odometry, or star tracking), and locate the vehicle over or on the terrain correctly. For long traverses over terrain, the visualization can stream in terrain piecewise in order to maintain the current area of interest for the operator without incurring unreasonable resource constraints on the computing platform. The visualization software is designed to run on laptops that can operate in field-testing environments without Internet access, which is a frequently encountered situation when testing in remote locations that simulate planetary environments such as Mars and other planetary bodies.
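
    The telemetry-driven animation described above can be pictured with a small, generic sketch: a packet of pose, joint angles, and sensor readings is applied to an articulated model before rendering. The packet layout, field names, and the off-nominal threshold are hypothetical; the record does not describe the software's actual telemetry format.

        # Generic sketch of the telemetry-to-visualization step described above: a
        # packet of pose, joint angles, and strain readings is applied to an
        # articulated model. Field names and the packet layout are hypothetical.
        from dataclasses import dataclass, field

        @dataclass
        class ArticulatedModel:
            position: tuple = (0.0, 0.0, 0.0)      # x, y, z in the terrain frame
            attitude: tuple = (0.0, 0.0, 0.0)      # roll, pitch, yaw (radians)
            joints: dict = field(default_factory=dict)       # joint name -> angle
            annotations: dict = field(default_factory=dict)  # e.g. strain status

            def apply_telemetry(self, packet: dict):
                """Update the rendered state from one telemetry packet."""
                self.position = packet.get("position", self.position)
                self.attitude = packet.get("attitude", self.attitude)
                self.joints.update(packet.get("joints", {}))
                # Off-nominal measurements can drive color/annotation changes.
                for name, value in packet.get("strain", {}).items():
                    self.annotations[name] = "off-nominal" if value > 1.0 else "nominal"

        model = ArticulatedModel()
        model.apply_telemetry({"position": (12.3, -4.5, 0.2),
                               "joints": {"arm_shoulder": 0.6, "arm_elbow": -1.1},
                               "strain": {"arm_elbow": 1.4}})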

  10. VID-R and SCAN: Tools and Methods for the Automated Analysis of Visual Records.

    ERIC Educational Resources Information Center

    Ekman, Paul; And Others

    The VID-R (Visual Information Display and Retrieval) system that enables computer-aided analysis of visual records is composed of a film-to-television chain, two videotape recorders with complete remote control of functions, a video-disc recorder, three high-resolution television monitors, a teletype, a PDP-8, a video and audio interface, three…

  11. Recovery from retinal lesions: molecular plasticity mechanisms in visual cortex far beyond the deprived zone.

    PubMed

    Hu, Tjing-Tjing; Van den Bergh, Gert; Thorrez, Lieven; Heylen, Kevin; Eysel, Ulf T; Arckens, Lutgarde

    2011-12-01

    In cats with central retinal lesions, deprivation of the lesion projection zone (LPZ) in primary visual cortex (area 17) induces remapping of the cortical topography. Recovery of visually driven cortical activity in the LPZ involves distinct changes in protein expression. Recent observations, about molecular activity changes throughout area 17, challenge the view that its remote nondeprived parts would not be involved in this recovery process. We here investigated the dynamics of the protein expression pattern of remote nondeprived area 17 triggered by central retinal lesions to explore to what extent far peripheral area 17 would contribute to the topographic map reorganization inside the visual cortex. Using functional proteomics, we identified 40 proteins specifically differentially expressed between far peripheral area 17 of control and experimental animals 14 days to 8 months postlesion. Our results demonstrate that far peripheral area 17 is implicated in the functional adaptation to the visual deprivation, involving a meshwork of interacting proteins, operating in diverse pathways. In particular, endocytosis/exocytosis processes appeared to be essential via their intimate correlation with long-term potentiation and neurite outgrowth mechanisms.

  12. Discovering new methods of data fusion, visualization, and analysis in 3D immersive environments for hyperspectral and laser altimetry data

    NASA Astrophysics Data System (ADS)

    Moore, C. A.; Gertman, V.; Olsoy, P.; Mitchell, J.; Glenn, N. F.; Joshi, A.; Norpchen, D.; Shrestha, R.; Pernice, M.; Spaete, L.; Grover, S.; Whiting, E.; Lee, R.

    2011-12-01

    Immersive virtual reality environments such as the IQ-Station or CAVE (Cave Automatic Virtual Environment) offer new and exciting ways to visualize and explore scientific data and are powerful research and educational tools. Combining remote sensing data from a range of sensor platforms in immersive 3D environments can enhance the spectral, textural, spatial, and temporal attributes of the data, which enables scientists to interact with and analyze the data in ways never before possible. Visualization and analysis of large remote sensing datasets in immersive environments requires software customization for integrating LiDAR point cloud data with hyperspectral raster imagery, the generation of quantitative tools for multidimensional analysis, and the development of methods to capture 3D visualizations for stereographic playback. This study uses hyperspectral and LiDAR data acquired over the China Hat geologic study area near Soda Springs, Idaho, USA. The data are fused into a 3D image cube for interactive data exploration and several methods of recording and playback are investigated that include: 1) creating and implementing a Virtual Reality User Interface (VRUI) patch configuration file to enable recording and playback of VRUI interactive sessions within the CAVE and 2) using the LiDAR and hyperspectral remote sensing data and GIS data to create an ArcScene 3D animated flyover, where left- and right-eye visuals are captured from two independent monitors for playback in a stereoscopic player. These visualizations can be used as outreach tools to demonstrate how integrated data and geotechnology techniques can help scientists see, explore, and more adequately comprehend scientific phenomena, both real and abstract.
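
    One elementary step of the LiDAR/hyperspectral fusion described above is attaching a spectrum to each LiDAR point by indexing the raster at the point's map coordinates. The numpy sketch below assumes co-registered data and a simple north-up geotransform; the study's actual CAVE/VRUI and ArcScene pipeline is not reproduced.

        # Minimal sketch of one fusion step implied above: attaching a hyperspectral
        # spectrum to each LiDAR point by indexing the raster at the point's map
        # coordinates. A simple north-up geotransform is assumed.
        import numpy as np

        def attach_spectra(points_xyz, cube, origin_xy, pixel_size):
            """points_xyz: (N, 3) LiDAR points in map coordinates.
            cube: (rows, cols, bands) hyperspectral image.
            origin_xy: (x, y) map coordinate of the upper-left raster corner.
            pixel_size: ground sampling distance in map units."""
            cols = ((points_xyz[:, 0] - origin_xy[0]) / pixel_size).astype(int)
            rows = ((origin_xy[1] - points_xyz[:, 1]) / pixel_size).astype(int)
            inside = (rows >= 0) & (rows < cube.shape[0]) & \
                     (cols >= 0) & (cols < cube.shape[1])
            spectra = np.full((points_xyz.shape[0], cube.shape[2]), np.nan)
            spectra[inside] = cube[rows[inside], cols[inside], :]
            # Result: one spectrum per point, ready for a fused 3D display.
            return np.hstack([points_xyz, spectra])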

  13. Using a visual discrimination model for the detection of compression artifacts in virtual pathology images.

    PubMed

    Johnson, Jeffrey P; Krupinski, Elizabeth A; Yan, Michelle; Roehrig, Hans; Graham, Anna R; Weinstein, Ronald S

    2011-02-01

    A major issue in telepathology is the extremely large and growing size of digitized "virtual" slides, which can require several gigabytes of storage and cause significant delays in data transmission for remote image interpretation and interactive visualization by pathologists. Compression can reduce this massive amount of virtual slide data, but reversible (lossless) methods limit data reduction to less than 50%, while lossy compression can degrade image quality and diagnostic accuracy. "Visually lossless" compression offers the potential for using higher compression levels without noticeable artifacts, but requires a rate-control strategy that adapts to image content and loss visibility. We investigated the utility of a visual discrimination model (VDM) and other distortion metrics for predicting JPEG 2000 bit rates corresponding to visually lossless compression of virtual slides for breast biopsy specimens. Threshold bit rates were determined experimentally with human observers for a variety of tissue regions cropped from virtual slides. For test images compressed to their visually lossless thresholds, just-noticeable difference (JND) metrics computed by the VDM were nearly constant at the 95th percentile level or higher, and were significantly less variable than peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) metrics. Our results suggest that VDM metrics could be used to guide the compression of virtual slides to achieve visually lossless compression while providing 5-12 times the data reduction of reversible methods.
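
    The simpler distortion metrics mentioned above can be illustrated with a short sketch: PSNR is computed for reconstructions at candidate JPEG 2000 bit rates and the lowest rate meeting a quality floor is selected. The encoder is left abstract (compress_fn is a placeholder), the PSNR floor is an arbitrary illustrative value, and the study's VDM/JND computation is not reproduced.

        # Sketch of a bit-rate search using PSNR, one of the simpler distortion
        # metrics the study compares against its visual discrimination model.
        # The JPEG 2000 encoder is left abstract (compress_fn is a placeholder).
        import numpy as np

        def psnr(reference, test, peak=255.0):
            mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        def lowest_acceptable_rate(tile, candidate_rates, compress_fn, psnr_floor=45.0):
            """Return the lowest bit rate (bits/pixel) whose reconstruction still
            meets a PSNR floor; candidate_rates is assumed sorted ascending."""
            for rate in candidate_rates:
                if psnr(tile, compress_fn(tile, rate)) >= psnr_floor:
                    return rate
            return candidate_rates[-1]   # fall back to the highest rate tried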

  14. Visual aids: a range of uses.

    PubMed

    Bradley, S

    1995-01-01

    The author explains why pictures have such impact. Images catch people's attention and to some extent can substitute for written words. They can be either still images like posters and flipcharts, three-dimensional images such as models or puppets, or they can show live events through drama, film, and video. Each of these are considered visual aids when used as teaching tools. When choosing visual aids, it is important to know which audience is being addressed and why, and to choose the visual aid which is most appropriate for the occasion. It is very important to pre-test pictures, especially when they will be used on their own without a facilitator to help participants analyze them. While some visual aids, such as maps and diagrams, are understood by everyone, people in some remote areas where there are very few books or papers may find pictures hard to understand. Facilitators are crucial to the successful use of visual aids. It is therefore very important that facilitators receive quality training. Well-trained facilitators from the local area will be more aware of local culture and concerns, and may be more trusted by participants. Poor training must be avoided. Finally, even though pictures can be misinterpreted, visual aids can make teaching and learning more enjoyable for many people. For people who find reading or speaking out difficult, the use of pictures may be the only way they can participate in discussions and decisions.

  15. Interactive Scripting for Analysis and Visualization of Arbitrarily Large, Disparately Located Climate Data Ensembles Using a Progressive Runtime Server

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.

    2017-12-01

    Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, which in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replications, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. Our system is now available for general use in the community through both Docker and Anaconda.
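
    The progressive, interruptible processing described above can be sketched generically: a statistic is refined chunk by chunk so the client always holds a valid partial result and stays responsive on a slow link. The fetch_chunk callable below is a hypothetical stand-in for the system's remote, resampled data access; the actual EDSL and runtime are not reproduced.

        # Generic sketch of progressive computation: a statistic is refined chunk by
        # chunk so the client can display partial results at any time. fetch_chunk is
        # a hypothetical stand-in for remote, resampled data access.
        import numpy as np

        def progressive_mean(fetch_chunk, n_chunks):
            """Yield successively refined estimates of a global mean."""
            total, count = 0.0, 0
            for i in range(n_chunks):
                chunk = fetch_chunk(i)            # e.g. one coarse-to-fine tile
                total += float(np.sum(chunk))
                count += chunk.size
                yield total / count               # partial result after chunk i

        # Usage: the client can stop iterating (or be interrupted) at any point and
        # still hold a valid, progressively improving estimate.
        for estimate in progressive_mean(lambda i: np.random.rand(1000), n_chunks=50):
            pass  # update a plot / readout here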

  16. A novel interface for the telementoring of robotic surgery.

    PubMed

    Shin, Daniel H; Dalag, Leonard; Azhar, Raed A; Santomauro, Michael; Satkunasivam, Raj; Metcalfe, Charles; Dunn, Matthew; Berger, Andre; Djaladat, Hooman; Nguyen, Mike; Desai, Mihir M; Aron, Monish; Gill, Inderbir S; Hung, Andrew J

    2015-08-01

    To prospectively evaluate the feasibility and safety of a novel, second-generation telementoring interface (Connect(™) ; Intuitive Surgical Inc., Sunnyvale, CA, USA) for the da Vinci robot. Robotic surgery trainees were mentored during portions of robot-assisted prostatectomy and renal surgery cases. Cases were assigned as traditional in-room mentoring or remote mentoring using Connect. While viewing two-dimensional, real-time video of the surgical field, remote mentors delivered verbal and visual counsel, using two-way audio and telestration (drawing) capabilities. Perioperative and technical data were recorded. Trainee robotic performance was rated using a validated assessment tool by both mentors and trainees. The mentoring interface was rated using a multi-factorial Likert-based survey. The Mann-Whitney and t-tests were used to determine statistical differences. We enrolled 55 mentored surgical cases (29 in-room, 26 remote). Perioperative variables of operative time and blood loss were similar between in-room and remote mentored cases. Robotic skills assessment showed no significant difference (P > 0.05). Mentors preferred remote over in-room telestration (P = 0.05); otherwise no significant difference existed in evaluation of the interfaces. Remote cases using wired (vs wireless) connections had lower latency and better data transfer (P = 0.005). Three of 18 (17%) wireless sessions were disrupted; one was converted to wired, one continued after restarting Connect, and the third was aborted. A bipolar injury to the colon occurred during one (3%) in-room mentored case; no intraoperative injuries were reported during remote sessions. In a tightly controlled environment, the Connect interface allows trainee robotic surgeons to be telementored in a safe and effective manner while performing basic surgical techniques. Significant steps remain prior to widespread use of this technology. © 2014 The Authors BJU International © 2014 BJU International Published by John Wiley & Sons Ltd.

  17. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    PubMed Central

    Moore, Eider B; Poliakov, Andrew V; Lincoln, Peter; Brinkley, James F

    2007-01-01

    Background Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine. PMID:17937818

  18. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data.

    PubMed

    Moore, Eider B; Poliakov, Andrew V; Lincoln, Peter; Brinkley, James F

    2007-10-15

    Three-dimensional (3-D) visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  19. Computational Modeling and Real-Time Control of Patient-Specific Laser Treatment of Cancer

    PubMed Central

    Fuentes, D.; Oden, J. T.; Diller, K. R.; Hazle, J. D.; Elliott, A.; Shetty, A.; Stafford, R. J.

    2014-01-01

    An adaptive feedback control system is presented which employs a computational model of bioheat transfer in living tissue to guide, in real-time, laser treatments of prostate cancer monitored by magnetic resonance thermal imaging (MRTI). The system is built on what can be referred to as cyberinfrastructure - a complex structure of high-speed network, large-scale parallel computing devices, laser optics, imaging, visualizations, inverse-analysis algorithms, mesh generation, and control systems that guide laser therapy to optimally control the ablation of cancerous tissue. The computational system has been successfully tested on in-vivo, canine prostate. Over the course of an 18 minute laser induced thermal therapy (LITT) performed at M.D. Anderson Cancer Center (MDACC) in Houston, Texas, the computational models were calibrated to intra-operative real time thermal imaging treatment data and the calibrated models controlled the bioheat transfer to within 5°C of the predetermined treatment plan. The computational arena is in Austin, Texas and managed at the Institute for Computational Engineering and Sciences (ICES). The system is designed to control the bioheat transfer remotely while simultaneously providing real-time remote visualization of the on-going treatment. Post operative histology of the canine prostate reveal that the damage region was within the targeted 1.2cm diameter treatment objective. PMID:19148754

  20. Computational modeling and real-time control of patient-specific laser treatment of cancer.

    PubMed

    Fuentes, D; Oden, J T; Diller, K R; Hazle, J D; Elliott, A; Shetty, A; Stafford, R J

    2009-04-01

    An adaptive feedback control system is presented which employs a computational model of bioheat transfer in living tissue to guide, in real-time, laser treatments of prostate cancer monitored by magnetic resonance thermal imaging. The system is built on what can be referred to as cyberinfrastructure-a complex structure of high-speed network, large-scale parallel computing devices, laser optics, imaging, visualizations, inverse-analysis algorithms, mesh generation, and control systems that guide laser therapy to optimally control the ablation of cancerous tissue. The computational system has been successfully tested on in vivo, canine prostate. Over the course of an 18 min laser-induced thermal therapy performed at M.D. Anderson Cancer Center (MDACC) in Houston, Texas, the computational models were calibrated to intra-operative real-time thermal imaging treatment data and the calibrated models controlled the bioheat transfer to within 5 degrees C of the predetermined treatment plan. The computational arena is in Austin, Texas and managed at the Institute for Computational Engineering and Sciences (ICES). The system is designed to control the bioheat transfer remotely while simultaneously providing real-time remote visualization of the on-going treatment. Post-operative histology of the canine prostate reveal that the damage region was within the targeted 1.2 cm diameter treatment objective.
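
    The model-guided feedback idea in the two records above can be illustrated with a one-dimensional sketch: an explicit finite-difference step of the Pennes bioheat equation plus a proportional controller that trims laser power toward a planned focal temperature. All material properties, grid settings, and gains are generic illustrative values, not those of the study, and the real system's MRTI-calibrated 3D models are not reproduced.

        # Illustrative 1D sketch of model-guided feedback control in the spirit of
        # the system above: an explicit finite-difference step of the Pennes bioheat
        # equation plus simple proportional control of laser power (a steady-state
        # offset is expected from pure P control). All values are generic.
        import numpy as np

        rho, c = 1050.0, 3600.0        # tissue density (kg/m^3), specific heat (J/kg/K)
        k = 0.5                        # thermal conductivity (W/m/K)
        wb, cb, Ta = 0.5, 3600.0, 37.0 # perfusion (kg/m^3/s), blood heat cap., arterial T
        dx, dt = 1e-3, 0.05            # grid spacing (m), time step (s)

        T = np.full(100, 37.0)                           # tissue temperature (deg C)
        source = np.zeros_like(T); source[45:55] = 1.0   # normalized laser deposition
        T_plan, gain = 55.0, 2e5       # planned focal temperature, P-gain (W/m^3 per deg C)

        for step in range(2000):
            power = max(0.0, gain * (T_plan - T[50]))    # laser-power command
            lap = (np.roll(T, 1) - 2 * T + np.roll(T, -1)) / dx**2
            lap[0] = lap[-1] = 0.0                       # crude insulated boundaries
            T = T + dt * (k * lap + wb * cb * (Ta - T) + power * source) / (rho * c)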

  1. Real-Time Workload Monitoring: Improving Cognitive Process Models

    DTIC Science & Technology

    2010-10-01

    Research or comparable systems with similar technical properties having been made available on the market by now. Remote sensors lack the required visual...questionnaire. This includes age, gender, alcohol and nicotine consumption, visual status, sleep during the last three days and last night, sportive

  2. Large-scale functional models of visual cortex for remote sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brumby, Steven P; Kenyon, Garrett; Rasmussen, Craig E

    Neuroscience has revealed many properties of neurons and of the functional organization of visual cortex that are believed to be essential to human vision, but are missing in standard artificial neural networks. Equally important may be the sheer scale of visual cortex, requiring ≈1 petaflop of computation. In a year, the retina delivers ≈1 petapixel to the brain, leading to massively large opportunities for learning at many levels of the cortical system. We describe work at Los Alamos National Laboratory (LANL) to develop large-scale functional models of visual cortex on LANL's Roadrunner petaflop supercomputer. An initial run of a simple region V1 code achieved 1.144 petaflops during trials at the IBM facility in Poughkeepsie, NY (June 2008). Here, we present criteria for assessing when a set of learned local representations is 'complete', along with general criteria for assessing computer vision models based on their projected scaling behavior. Finally, we extend one class of biologically-inspired learning models to problems of remote sensing imagery.

  3. Local and Global Processing: Observations from a Remote Culture

    ERIC Educational Resources Information Center

    Davidoff, Jules; Fonteneau, Elisabeth; Fagot, Joel

    2008-01-01

    In Experiment 1, a normal adult population drawn from a remote culture (Himba) in northern Namibia made similarity matches to [Navon, D. (1977). Forest before trees: The precedence of global features in visual perception. "Cognitive Psychology", 9, 353-383] hierarchical figures. The Himba showed a local bias stronger than that has been…

  4. Remote sensing for detection of termite infestations—Proof of Concept

    Treesearch

    Frederick Green III; Rachel A. Arango; Charles R. Boardman; Keith J. Bourne; John C. Hermanson; Robert A. Munson

    2015-01-01

    This paper reports the results of a search to discover the most cost effective and robust method of detecting Reticulitermes flavipes infestations in structural members of remote bridges, homes and other wooden structures and transmitting these results to internet cloud storage thus obviating routine travel to these structures for periodic visual...

  5. Remote sensing and urban public health

    NASA Technical Reports Server (NTRS)

    Rush, M.; Vernon, S.

    1975-01-01

    The applicability of remote sensing in the form of aerial photography to urban public health problems is examined. Environmental characteristics are analyzed to determine if health differences among areas could be predicted from the visual expression of remote sensing data. The analysis is carried out on a socioeconomic cross-sectional sample of census block groups. Six morbidity and mortality rates are the dependent variables, while environmental measures from aerial photographs and from the census constitute the two independent variable sets. It is found that environmental data collected by remote sensing are as good as census data in evaluating rates of health outcomes.

  6. LEARNERS: Interdisciplinary Learning Technology Projects Provide Visualizations and Simulations for Use of Geospatial Data in the Classroom

    NASA Astrophysics Data System (ADS)

    Farrell, N.; Hoban, S.

    2001-05-01

    The NASA Leading Educators to Applications, Research and NASA-related Educational Resources in Science (LEARNERS) initiative supports seven projects for enhancing kindergarten-to-high school science, geography, technology and mathematics education through Internet-based products derived from content on NASA's mission. Topics incorporated in LEARNERS projects include remote sensing of the Earth for agriculture and weather/climate studies, virtual exploration of remote worlds using robotics and digital imagery. Learners are engaged in inquiry or problem-based learning, often assuming the role of an expert scientist as part of an interdisciplinary science team, to study and explain practical problems using real-time NASA data. The presentation/poster will demonstrate novel uses of remote sensing data for K-12 and Post-Secondary students. This will include the use of visualizations, tools for educators, datasets, and classroom scenarios.

  7. Y0: An innovative tool for spatial data analysis

    NASA Astrophysics Data System (ADS)

    Wilson, Jeremy C.

    1993-08-01

    This paper describes an advanced analysis and visualization tool, called Y0 (pronounced "Why not?!"), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researches in universities affiliated with UCAR. This paper will explore the key concepts in Y0, describe its utility for remote sensing analysis and visualization, and will give a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.

  8. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites has led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because visual assessment is always subjective and quantitative assessment depends on the chosen criteria; depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process for objectively comparing fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared across various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
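
    The quantitative indices mentioned above build on simple statistical similarity measures; the sketch below implements the Wang-Bovik universal image quality index (Q), a common building block of fusion-assessment protocols such as QNR. The full QNR and Khan protocols add spectral and spatial distortion terms and sliding-window averaging that are not shown here.

        # Minimal sketch of the Wang-Bovik universal image quality index (Q), a
        # building block of quantitative fusion-assessment protocols such as QNR.
        import numpy as np

        def quality_index(x, y, eps=1e-12):
            """Q = 4*cov(x,y)*mean(x)*mean(y) / ((var(x)+var(y))*(mean(x)^2+mean(y)^2)).
            Q equals 1 only when the two bands are identical."""
            x, y = x.astype(float).ravel(), y.astype(float).ravel()
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()
            cov = ((x - mx) * (y - my)).mean()
            return 4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2) + eps)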

  9. Virtual interactive presence for real-time, long-distance surgical collaboration during complex microsurgical procedures.

    PubMed

    Shenai, Mahesh B; Tubbs, R Shane; Guthrie, Barton L; Cohen-Gadol, Aaron A

    2014-08-01

    The shortage of surgeons compels the development of novel technologies that geographically extend the capabilities of individual surgeons and enhance surgical skills. The authors have developed "Virtual Interactive Presence" (VIP), a platform that allows remote participants to simultaneously view each other's visual field, creating a shared field of view for real-time surgical telecollaboration. The authors demonstrate the capability of VIP to facilitate long-distance telecollaboration during cadaveric dissection. Virtual Interactive Presence consists of local and remote workstations with integrated video capture devices and video displays. Each workstation mutually connects via commercial teleconferencing devices, allowing worldwide point-to-point communication. Software composites the local and remote video feeds, displaying a hybrid perspective to each participant. For demonstration, local and remote VIP stations were situated in Indianapolis, Indiana, and Birmingham, Alabama, respectively. A suboccipital craniotomy and microsurgical dissection of the pineal region was performed in a cadaveric specimen using VIP. Task and system performance were subjectively evaluated, while additional video analysis was used for objective assessment of delay and resolution. Participants at both stations were able to visually and verbally interact while identifying anatomical structures, guiding surgical maneuvers, and discussing overall surgical strategy. Video analysis of 3 separate video clips yielded a mean compositing delay of 760 ± 606 msec (when compared with the audio signal). Image resolution was adequate to visualize complex intracranial anatomy and provide interactive guidance. Virtual Interactive Presence is a feasible paradigm for real-time, long-distance surgical telecollaboration. Delay, resolution, scaling, and registration are parameters that require further optimization, but are within the realm of current technology. The paradigm potentially enables remotely located experts to mentor less experienced personnel located at the surgical site with applications in surgical training programs, remote proctoring for proficiency, and expert support for rural settings and across different counties.

  10. Visual Features Involving Motion Seen from Airport Control Towers

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Liston, Dorion

    2010-01-01

    Visual motion cues are used by tower controllers to support both visual and anticipated separation. Some of these cues are tabulated as part of the overall set of visual features used in towers to separate aircraft. An initial analysis of one motion cue, landing deceleration, is provided as a basis for evaluating how controllers detect and use it for spacing aircraft on or near the surface. Understanding cues like this one will help determine whether they can be safely used in a remote/virtual tower in which their presentation may be visually degraded.

  11. Accessing, Utilizing and Visualizing NASA Remote Sensing Data for Malaria Modeling and Surveillance

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Kempler, Steven

    2007-01-01

    This poster presentation reviews the use of NASA remote sensing data that can be used to extract environmental information for modeling malaria transmission. The authors discuss the remote sensing data from Landsat, Advanced Very High Resolution Radiometer (AVHRR), Moderate Resolution Imaging Spectroradiometer (MODIS), Tropical Rainfall Measuring Mission (TRMM), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Earth Observing One (EO-1), Advanced Land Imager (ALI) and Seasonal to Interannual Earth Science Information Partner (SIESIP) dataset.

  12. Stereoscopic visualization and haptic technology used to create a virtual environment for remote surgery - biomed 2011.

    PubMed

    Bornhoft, J M; Strabala, K W; Wortman, T D; Lehman, A C; Oleynikov, D; Farritor, S M

    2011-01-01

    The objective of this research is to study the effectiveness of using a stereoscopic visualization system for performing remote surgery. The use of stereoscopic vision has become common with the advent of the da Vinci® system (Intuitive, Sunnyvale, CA). This system creates a virtual environment that consists of a 3-D display for visual feedback and haptic tactile feedback, together providing an intuitive environment for remote surgical applications. This study will use simple in vivo robotic surgical devices and compare the performance of surgeons using the stereoscopic interfacing system to the performance of surgeons using two-dimensional monitors. The stereoscopic viewing system consists of two cameras, two monitors, and four mirrors. The cameras are mounted to a multi-functional miniature in vivo robot and mimic the depth perception of the actual human eyes. This is done by placing the cameras at a calculated angle and distance apart. Live video streams from the left and right cameras are displayed on the left and right monitors, respectively. A system of angled mirrors allows the left and right eyes to see the video stream from the left and right monitor, respectively, creating the illusion of depth. The haptic interface consists of two PHANTOM Omni® (SensAble, Woburn, MA) controllers. These controllers measure the position and orientation of a pen-like end effector with three degrees of freedom. As the surgeon uses this interface, they see a 3-D image and feel force feedback for collision and workspace limits. The stereoscopic viewing system has been used in several surgical training tests and shows a potential improvement in depth perception and 3-D vision. The haptic system accurately gives force feedback that aids in surgery. Both have been used in non-survival animal surgeries, and have successfully been used in suturing and gallbladder removal. Bench top experiments using the interfacing system have also been conducted. A group of participants completed two different surgical training tasks using both a two-dimensional visual system and the stereoscopic visual system. Results suggest that the stereoscopic visual system decreased the amount of time taken to complete the tasks. All participants also reported that the stereoscopic system was easier to utilize than the two-dimensional system. Haptic controllers combined with stereoscopic vision provide for a more intuitive virtual environment. This system provides the surgeon with 3-D vision, depth perception, and the ability to receive feedback through forces applied in the haptic controller while performing surgery. These capabilities potentially enable the performance of more complex surgeries with a higher level of precision.
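
    The depth cue provided by the two-camera arrangement follows the standard rectified-stereo relation Z = f * B / d (depth = focal length x baseline / disparity). The short sketch below evaluates it; the focal length, baseline, and disparity values are illustrative and are not the dimensions of the system in this record.

        # Standard rectified-stereo depth relation behind the camera placement
        # described above; the numbers in the example are illustrative only.
        def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
            """Depth (m) of a feature seen with the given pixel disparity."""
            if disparity_px <= 0:
                raise ValueError("disparity must be positive for a finite depth")
            return focal_length_px * baseline_m / disparity_px

        # Example: 800 px focal length, 65 mm camera separation (roughly eye-like),
        # 20 px disparity -> about 2.6 m of perceived depth.
        print(depth_from_disparity(800, 0.065, 20))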

  13. Real-time visual mosaicking and navigation on the seafloor

    NASA Astrophysics Data System (ADS)

    Richmond, Kristof

    Remote robotic exploration holds vast potential for gaining knowledge about extreme environments accessible to humans only with great difficulty. Robotic explorers have been sent to other solar system bodies, and on this planet into inaccessible areas such as caves and volcanoes. In fact, the largest unexplored land area on earth lies hidden in the airless cold and intense pressure of the ocean depths. Exploration in the oceans is further hindered by water's high absorption of electromagnetic radiation, which both inhibits remote sensing from the surface, and limits communications with the bottom. The Earth's oceans thus provide an attractive target for developing remote exploration capabilities. As a result, numerous robotic vehicles now routinely survey this environment, from remotely operated vehicles piloted over tethers from the surface to torpedo-shaped autonomous underwater vehicles surveying the mid-waters. However, these vehicles are limited in their ability to navigate relative to their environment. This limits their ability to return to sites with precision without the use of external navigation aids, and to maneuver near and interact with objects autonomously in the water and on the sea floor. The enabling of environment-relative positioning on fully autonomous underwater vehicles will greatly extend their power and utility for remote exploration in the furthest reaches of the Earth's waters---even under ice and under ground---and eventually in extraterrestrial liquid environments such as Europa's oceans. This thesis presents an operational, fielded system for visual navigation of underwater robotic vehicles in unexplored areas of the seafloor. The system does not depend on external sensing systems, using only instruments on board the vehicle. As an area is explored, a camera is used to capture images and a composite view, or visual mosaic, of the ocean bottom is created in real time. Side-to-side visual registration of images is combined with dead-reckoned navigation information in a framework allowing the creation and updating of large, locally consistent mosaics. These mosaics are used as maps in which the vehicle can navigate and localize itself with respect to points in the environment. The system achieves real-time performance in several ways. First, wherever possible, direct sensing of motion parameters is used in place of extracting them from visual data. Second, trajectories are chosen to enable a hierarchical search for side-to-side links which limits the amount of searching performed without sacrificing robustness. Finally, the map estimation is formulated as a sparse, linear information filter allowing rapid updating of large maps. The visual navigation enabled by the work in this thesis represents a new capability for remotely operated vehicles, and an enabling capability for a new generation of autonomous vehicles which explore and interact with remote, unknown and unstructured underwater environments. The real-time mosaic can be used on current tethered vehicles to create pilot aids and provide a vehicle user with situational awareness of the local environment and the position of the vehicle within it. For autonomous vehicles, the visual navigation system enables precise environment-relative positioning and mapping, without requiring external navigation systems, opening the way for ever-expanding autonomous exploration capabilities. 
The utility of this system was demonstrated in the field at sites of scientific interest using the ROVs Ventana and Tiburon operated by the Monterey Bay Aquarium Research Institute. A number of sites in and around Monterey Bay, California were mosaicked using the system, culminating in a complete imaging of the wreck site of the USS Macon , where real-time visual mosaics containing thousands of images were generated while navigating using only sensor systems on board the vehicle.
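
    The side-to-side image registration at the heart of the mosaicking system can be sketched with FFT phase correlation, which recovers the integer-pixel translation between two overlapping seafloor frames. The numpy sketch below is self-contained and illustrative; fusing such offsets with dead-reckoned navigation in the sparse information filter described above is not shown.

        # Self-contained sketch of pairwise registration via FFT phase correlation:
        # it recovers the integer-pixel translation between two overlapping frames.
        import numpy as np

        def phase_correlation_shift(img_a, img_b):
            """Return (dy, dx) such that img_a is approximately np.roll(img_b, (dy, dx))."""
            A = np.fft.fft2(img_a.astype(float))
            B = np.fft.fft2(img_b.astype(float))
            cross_power = A * np.conj(B)
            cross_power /= np.abs(cross_power) + 1e-12
            corr = np.abs(np.fft.ifft2(cross_power))
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            # Wrap shifts larger than half the frame into negative offsets.
            if dy > img_a.shape[0] // 2: dy -= img_a.shape[0]
            if dx > img_a.shape[1] // 2: dx -= img_a.shape[1]
            return int(dy), int(dx)

        # Quick check with a synthetic shift:
        base = np.random.rand(128, 128)
        shifted = np.roll(base, (5, -9), axis=(0, 1))
        print(phase_correlation_shift(shifted, base))   # ~ (5, -9)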

  14. 46 CFR 154.804 - Vacuum protection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment Cargo Vent.... (2) There must be a pressure switch that operates an audible and visual alarm in the cargo control station identifying the tank and the alarm condition and a remote group audible and visual alarm in the...

  15. 46 CFR 154.804 - Vacuum protection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment Cargo Vent.... (2) There must be a pressure switch that operates an audible and visual alarm in the cargo control station identifying the tank and the alarm condition and a remote group audible and visual alarm in the...

  16. 46 CFR 154.804 - Vacuum protection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment Cargo Vent.... (2) There must be a pressure switch that operates an audible and visual alarm in the cargo control station identifying the tank and the alarm condition and a remote group audible and visual alarm in the...

  17. Study of Man-Machine Communications Systems for the Handicapped. Interim Report.

    ERIC Educational Resources Information Center

    Kafafian, Haig

    Newly developed communications systems for exceptional children include Cybercom; CYBERTYPE; Cyberplace, a keyless keyboard; Cyberphone, a telephonic communication system for deaf and speech impaired persons; Cyberlamp, a visual display; Cyberview, a fiber optic bundle remote visual display; Cybersem, an interface for the blind, fingerless, and…

  18. Land Cover Change and Remote Sensing in the Classroom: An Exercise to Study Urban Growth

    ERIC Educational Resources Information Center

    Delahunty, Tina; Lewis-Gonzales, Sarah; Phelps, Jack; Sawicki, Ben; Roberts, Charles; Carpenter, Penny

    2012-01-01

    The processes and implications of urban growth are studied in a variety of disciplines as urban growth affects both the physical and human landscape. Remote sensing methods provide ways to visualize and mathematically represent urban growth; and resultant land cover change data enable both quantitative and qualitative analysis. This article helps…

  19. Association of disease-specific causes of visual impairment and 10-year mortality amongst Indigenous Australians: the Central Australian Ocular Health Study.

    PubMed

    Estevez, José; Kaidonis, Georgia; Henderson, Tim; Craig, Jamie E; Landers, John

    2018-01-01

    Visual impairment significantly impairs the length and quality of life, but little is known of its impact in Indigenous Australians. To investigate the association of disease-specific causes of visual impairment with all-cause mortality. A retrospective cohort analysis. A total of 1347 Indigenous Australians aged over 40 years. Participants visiting remote medical clinics underwent clinical examinations including visual acuity, subjective refraction and slit-lamp examination of the anterior and posterior segments. The major ocular cause of visual impairment was determined. Patients were assessed periodically in these remote clinics for the succeeding 10 years after recruitment. Mortality rates were obtained from relevant departments. All-cause 10-year mortality and its association with disease-specific causes of visual impairment. The all-cause mortality rate for the entire cohort was 29.3% at the 10-year completion of follow-up. Of those with visual impairment, the overall mortality rate was 44.9%. The mortality rates differed for those with visual impairment due to cataract (59.8%), diabetic retinopathy (48.4%), trachoma (46.6%), 'other' (36.2%) and refractive error (33.4%) (P < 0.0001). Only those with visual impairment from diabetic retinopathy were any more likely to die during the 10 years of follow-up when compared with those without visual impairment (HR 1.70; 95% CI, 1.00-2.87; P = 0.049). Visual impairment was associated with all-cause mortality in a cohort of Indigenous Australians. However, diabetic retinopathy was the only ocular disease that significantly increased the risk of mortality. Visual impairment secondary to diabetic retinopathy may be an important predictor of mortality. © 2017 Royal Australian and New Zealand College of Ophthalmologists.

  20. Remote sensing programs and courses in engineering and water resources

    NASA Technical Reports Server (NTRS)

    Kiefer, R. W.

    1981-01-01

    The content of typical basic and advanced remote sensing and image interpretation courses are described and typical remote sensing graduate programs of study in civil engineering and in interdisciplinary environmental remote sensing and water resources management programs are outlined. Ideally, graduate programs with an emphasis on remote sensing and image interpretation should be built around a core of five courses: (1) a basic course in fundamentals of remote sensing upon which the more specialized advanced remote sensing courses can build; (2) a course dealing with visual image interpretation; (3) a course dealing with quantitative (computer-based) image interpretation; (4) a basic photogrammetry course; and (5) a basic surveying course. These five courses comprise up to one-half of the course work required for the M.S. degree. The nature of other course work and thesis requirements vary greatly, depending on the department in which the degree is being awarded.

  1. Research on assessment and improvement method of remote sensing image reconstruction

    NASA Astrophysics Data System (ADS)

    Sun, Li; Hua, Nian; Yu, Yanbo; Zhao, Zhanping

    2018-01-01

    Remote sensing image quality assessment and improvement is an important part of image processing. Generally, the use of compressive sampling theory in a remote sensing imaging system makes it possible to compress images while sampling, which can improve efficiency. In this paper, a method based on two-dimensional principal component analysis (2DPCA) is proposed to reconstruct the remote sensing image and improve the quality of the compressed image; the reconstruction retains the useful information of the image while suppressing noise. The factors that influence remote sensing image quality are then analyzed, and evaluation parameters for quantitative assessment are introduced. On this basis, the quality of the reconstructed images is evaluated and the influence of the different factors on the reconstruction is analyzed, providing meaningful reference data for enhancing the quality of remote sensing images. The experimental results show that the evaluation results agree with human visual perception and that the proposed method has good application value in the field of remote sensing image processing.
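
    A generic 2DPCA reconstruction, in the spirit of the method described above, is sketched below: the image scatter matrix is built from training images, its leading eigenvectors define a projection, and each image is reconstructed from its low-dimensional feature matrix. The coupling with compressive sampling and the paper's evaluation parameters are not reproduced.

        # Generic 2DPCA reconstruction sketch; the paper's exact pipeline (coupling
        # with compressive sampling) is not reproduced here.
        import numpy as np

        def fit_2dpca(images, n_components):
            """images: (M, rows, cols) stack of training images."""
            mean = images.mean(axis=0)
            centered = images - mean
            # Image scatter matrix: G = (1/M) * sum_i (A_i - mean)^T (A_i - mean)
            G = np.einsum('irc,irk->ck', centered, centered) / images.shape[0]
            _, eigvecs = np.linalg.eigh(G)                # ascending eigenvalues
            X = eigvecs[:, ::-1][:, :n_components]        # top-d eigenvectors (cols x d)
            return mean, X

        def reconstruct(image, mean, X):
            """Project onto the 2DPCA basis and reconstruct: A_hat = (A - mean) X X^T + mean."""
            Y = (image - mean) @ X                        # feature matrix (rows x d)
            return Y @ X.T + mean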

  2. Using component technologies for web based wavelet enhanced mammographic image visualization.

    PubMed

    Sakellaropoulos, P; Costaridou, L; Panayiotakis, G

    2000-01-01

    The poor contrast detectability of mammography can be dealt with by domain specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access the tool was redesigned by exploring component technologies, enabling the integration of stand alone domain specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real time wavelet based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
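
    The wavelet-based enhancement step described above can be illustrated with a short Python sketch using the PyWavelets library. The wavelet name, decomposition level, gain and threshold fraction below are assumptions for the example, not the tool's actual parameters.

      # Hedged sketch of wavelet-based denoising and contrast enhancement of a mammogram.
      import numpy as np
      import pywt

      def wavelet_enhance(image, wavelet='db4', level=3, gain=1.5, noise_frac=0.02):
          coeffs = pywt.wavedec2(image, wavelet, level=level)
          approx, details = coeffs[0], coeffs[1:]
          enhanced = [approx]
          for (cH, cV, cD) in details:
              # Soft-threshold small coefficients (denoising), then amplify the rest (contrast)
              enhanced.append(tuple(
                  gain * pywt.threshold(c, noise_frac * np.abs(c).max(), mode='soft')
                  for c in (cH, cV, cD)))
          return pywt.waverec2(enhanced, wavelet)

      # Toy usage on a random image standing in for a digitized mammogram
      enhanced = wavelet_enhance(np.random.rand(512, 512))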

  3. Initial Experience Using a Telerobotic Ultrasound System for Adult Abdominal Sonography.

    PubMed

    Adams, Scott J; Burbridge, Brent E; Badea, Andreea; Langford, Leanne; Vergara, Vincent; Bryce, Rhonda; Bustamante, Luis; Mendez, Ivar M; Babyn, Paul S

    2017-08-01

    The study sought to assess the feasibility of performing adult abdominal examinations using a telerobotic ultrasound system in which radiologists or sonographers can control fine movements of a transducer and all ultrasound settings from a remote location. Eighteen patients prospectively underwent a conventional sonography examination (using EPIQ 5 [Philips] or LOGIQ E9 [GE Healthcare]) followed by a telerobotic sonography examination (using the MELODY System [AdEchoTech] and SonixTablet [BK Ultrasound]) according to a standardized abdominal imaging protocol. For telerobotic examinations, patients were scanned remotely by a sonographer 2.75 km away. Conventional examinations were read independently from telerobotic examinations. Image quality and acceptability to patients and sonographers were assessed. Ninety-two percent of organs visualized on conventional examinations were sufficiently visualized on telerobotic examinations. Five pathological findings were identified on both telerobotic and conventional examinations, 3 findings were identified using only conventional sonography, and 2 findings were identified using only telerobotic sonography. A paired sample t test showed no significant difference between the 2 modalities in measurements of the liver, spleen, and diameter of the proximal aorta; however, telerobotic assessments overestimated distal aorta and common bile duct diameters and underestimated kidney lengths (P values < .05). All patients responded that they would be willing to have another telerobotic examination. A telerobotic ultrasound system is feasible for performing abdominal ultrasound examinations at a distant location with minimal training and setup requirements and a moderate learning curve. Telerobotic sonography (robotic telesonography) may open up the possibility of remote ultrasound clinics for communities that lack skilled sonographers and radiologists, thereby improving access to care. Copyright © 2016 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  4. Providing views of the driving scene to drivers' conversation partners mitigates cell-phone-related distraction.

    PubMed

    Gaspar, John G; Street, Whitney N; Windsor, Matthew B; Carbonari, Ronald; Kaczmarski, Henry; Kramer, Arthur F; Mathewson, Kyle E

    2014-12-01

    Cell-phone use impairs driving safety and performance. This impairment may stem from the remote partner's lack of awareness about the driving situation. In this study, pairs of participants completed a driving simulator task while conversing naturally in the car and while talking on a hands-free cell phone. In a third condition, the driver drove while the remote conversation partner could see video of both the road ahead and the driver's face. We tested the extent to which this additional visual information diminished the negative effects of cell-phone distraction and increased situational awareness. Collision rates for unexpected merging events were high when participants drove in a cell-phone condition but were reduced when they were in a videophone condition, reaching a level equal to that observed when they drove with an in-car passenger or drove alone. Drivers and their partners made shorter utterances and made longer, more frequent traffic references when they spoke in the videophone rather than the cell-phone condition. Providing a view of the driving scene allows remote partners to help drivers by modulating their conversation and referring to traffic more often. © The Author(s) 2014.

  5. Potential reliability and validity of a modified version of the Unified Parkinson’s Disease Rating Scale that could be administered remotely

    PubMed Central

    Abdolahi, Amir; Scoglio, Nicholas; Killoran, Annie; Dorsey, Ray; Biglan, Kevin M.

    2013-01-01

    Background By permitting remote assessments of patients and research participants, telemedicine has the potential to reshape clinical care and clinical trials for Parkinson disease. While the majority of the motor Unified Parkinson’s Disease Rating Scale (UPDRS) items can be conducted visually, rigidity and retropulsion pull testing require hands-on assessment by the rater and are less feasible to perform remotely in patients' homes. Methods In a secondary data analysis of the Comparison of the Agonist pramipexole vs. Levodopa on Motor complications in Parkinson’s Disease (CALM-PD) study, a randomized clinical trial, we assessed the cross-sectional (baseline and 2 years) and longitudinal (change from baseline to 2 years) reliability of a modified motor UPDRS (removing rigidity and retropulsion items) compared to the standard motor UPDRS (all items) using intraclass correlation coefficients (ICC), stratified by treatment group. Internal consistency of the modified UPDRS (mUPDRS) was measured using Cronbach’s alpha, and concurrent validity was assessed using Pearson’s correlation coefficient (r) between the standard motor UPDRS and mUPDRS. Results The mUPDRS versus standard motor UPDRS is cross-sectionally (ICC ≥ 0.92) and longitudinally (ICC ≥ 0.92) reliable for both treatment groups. High internal consistencies were also observed (α ≥ 0.96). The mUPDRS had high concurrent validity with the standard UPDRS at both time points and longitudinally (r ≥ 0.93, p < 0.0001). Conclusions A modified version of the motor UPDRS without rigidity and retropulsion pull testing is reliable and valid and may lay the foundation for its use in remote assessments of patients and research participants. PMID:23102808

  6. Potential reliability and validity of a modified version of the Unified Parkinson's Disease Rating Scale that could be administered remotely.

    PubMed

    Abdolahi, Amir; Scoglio, Nicholas; Killoran, Annie; Dorsey, E Ray; Biglan, Kevin M

    2013-02-01

    By permitting remote assessments of patients and research participants, telemedicine has the potential to reshape clinical care and clinical trials for Parkinson disease. While the majority of the motor Unified Parkinson's Disease Rating Scale (UPDRS) items can be conducted visually, rigidity and retropulsion pull testing require hands-on assessment by the rater and are less feasible to perform remotely in patients' homes. In a secondary data analysis of the Comparison of the Agonist pramipexole vs. Levodopa on Motor complications in Parkinson's Disease (CALM-PD) study, a randomized clinical trial, we assessed the cross-sectional (baseline and 2 years) and longitudinal (change from baseline to 2 years) reliability of a modified motor UPDRS (removing rigidity and retropulsion items) compared to the standard motor UPDRS (all items) using intraclass correlation coefficients (ICC), stratified by treatment group. Internal consistency of the modified UPDRS (mUPDRS) was measured using Cronbach's alpha, and concurrent validity was assessed using Pearson's correlation coefficient (r) between the standard motor UPDRS and mUPDRS. The mUPDRS versus standard motor UPDRS is cross-sectionally (ICC ≥ 0.92) and longitudinally (ICC ≥ 0.92) reliable for both treatment groups. High internal consistencies were also observed (α ≥ 0.96). The mUPDRS had high concurrent validity with the standard UPDRS at both time points and longitudinally (r ≥ 0.93, p < 0.0001). A modified version of the motor UPDRS without rigidity and retropulsion pull testing is reliable and valid and may lay the foundation for its use in remote assessments of patients and research participants. Copyright © 2012 Elsevier Ltd. All rights reserved.
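
    For readers unfamiliar with the statistics named in these two records, the short Python sketch below computes Cronbach's alpha and Pearson's r for a toy modified-versus-standard scale. The data are simulated; this is not the CALM-PD analysis code.

      # Illustrative internal-consistency and concurrent-validity statistics.
      import numpy as np
      from scipy import stats

      def cronbach_alpha(items):
          """items: (subjects x items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_var / total_var)

      rng = np.random.default_rng(0)
      items = rng.normal(size=(30, 12)) + rng.normal(size=(30, 1))  # correlated toy items
      modified = items[:, :10].sum(axis=1)   # "modified" scale drops two items
      standard = items.sum(axis=1)           # full scale

      print("Cronbach's alpha:", round(cronbach_alpha(items), 3))
      r, p = stats.pearsonr(modified, standard)
      print("Pearson r:", round(r, 3), "p =", p)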

  7. VisPort: Web-Based Access to Community-Specific Visualization Functionality [Shedding New Light on Exploding Stars: Visualization for TeraScale Simulation of Neutrino-Driven Supernovae (Final Technical Report)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, M Pauline

    2007-06-30

    The VisPort visualization portal is an experiment in providing Web-based access to visualization functionality from any place and at any time. VisPort adopts a service-oriented architecture to encapsulate visualization functionality and to support remote access. Users employ browser-based client applications to choose data and services, set parameters, and launch visualization jobs. Visualization products, typically images or movies, are viewed in the user's standard Web browser. VisPort emphasizes visualization solutions customized for specific application communities. Finally, VisPort relies heavily on XML, and introduces the notion of visualization informatics - the formalization and specialization of information related to the process and products of visualization.

  8. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase one, volume 5 : automated management bridge information system.

    DOT National Transportation Integrated Search

    2009-12-01

    This volume focuses on one of the key components of the IRSV system, i.e., the AMBIS module. This module serves as one of : the tools used in this study to translate raw remote sensing data in the form of either high-resolution aerial photos or v...

  9. Improvement of Hungarian Joint Terminal Attack Program

    DTIC Science & Technology

    2013-06-13

    LST Laser Spot Tracker NVG Night Vision Goggle ROMAD Radio Operator Maintainer and Driver ROVER Remotely Operated Video Enhanced Receiver TACP...visual target designation. The other component consists of a laser spot tracker (LST), which identifies targets by tracking laser energy reflecting...capability for every type of night time missions, laser spot tracker for laser spot search missions, remotely operated video enhanced receiver

  10. The nature of the (visualization) game: Challenges and opportunities from computational geophysics

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    As the geosciences enter the era of big data, modeling and visualization become increasingly vital tools for discovery, understanding, education, and communication. Here, we focus on modeling and visualization of the structure and dynamics of the Earth's surface and interior. The past decade has seen accelerated data acquisition, including higher resolution imaging and modeling of Earth's deep interior, complex models of geodynamics, and high resolution topographic imaging of the changing surface, with an associated acceleration of computational modeling through better scientific software, increased computing capability, and the use of innovative methods of scientific visualization. The role of modeling is to describe a system, answer scientific questions, and test hypotheses; the term "model" encompasses mathematical models, computational models, physical models, conceptual models, statistical models, and visual models of a structure or process. These different uses of the term require thoughtful communication to avoid confusion. Scientific visualization is integral to every aspect of modeling. Not merely a means of communicating results, the best uses of visualization enable scientists to interact with their data, revealing the characteristics of the data and models to enable better interpretation and inform the direction of future investigation. Innovative immersive technologies like virtual reality, augmented reality, and remote collaboration techniques are being adopted more widely and are a magnet for students. Time-varying or transient phenomena are especially challenging to model and to visualize; researchers and students may need to investigate the role of initial conditions in driving phenomena, while nonlinearities in the governing equations of many Earth systems make the computations and resulting visualization especially challenging. Training students how to use, design, build, and interpret scientific modeling and visualization tools prepares them to better understand the nature of complex, multiscale geoscience data.

  11. Remote sensing of wet lands in irrigated areas

    NASA Technical Reports Server (NTRS)

    Ham, H. H.

    1972-01-01

    The use of airborne remote sensing techniques to: (1) detect drainage problem areas, (2) delineate the problem in terms of areal extent, depth to the water table, and presence of excessive salinity, and (3) evaluate the effectiveness of existing subsurface drainage facilities, is discussed. Experimental results show that remote sensing, as demonstrated in this study and as presently constituted and priced, does not represent a practical alternative as a management tool to presently used visual and conventional photographic methods in the systematic and repetitive detection and delineation of wetlands.

  12. Graphics to H.264 video encoding for 3D scene representation and interaction on mobile devices using region of interest

    NASA Astrophysics Data System (ADS)

    Le, Minh Tuan; Nguyen, Congdu; Yoon, Dae-Il; Jung, Eun Ku; Jia, Jie; Kim, Hae-Kwang

    2007-12-01

    In this paper, we propose a method of 3D-graphics-to-video encoding and streaming that is embedded into a remote interactive 3D visualization system for rapidly presenting a 3D scene on mobile devices without having to download it from the server. In particular, a 3D-graphics-to-video framework is presented that increases the visual quality of regions of interest (ROI) in the video by allocating more bits to the ROI during H.264 video encoding. The ROI are identified by projecting 3D objects onto a 2D plane during rasterization. The system allows users to navigate the 3D scene and interact with objects of interest to query their descriptions. We developed an adaptive media streaming server that can provide an adaptive video stream, in terms of object-based quality, to the client according to the user's preferences and the variation in network bandwidth. Results show that with ROI mode selection, the PSNR of the test samples changes only slightly while the visual quality of the objects of interest increases noticeably.
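
    The ROI-weighted bit allocation idea can be sketched as a per-macroblock quantization-parameter (QP) map: macroblocks covered by the rasterized ROI receive a lower QP and therefore more bits. The Python sketch below is purely conceptual; feeding such a QP map into an actual H.264 encoder is encoder-specific and omitted, and all names and values are assumptions.

      # Conceptual ROI-weighted QP map (lower QP = more bits = higher quality).
      import numpy as np

      def roi_qp_map(frame_h, frame_w, roi_mask, base_qp=32, roi_qp=24, mb=16):
          """roi_mask: boolean (frame_h, frame_w) array marking pixels of the rasterized ROI."""
          rows, cols = frame_h // mb, frame_w // mb
          qp = np.full((rows, cols), base_qp, dtype=np.int32)
          for r in range(rows):
              for c in range(cols):
                  block = roi_mask[r*mb:(r+1)*mb, c*mb:(c+1)*mb]
                  if block.any():               # macroblock overlaps the ROI
                      qp[r, c] = roi_qp
          return qp

      mask = np.zeros((288, 352), dtype=bool)   # CIF-sized toy frame
      mask[100:180, 120:220] = True             # pretend a 3D object projects here
      print(roi_qp_map(288, 352, mask))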

  13. Patients with migraine correctly estimate the visual verticality.

    PubMed

    Crevits, Luc; Vanacker, Leen; Verraes, Anouk

    2012-05-01

    We wanted to study otolith function by measuring the static subjective visual vertical (SVV) in migraine patients and in controls with and without kinetosis (motion sickness). Forty-seven patients with moderately severe migraine and 96 healthy controls were enrolled. Using a questionnaire, persons with kinetosis were identified. The SVV test was performed in a totally dark room. Subjects wore a stiff-neck collar to stabilize the head in an erect position. They were required to adjust an infrared line to the gravitational vertical with a hand-held infrared remote-controlled potentiometer. The deviation of SVV in the group of migraine patients was not significantly different from that of controls, regardless of whether an aura was associated. SVV was not significantly influenced by the presence of dizziness/non-specific vertigo or kinetosis. Patients with moderately severe migraine under prophylactic medication correctly estimate the visual verticality in the headache-free period. It is suggested that a deviation of SVV in a headache-free migraine patient may not be attributed to the migraine disorder as such, regardless of whether kinetosis is associated. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Human visual system consistent quality assessment for remote sensing image fusion

    NASA Astrophysics Data System (ADS)

    Liu, Jun; Huang, Junyi; Liu, Shuguang; Li, Huali; Zhou, Qiming; Liu, Junchen

    2015-07-01

    Quality assessment for image fusion is essential for remote sensing applications. Generally used indices require a high spatial resolution multispectral (MS) image for reference, which is not always readily available. Meanwhile, the fusion quality assessments using these indices may not be consistent with the Human Visual System (HVS). As an attempt to overcome this requirement and inconsistency, this paper proposes an HVS-consistent image fusion quality assessment index at the highest resolution, without a reference MS image, using Gaussian Scale Space (GSS) technology that simulates the HVS. The spatial details and spectral information of the original and fused images are first separated in GSS, and their qualities are evaluated using the proposed spatial and spectral quality indices, respectively. The overall quality is determined without a reference MS image by a combination of the two proposed indices. Experimental results on various remote sensing images indicate that the proposed index is more consistent with HVS evaluation than other widely used indices that may or may not require reference images.
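
    A minimal Python sketch of the scale-space separation that this kind of index relies on: a Gaussian scale space splits a band into detail layers (spatial quality) and a coarse base (spectral quality). The sigma values and array names are assumptions for the example, not the authors' parameters.

      # Gaussian scale-space split into detail layers and a coarse base.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def scale_space_split(band, sigmas=(1.0, 2.0, 4.0)):
          """Return (detail_layers, base): difference-of-Gaussian details plus coarse base."""
          levels = [band] + [gaussian_filter(band, s) for s in sigmas]
          details = [levels[i] - levels[i + 1] for i in range(len(levels) - 1)]
          return details, levels[-1]

      band = np.random.rand(256, 256)           # stand-in for one fused-image band
      details, base = scale_space_split(band)
      print(len(details), base.shape)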

  15. A Coarse-to-Fine Model for Airplane Detection from Large Remote Sensing Images Using a Saliency Model and Deep Learning

    NASA Astrophysics Data System (ADS)

    Song, Z. N.; Sui, H. G.

    2018-04-01

    High resolution remote sensing images carry important strategic information, especially for quickly finding time-sensitive targets such as airplanes, ships and cars. Often the first problem faced is how to rapidly judge whether a particular target is present anywhere in a large, arbitrary remote sensing image, rather than detecting it in a given image chip. Finding time-sensitive targets in a huge image is a great challenge: 1) complex backgrounds lead to high miss and false alarm rates when detecting tiny objects in large-scale images; 2) unlike traditional image retrieval, the task is not just to compare the similarity of image blocks, but to quickly find specific targets in a huge image. Taking airplanes as an example, this paper presents an effective method for searching for aircraft targets in large-scale optical remote sensing images. Firstly, an improved visual attention model that combines saliency detection with a line segment detector is used to quickly locate suspected regions in a large and complicated remote sensing image. Then, for each region, a single neural network that predicts bounding boxes and class probabilities directly from the full region in one evaluation, without a region proposal step, is adopted to search for small airplane objects. Unlike sliding-window and region-proposal-based techniques, the network sees the entire image (region) during training and test time, so it implicitly encodes contextual information about classes as well as their appearance. Experimental results show that the proposed method quickly identifies airplanes in large-scale images.
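
    As a hedged illustration of the coarse localization stage, the Python sketch below computes a classic spectral-residual saliency map and thresholds it into suspected regions. It is a stand-in for the paper's improved visual attention model (which also uses a line segment detector), not a reproduction of it; the threshold rule is an assumption.

      # Spectral-residual saliency used as a coarse "suspected region" locator.
      import numpy as np
      from scipy.ndimage import uniform_filter, gaussian_filter, label

      def spectral_residual_saliency(gray):
          spectrum = np.fft.fft2(gray)
          log_amp = np.log1p(np.abs(spectrum))     # log amplitude (log1p avoids log(0))
          phase = np.angle(spectrum)
          residual = log_amp - uniform_filter(log_amp, size=3)
          saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
          return gaussian_filter(saliency, sigma=2.5)

      gray = np.random.rand(512, 512)              # stand-in for a large image tile
      sal = spectral_residual_saliency(gray)
      regions, n = label(sal > sal.mean() + 2 * sal.std())
      print("suspected regions:", n)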

  16. Integrated Visualization of Multi-sensor Ocean Data across the Web

    NASA Astrophysics Data System (ADS)

    Platt, F.; Thompson, C. K.; Roberts, J. T.; Tsontos, V. M.; Hin Lam, C.; Arms, S. C.; Quach, N.

    2017-12-01

    Whether for research or operational decision support, oceanographic applications rely on the visualization of multivariate in situ and remote sensing data as an integral part of analysis workflows. However, given their inherently 3D-spatial and temporally dynamic nature, the visual representation of marine in situ data in particular poses a challenge. The Oceanographic In situ data Interoperability Project (OIIP) is a collaborative project funded under the NASA/ACCESS program that seeks to leverage and enhance higher TRL (technology readiness level) informatics technologies to address key data interoperability and integration issues associated with in situ ocean data, including the dearth of effective web-based visualization solutions. Existing web tools for the visualization of key in situ data types - point, profile, trajectory series - are limited in their support for integrated, dynamic and coordinated views of the spatiotemporal characteristics of the data. Via the extension of the JPL Common Mapping Client (CMC) software framework, OIIP seeks to provide improved visualization support for oceanographic in situ data sets. More specifically, this entails improved representation of both horizontal and vertical aspects of these data, which inherently are depth resolved and time referenced, as well as the visual synchronization with relevant remotely-sensed gridded data products, such as sea surface temperature and salinity. Electronic tagging datasets, which are a focal use case for OIIP, provide a representative, if somewhat complex, visualization challenge in this regard. Critical to the achievement of these development objectives has been compilation of a well-rounded set of visualization use cases and requirements based on a series of end-user consultations aimed at understanding their satellite-in situ visualization needs. Here we summarize progress on aspects of the technical work and our approach.

  17. Interactive intelligent remote operations: application to space robotics

    NASA Astrophysics Data System (ADS)

    Dupuis, Erick; Gillett, G. R.; Boulanger, Pierre; Edwards, Eric; Lipsett, Michael G.

    1999-11-01

    A set of tools addressing the problems specific to the control and monitoring of remote robotic systems from extreme distances has been developed. The tools include the capability to model and visualize the remote environment, to generate and edit complex task scripts, to execute the scripts in supervisory control mode, and to monitor and diagnose equipment from multiple remote locations. Two prototype systems were implemented for demonstration. The first demonstration, using a prototype joint design called Dexter, shows the applicability of the approach to space robotic operations in low Earth orbit. The second demonstration uses a remotely controlled excavator in an operational open-pit tar sand mine, demonstrating that the tools developed can be used for planetary exploration operations as well as for terrestrial mining applications.

  18. In-Flight Flow Visualization Using Infrared Thermography

    NASA Technical Reports Server (NTRS)

    vanDam, C. P.; Shiu, H. J.; Banks, D. W.

    1997-01-01

    The feasibility of remote infrared thermography of aircraft surfaces during flight to visualize the extent of laminar flow on a target aircraft has been examined. In general, it was determined that such thermograms can be taken successfully using an existing airplane/thermography system (NASA Dryden's F-18 with infrared imaging pod) and that the transition pattern and, thus, the extent of laminar flow can be extracted from these thermograms. Depending on the in-flight distance between the F-18 and the target aircraft, the thermograms can have a spatial resolution of as little as 0.1 inches. The field of view provided by the present remote system is superior to that of prior stationary infrared thermography systems mounted in the fuselage or vertical tail of a subject aircraft. An additional advantage of the present experimental technique is that the target aircraft requires no or minimal modifications. An image processing procedure was developed which improves the signal-to-noise ratio of the thermograms. Problems encountered during the analog recording of the thermograms (banding of video images) made it impossible to evaluate the adequacy of the present imaging system and image processing procedure to detect transition on untreated metal surfaces. The high reflectance, high thermal diffusivity, and low emittance of metal surfaces tend to degrade the images to an extent that it is very difficult to extract transition information from them. The application of a thin (0.005 inches) self-adhesive insulating film to the surface is shown to solve this problem satisfactorily. In addition to the problem of infrared-based transition detection on untreated metal surfaces, future flight tests will also concentrate on the visualization of other flow phenomena such as flow separation and reattachment.

  19. Local activity determines functional connectivity in the resting human brain: a simultaneous FDG-PET/fMRI study.

    PubMed

    Riedl, Valentin; Bienkowska, Katarzyna; Strobel, Carola; Tahmasian, Masoud; Grimmer, Timo; Förster, Stefan; Friston, Karl J; Sorg, Christian; Drzezga, Alexander

    2014-04-30

    Over the last decade, synchronized resting-state fluctuations of blood oxygenation level-dependent (BOLD) signals between remote brain areas [so-called BOLD resting-state functional connectivity (rs-FC)] have gained enormous relevance in systems and clinical neuroscience. However, the neural underpinnings of rs-FC are still incompletely understood. Using simultaneous positron emission tomography/magnetic resonance imaging we here directly investigated the relationship between rs-FC and local neuronal activity in humans. Computational models suggest a mechanistic link between the dynamics of local neuronal activity and the functional coupling among distributed brain regions. Therefore, we hypothesized that the local activity (LA) of a region at rest determines its rs-FC. To test this hypothesis, we simultaneously measured both LA (glucose metabolism) and rs-FC (via synchronized BOLD fluctuations) during conditions of eyes closed or eyes open. During eyes open, LA increased in the visual system, and the salience network (i.e., cingulate and insular cortices) and the pattern of elevated LA coincided almost exactly with the spatial pattern of increased rs-FC. Specifically, the voxelwise regional profile of LA in these areas strongly correlated with the regional pattern of rs-FC among the same regions (e.g., LA in primary visual cortex accounts for ∼ 50%, and LA in anterior cingulate accounts for ∼ 20% of rs-FC with the visual system). These data provide the first direct evidence in humans that local neuronal activity determines BOLD FC at rest. Beyond its relevance for the neuronal basis of coherent BOLD signal fluctuations, our procedure may translate into clinical research particularly to investigate potentially aberrant links between local dynamics and remote functional coupling in patients with neuropsychiatric disorders.

  20. Demonstration of the Low-Cost Virtual Collaborative Environment (VCE)

    NASA Technical Reports Server (NTRS)

    Bowers, David; Montes, Leticia; Ramos, Angel; Joyce, Brendan; Lumia, Ron

    1997-01-01

    This paper demonstrates the feasibility of a low-cost approach to remotely controlling equipment. Our demonstration system consists of a PC, the PUMA 560 robot with Barrett hand, and commercially available controller and teleconferencing software. The system provides a graphical user interface which allows a user to program equipment tasks and preview motions, i.e., simulate the results. Once satisfied that the actions are both safe and accomplish the task, the remote user sends the data over the Internet to the local site for execution on the real equipment. A video link provides visual feedback to the remote site. This technology lends itself readily to NASA's upcoming Mars expeditions by providing remote simulation and control of equipment.

  1. A Flexible and Integrated System for the Remote Acquisition of Neuropsychological Data in Stroke Research

    PubMed Central

    Durisko, Corrine; McCue, Michael; Doyle, Patrick J.; Dickey, Michael Walsh

    2016-01-01

    Background: Neuropsychological testing is a central aspect of stroke research because it provides critical information about the cognitive-behavioral status of stroke survivors, as well as the diagnosis and treatment of stroke-related disorders. Standard neuropsychological methods rely upon face-to-face interactions between a patient and researcher, which creates geographic and logistical barriers that impede research progress and treatment advances. Introduction: To overcome these barriers, we created a flexible and integrated system for the remote acquisition of neuropsychological data (RAND). The system we developed has a secure architecture that permits collaborative videoconferencing. The system supports shared audiovisual feeds that can provide continuous virtual interaction between a participant and researcher throughout a testing session. Shared presentation and computing controls can be used to deliver auditory and visual test items adapted from standard face-to-face materials or execute computer-based assessments. Spoken and manual responses can be acquired, and the components of the session can be recorded for offline data analysis. Materials and Methods: To evaluate its feasibility, our RAND system was used to administer a speech-language test battery to 16 stroke survivors with a variety of communication, sensory, and motor impairments. The sessions were initiated virtually without prior face-to-face instruction in the RAND technology or test battery. Results: Neuropsychological data were successfully acquired from all participants, including those with limited technology experience, and those with a communication, sensory, or motor impairment. Furthermore, participants indicated a high level of satisfaction with the RAND system and the remote assessment that it permits. Conclusions: The results indicate the feasibility of using the RAND system for virtual home-based neuropsychological assessment without prior face-to-face contact between a participant and researcher. Because our RAND system architecture uses off-the-shelf technology and software, it can be duplicated without specialized expertise or equipment. In sum, our RAND system offers a readily available and promising alternative to face-to-face neuropsychological assessment in stroke research. PMID:27214198

  2. Three-dimensional visualization and display technologies; Proceedings of the Meeting, Los Angeles, CA, Jan. 18-20, 1989

    NASA Technical Reports Server (NTRS)

    Robbins, Woodrow E. (Editor); Fisher, Scott S. (Editor)

    1989-01-01

    Special attention was given to problems of stereoscopic display devices, such as CAD for enhancement of the design process in visual arts, stereo-TV improvement of remote manipulator performance, a voice-controlled stereographic video camera system, and head-mounted displays and their low-cost design alternatives. Also discussed were a novel approach to chromostereoscopic microscopy, computer-generated barrier-strip autostereography and lenticular stereograms, and parallax barrier three-dimensional TV. Additional topics include processing and user interface issues and visualization applications, including automated analysis of fluid flow topology, optical tomographic measurements of mixing fluids, visualization of complex data, visualization environments, and visualization management systems.

  3. eFarm: A Tool for Better Observing Agricultural Land Systems

    PubMed Central

    Yu, Qiangyi; Shi, Yun; Tang, Huajun; Yang, Peng; Xie, Ankun; Liu, Bin; Wu, Wenbin

    2017-01-01

    Currently, observations of an agricultural land system (ALS) largely depend on remotely-sensed images, focusing on its biophysical features. While social surveys capture the socioeconomic features, this information has been inadequately integrated with the biophysical features of an ALS, and applications are limited by the cost and efficiency of carrying out such detailed and comparable social surveys over large spatial extents. In this paper, we introduce a smartphone-based app, called eFarm: a crowdsourcing and human sensing tool to collect geotagged ALS information at the land parcel level, based on high resolution remotely-sensed images. We illustrate its main functionalities, including map visualization, data management, and data sensing. Results of the trial test suggest the system works well. We believe the tool is able to acquire integrated human-land information which is broadly covered and timely updated, thus presenting great potential for improving the sensing, mapping, and modeling of ALS studies. PMID:28245554

  4. Robert Spencer | NREL

    Science.gov Websites

    & Simulation Research Interests Remote Sensing Natural Resource Modeling Machine Learning Education Analysis Center. Areas of Expertise Geospatial Analysis Data Visualization Algorithm Development Modeling

  5. DspaceOgre 3D Graphics Visualization Tool

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Myin, Steven; Pomerantz, Marc I.

    2011-01-01

    This general-purpose 3D graphics visualization C++ tool is designed for visualization of simulation and analysis data for articulated mechanisms. Examples of such systems are vehicles, robotic arms, biomechanics models, and biomolecular structures. DspaceOgre builds upon the open-source Ogre3D graphics visualization library. It provides additional classes to support the management of complex scenes involving multiple viewpoints and different scene groups, and can be used as a remote graphics server. This software provides improved support for adding programs at the graphics processing unit (GPU) level for improved performance. It also improves upon the messaging interface it exposes for use as a visualization server.

  6. Efficient transmission of compressed data for remote volume visualization.

    PubMed

    Krishnan, Karthik; Marcellin, Michael W; Bilgin, Ali; Nadar, Mariappan S

    2006-09-01

    One of the goals of telemedicine is to enable remote visualization and browsing of medical volumes. There is a need to employ scalable compression schemes and efficient client-server models to obtain interactivity and an enhanced viewing experience. First, we present a scheme that uses JPEG2000 and JPIP (JPEG2000 Interactive Protocol) to transmit data in a multi-resolution and progressive fashion. The server exploits the spatial locality offered by the wavelet transform and packet indexing information to transmit, in so far as possible, compressed volume data relevant to the client's query. Once the client identifies its volume of interest (VOI), the volume is refined progressively within the VOI from an initial lossy to a final lossless representation. Contextual background information can also be made available, with quality fading away from the VOI. Second, we present a prioritization that enables the client to progressively visualize scene content from a compressed file. In our specific example, the client is able to make requests to progressively receive data corresponding to any tissue type. The server is now capable of reordering the same compressed data file on the fly to serve data packets prioritized as per the client's request. Lastly, we describe the effect of compression parameters on compression ratio, decoding times and interactivity. We also present suggestions for optimizing JPEG2000 for remote volume visualization and volume browsing applications. The resulting system is ideally suited for client-server applications with the server maintaining the compressed volume data, to be browsed by a client with a low bandwidth constraint.
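
    The server-side prioritization idea can be sketched independently of the JPEG2000/JPIP codestream details: compressed chunks inside the client's volume of interest (VOI) are sent first, coarse quality layers before fine ones, with background chunks following at progressively lower priority. The chunk structure below is a toy assumption for illustration, not the actual packet format or the authors' code.

      # Toy transmission-order prioritization for VOI-first, coarse-to-fine delivery.
      from dataclasses import dataclass

      @dataclass
      class Chunk:
          z: int          # slice index within the volume
          layer: int      # quality layer (0 = coarsest)
          payload: bytes

      def prioritize(chunks, voi_range):
          """Inside-VOI chunks first, then by quality layer, then by distance from the VOI,
          so background quality fades away from the volume of interest."""
          z_lo, z_hi = voi_range
          def key(c):
              inside = z_lo <= c.z <= z_hi
              dist = 0 if inside else min(abs(c.z - z_lo), abs(c.z - z_hi))
              return (0 if inside else 1, c.layer, dist)
          return sorted(chunks, key=key)

      chunks = [Chunk(z, layer, b"") for z in range(8) for layer in range(3)]
      order = prioritize(chunks, voi_range=(3, 5))
      print([(c.z, c.layer) for c in order[:6]])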

  7. Federated Giovanni: A Distributed Web Service for Analysis and Visualization of Remote Sensing Data

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2014-01-01

    The Geospatial Interactive Online Visualization and Analysis Interface (Giovanni) is a popular tool for users of the Goddard Earth Sciences Data and Information Services Center (GES DISC) and has been in use for over a decade. It provides a wide variety of algorithms and visualizations to explore large remote sensing datasets without having to download the data and without having to write readers and visualizers for it. Giovanni is now being extended to enable its capabilities at other data centers within the Earth Observing System Data and Information System (EOSDIS). This Federated Giovanni will allow four other data centers to add and maintain their data within Giovanni on behalf of their user community. Those data centers are the Physical Oceanography Distributed Active Archive Center (PO.DAAC), MODIS Adaptive Processing System (MODAPS), Ocean Biology Processing Group (OBPG), and Land Processes Distributed Active Archive Center (LP DAAC). Three tiers are supported: Tier 1 (GES DISC-hosted) gives the remote data center a data management interface to add and maintain data, which are provided through the Giovanni instance at the GES DISC. Tier 2 packages Giovanni up as a virtual machine for distribution to and deployment by the other data centers. Data variables are shared among data centers by sharing documents from the Solr database that underpins Giovanni's data management capabilities. However, each data center maintains their own instance of Giovanni, exposing the variables of most interest to their user community. Tier 3 is a Shared Source model, in which the data centers cooperate to extend the infrastructure by contributing source code.

  8. Parallel Rendering of Large Time-Varying Volume Data

    NASA Technical Reports Server (NTRS)

    Garbutt, Alexander E.

    2005-01-01

    Interactive visualization of large time-varying 3D volume datasets has been and still is a great challenge to the modern computational world. It stretches the limits of the memory capacity, the disk space, the network bandwidth and the CPU speed of a conventional computer. In this SURF project, we propose to develop a parallel volume rendering program on SGI's Prism, a cluster computer equipped with state-of-the-art graphics hardware. The proposed program combines both parallel computing and hardware rendering in order to achieve an interactive rendering rate. We use 3D texture mapping and a hardware shader to implement 3D volume rendering on each workstation. We use SGI's VisServer to enable remote rendering using Prism's graphics hardware. And last, we will integrate this new program with ParVox, a parallel distributed visualization system developed at JPL. At the end of the project, we will demonstrate remote interactive visualization using this new hardware volume renderer on JPL's Prism System using a time-varying dataset from selected JPL applications.

  9. Commercial future: making remote sensing a media event

    NASA Astrophysics Data System (ADS)

    Lurie, Ian

    1999-12-01

    The rapid growth of commercial remote sensing has made high quality digital sensing data widely available -- now, remote sensing must become and remain a strong, commercially viable industry. However, this new industry cannot survive without an educated consumer base. To access markets, remote sensing providers must make their product more accessible, both literally and figuratively: Potential customers must be able to find the data they require, when they require it, and they must understand the utility of the information available to them. The Internet and the World Wide Web offer the perfect medium to educate potential customers and to sell remote sensing data to those customers. A well-designed web presence can provide both an information center and a market place for companies offering their data for sale. A very high potential web-based market for remote sensing lies in media. News agencies, web sites, and a host of other visual media services can use remote sensing data to provide current, relevant information regarding news around the world. This paper will provide a model for promotion and sale of remote sensing data via the Internet.

  10. Development of a graphical user interface for the global land information system (GLIS)

    USGS Publications Warehouse

    Alstad, Susan R.; Jackson, David A.

    1993-01-01

    The process of developing a Motif graphical user interface for the Global Land Information System (GLIS) involved incorporating user requirements, in-house visual and functional design requirements, and Open Software Foundation (OSF) Motif style guide standards. Motif user interface windows were developed, and the software supporting the Motif window functions was written in the C programming language. The GLIS architecture was modified to support multiple servers and remote handlers running the X Window System, forming a network of servers and handlers connected by TCP/IP communications. In April 1993, prior to release, the GLIS graphical user interface and system architecture modifications were tested by developers and users located at the EROS Data Center and 11 beta test sites across the country.

  11. A Systematic Review of Remote Laboratory Work in Science Education with the Support of Visualizing Its Structure through the "HistCite" and "CiteSpace" Software

    ERIC Educational Resources Information Center

    Tho, Siew Wei; Yeung, Yau Yuen; Wei, Rui; Chan, Ka Wing; So, Winnie Wing-mui

    2017-01-01

    Laboratory work, particularly the latest remote laboratories (RLs), has been assumed to have a general positive effect on science education because practical work can provide diverse learning experiences and enhance thinking skills suitable for the 21st century. However, there has not been a synthesis of the science education research to support…

  12. Interpretation of remotely sensed data and its applications in oceanography

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Tanaka, K.; Inostroza, H. M.; Verdesio, J. J.

    1982-01-01

    The methodology of interpretation of remote sensing data and its oceanographic applications are described. The elements of image interpretation for different types of sensors are discussed. The sensors utilized are the multispectral scanner of LANDSAT, and the thermal infrared of NOAA and geostationary satellites. Visual and automatic data interpretation in studies of pollution, the Brazil current system, and upwelling along the southeastern Brazilian coast are compared.

  13. The application of remote sensing to the development and formulation of hydrologic planning models

    NASA Technical Reports Server (NTRS)

    Fowler, T. R.; Castruccio, P. A.; Loats, H. L., Jr.

    1977-01-01

    The development of a remote sensing model and its efficiency in determining parameters of hydrologic models are reviewed. Procedures for extracting hydrologic data from LANDSAT imagery, and the visual analysis of composite imagery are presented. A hydrologic planning model is developed and applied to determine seasonal variations in watershed conditions. The transfer of this technology to a user community and contract arrangements are discussed.

  14. The antisaccade task: visual distractors elicit a location-independent planning 'cost'.

    PubMed

    DeSimone, Jesse C; Everling, Stefan; Heath, Matthew

    2015-01-01

    The presentation of a remote - but not proximal - distractor concurrent with target onset increases prosaccade reaction times (RT) (i.e., the remote distractor effect: RDE). The competitive integration model asserts that the RDE represents the time required to resolve the conflict for a common saccade threshold between target- and distractor-related saccade generating commands in the superior colliculus. To our knowledge however, no previous research has examined whether remote and proximal distractors differentially influence antisaccade RTs. This represents a notable question because antisaccades require decoupling of the spatial relations between stimulus and response (SR) and therefore provide a basis for determining whether the sensory- and/or motor-related features of a distractor influence response planning. Participants completed pro- and antisaccades in a target-only condition and conditions wherein the target was concurrently presented with a proximal or remote distractor. As expected, prosaccade RTs elicited a reliable RDE. In contrast, antisaccade RTs were increased independent of the distractor's spatial location and the magnitude of the effect was comparable across each distractor location. Thus, distractor-related antisaccade RT costs are not accounted for by a competitive integration between conflicting saccade generating commands. Instead, we propose that a visual distractor increases uncertainty related to the evocation of the response-selection rule necessary for decoupling SR relations.

  15. Aircraft loading and freezer enhancements: lessons for medical research in remote communities.

    PubMed

    Gagnon, Roy; Gagnon, Faith; Panagiotopoulos, Constadina

    2008-01-01

    Type 2 diabetes (T2D) and impaired glucose tolerance (IGT), historically extremely rare in children, are becoming prevalent among First Nations children. In Canada, many of these children live in remote villages accessible only by float plane. Because T2D has many long-term health implications, prevention and early identification are critical. We developed a process for sending a fully equipped endocrinology team to a remote community to screen the children for T2D and IGT. Float plane (sea plane) travel has several unexpected limitations for a medical research team. These include having to travel in good visibility (visual flight rules), limited payload capacity, and restriction against transporting dry ice. The benefits include avoiding the usual security restrictions. We developed and tested a custom-built insulation jacket and system of backup battery packs for the countertop -25 degrees C freezer (in lieu of dry ice) to transport frozen blood samples from the village to our hospital's laboratory. We also ensured that the five-member research team, its equipment, and the consumable supplies stayed within the maximum takeoff weight of the airplane and met center-of-gravity criteria to ensure a safe flight. Using the insulated freezer, sample integrity was maintained throughout the flight, and a safe weight-and-balance trip was achieved for the team and supplies. The team obtained complete T2D screening data on 88% of children in the remote community.

  16. PI2GIS: processing image to geographical information systems, a learning tool for QGIS

    NASA Astrophysics Data System (ADS)

    Correia, R.; Teodoro, A.; Duarte, L.

    2017-10-01

    To perform an accurate interpretation of remote sensing images, it is necessary to extract information using different image processing techniques. Nowadays, it has become usual to use image processing plugins that add new capabilities/functionalities to Geographical Information System (GIS) software. The aim of this work was to develop an open source application to automatically process and classify remote sensing images from a set of satellite input data. The application was integrated into GIS software (QGIS), automating several image processing steps. The use of QGIS for this purpose is justified since it is easy and quick to develop new plugins using the Python language. This plugin is inspired by the Semi-Automatic Classification Plugin (SCP) developed by Luca Congedo. SCP allows the supervised classification of remote sensing images, the calculation of vegetation indices such as the NDVI (Normalized Difference Vegetation Index) and EVI (Enhanced Vegetation Index), and other image processing operations. When analysing SCP, it was realized that a set of operations that are very useful in teaching remote sensing and image processing classes was lacking, such as the visualization of histograms, the application of filters, different image corrections, unsupervised classification and the computation of several environmental indices. The new set of operations included in the PI2GIS plugin can be divided into three groups: pre-processing, processing, and classification procedures. The application was tested on a Landsat 8 OLI image covering an area in the north of Portugal.
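
    The vegetation indices mentioned above reduce to simple band arithmetic; a minimal Python sketch follows. The band arrays and the standard EVI coefficients used here are illustrative assumptions, not PI2GIS code.

      # NDVI and EVI from NIR, red and blue reflectance bands (toy arrays).
      import numpy as np

      def ndvi(nir, red):
          return (nir - red) / np.clip(nir + red, 1e-6, None)      # guard against /0

      def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
          denom = nir + C1 * red - C2 * blue + L
          return G * (nir - red) / np.where(np.abs(denom) < 1e-6, 1e-6, denom)

      nir, red, blue = (np.random.rand(100, 100) for _ in range(3))
      print(ndvi(nir, red).mean(), evi(nir, red, blue).mean())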

  17. Design of a video system providing optimal visual information for controlling payload and experiment operations with television

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A program was conducted which included the design of a set of simplified simulation tasks, design of apparatus and breadboard TV equipment for task performance, and the implementation of a number of simulation tests. Performance measurements were made under controlled conditions and the results analyzed to permit evaluation of the relative merits (effectivity) of various TV systems. Burden factors were subsequently generated for each TV system to permit tradeoff evaluation of system characteristics against performance. For the general remote operation mission, the 2-view system is recommended. This system is characterized and the corresponding equipment specifications were generated.

  18. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed Central

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E.; Ramakrishnan, IV

    2017-01-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access—an early forerunner of true ubiquitous access—screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments. PMID:28782061

  19. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E; Ramakrishnan, I V

    2017-05-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access-an early forerunner of true ubiquitous access-screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.

  20. Vroom: designing an augmented environment for remote collaboration in digital cinema production

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; Cornish, Tracy

    2013-03-01

    As media technologies become increasingly affordable, compact and inherently networked, new generations of telecollaborative platforms continue to arise which integrate these new affordances. Virtual reality has been primarily concerned with creating simulations of environments that can transport participants to real or imagined spaces that replace the "real world". Meanwhile Augmented Reality systems have evolved to interleave objects from Virtual Reality environments into the physical landscape. Perhaps now there is a new class of systems that reverse this precept to enhance dynamic media landscapes and immersive physical display environments to enable intuitive data exploration through collaboration. Vroom (Virtual Room) is a next-generation reconfigurable tiled display environment in development at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego. Vroom enables freely scalable digital collaboratories, connecting distributed, high-resolution visualization resources for collaborative work in the sciences, engineering and the arts. Vroom transforms a physical space into an immersive media environment with large format interactive display surfaces, video teleconferencing and spatialized audio built on a highspeed optical network backbone. Vroom enables group collaboration for local and remote participants to share knowledge and experiences. Possible applications include: remote learning, command and control, storyboarding, post-production editorial review, high resolution video playback, 3D visualization, screencasting and image, video and multimedia file sharing. To support these various scenarios, Vroom features support for multiple user interfaces (optical tracking, touch UI, gesture interface, etc.), support for directional and spatialized audio, giga-pixel image interactivity, 4K video streaming, 3D visualization and telematic production. This paper explains the design process that has been utilized to make Vroom an accessible and intuitive immersive environment for remote collaboration specifically for digital cinema production.

  1. Large-mirror testing facility at the National Optical Astronomy Observatories.

    NASA Astrophysics Data System (ADS)

    Barr, L. D.; Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, C.; Roddier, F.

    1991-09-01

    A method for testing the surfaces of large mirrors has been developed to be used even when conditions of vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer that has the property of being a quasi-common-path method, although any means for obtaining interference fringes will do. The method uses a remotely operated CCD camera system to record the fringe pattern from the workpiece. The typical test is done with a camera exposure of about a millisecond to "freeze" the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. The method described provides the optician with complete numerical information and visual plots for the surface under test and the diffracted image the method will produce, all within a few minutes, to an accuracy of 0.01 μm measured peak-to-valley.
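
    The exposure-averaging idea described above is simple enough to sketch in a few lines of Python; `grab_exposure` is a hypothetical camera-readout callable, and the simulated fringe pattern is only a toy.

      # Average several millisecond exposures of the fringe pattern so that random
      # turbulence-induced disturbances tend to cancel out.
      import numpy as np

      def averaged_fringes(grab_exposure, n_frames=10):
          frames = [grab_exposure().astype(np.float64) for _ in range(n_frames)]
          return np.mean(frames, axis=0)

      # Toy usage with a simulated noisy cosine fringe pattern
      x = np.linspace(0, 20 * np.pi, 512)
      pattern = 0.5 * (1 + np.cos(x))[None, :] * np.ones((512, 1))
      grab = lambda: pattern + 0.2 * np.random.randn(512, 512)
      clean = averaged_fringes(grab)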

  2. a Rough Set Decision Tree Based Mlp-Cnn for Very High Resolution Remotely Sensed Image Classification

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Pan, X.; Zhang, S. Q.; Li, H. P.; Atkinson, P. M.

    2017-09-01

    Recent advances in remote sensing have witnessed a great amount of very high resolution (VHR) images acquired at sub-metre spatial resolution. These VHR remotely sensed data has post enormous challenges in processing, analysing and classifying them effectively due to the high spatial complexity and heterogeneity. Although many computer-aid classification methods that based on machine learning approaches have been developed over the past decades, most of them are developed toward pixel level spectral differentiation, e.g. Multi-Layer Perceptron (MLP), which are unable to exploit abundant spatial details within VHR images. This paper introduced a rough set model as a general framework to objectively characterize the uncertainty in CNN classification results, and further partition them into correctness and incorrectness on the map. The correct classification regions of CNN were trusted and maintained, whereas the misclassification areas were reclassified using a decision tree with both CNN and MLP. The effectiveness of the proposed rough set decision tree based MLP-CNN was tested using an urban area at Bournemouth, United Kingdom. The MLP-CNN, well capturing the complementarity between CNN and MLP through the rough set based decision tree, achieved the best classification performance both visually and numerically. Therefore, this research paves the way to achieve fully automatic and effective VHR image classification.

  3. A cloud mask methodology for high resolution remote sensing data combining information from high and medium resolution optical sensors

    NASA Astrophysics Data System (ADS)

    Sedano, Fernando; Kempeneers, Pieter; Strobl, Peter; Kucera, Jan; Vogt, Peter; Seebach, Lucia; San-Miguel-Ayanz, Jesús

    2011-09-01

    This study presents a novel cloud masking approach for high resolution remote sensing images in the context of land cover mapping. As an advantage to traditional methods, the approach does not rely on thermal bands and it is applicable to images from most high resolution earth observation remote sensing sensors. The methodology couples pixel-based seed identification and object-based region growing. The seed identification stage relies on pixel value comparison between high resolution images and cloud free composites at lower spatial resolution from almost simultaneously acquired dates. The methodology was tested taking SPOT4-HRVIR, SPOT5-HRG and IRS-LISS III as high resolution images and cloud free MODIS composites as reference images. The selected scenes included a wide range of cloud types and surface features. The resulting cloud masks were evaluated through visual comparison. They were also compared with ad-hoc independently generated cloud masks and with the automatic cloud cover assessment algorithm (ACCA). In general the results showed an agreement in detected clouds higher than 95% for clouds larger than 50 ha. The approach produced consistent results identifying and mapping clouds of different type and size over various land surfaces including natural vegetation, agriculture land, built-up areas, water bodies and snow.

  4. Remote Advanced Payload Test Rig (RAPTR) Portable Payload Test System for the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Calvert, John; Freas, George, II

    2017-01-01

    The RAPTR was developed to test ISS payloads for NASA. RAPTR is a simulation of the Command and Data Handling (C&DH) interfaces of the ISS (MIL-STD 1553B, Ethernet and TAXI) and is designed to facilitate rapid testing and deployment of payload experiments to the ISS. The ISS Program's goal is to reduce the amount of time it takes a payload developer to build, test and fly a payload, including payload software. The RAPTR meets this need with its user oriented, visually rich interface. Additionally, the Analog and Discrete (A&D) signals of the following payload types may be tested with RAPTR: (1) EXPRESS Sub Rack Payloads; (2) ELC payloads; (3) External Columbus payloads; (4) External Japanese Experiment Module (JEM) payloads. The automated payload configuration setup and payload data inspection infrastructure is found nowhere else in ISS payload test systems. Testing can be done with minimal human intervention and setup, as the RAPTR automatically monitors parameters in the data headers that are sent to, and come from the experiment under test.

  5. Monitoring, analysis and classification of vegetation and soil data collected by a small and lightweight hyperspectral imaging system

    NASA Astrophysics Data System (ADS)

    Mönnig, Carsten

    2014-05-01

    The increasing precision of modern farming systems requires a near-real-time monitoring of agricultural crops in order to estimate soil condition, plant health and potential crop yield. For large sized agricultural plots, satellite imagery or aerial surveys can be used at considerable costs and possible time delays of days or even weeks. However, for small to medium sized plots, these monitoring approaches are cost-prohibitive and difficult to assess. Therefore, we propose within the INTERREG IV A-Project SMART INSPECTORS (Smart Aerial Test Rigs with Infrared Spectrometers and Radar), a cost effective, comparably simple approach to support farmers with a small and lightweight hyperspectral imaging system to collect remotely sensed data in spectral bands in between 400 to 1700nm. SMART INSPECTORS includes the whole remote sensing processing chain of small scale remote sensing from sensor construction, data processing and ground truthing for analysis of the results. The sensors are mounted on a remotely controlled (RC) Octocopter, a fixed wing RC airplane as well as on a two-seated Autogyro for larger plots. The high resolution images up to 5cm on the ground include spectra of visible light, near and thermal infrared as well as hyperspectral imagery. The data will be analyzed using remote sensing software and a Geographic Information System (GIS). The soil condition analysis includes soil humidity, temperature and roughness. Furthermore, a radar sensor is envisaged for the detection of geomorphologic, drainage and soil-plant roughness investigation. Plant health control includes drought stress, vegetation health, pest control, growth condition and canopy temperature. Different vegetation and soil indices will help to determine and understand soil conditions and plant traits. Additional investigation might include crop yield estimation of certain crops like apples, strawberries, pasture land, etc. The quality of remotely sensed vegetation data will be tested with ground truthing tools like a spectrometer, visual inspection and ground control panel. The soil condition will also be monitored with a wireless sensor network installed on the examined plots of interest. Provided with this data, a farmer can respond immediately to potential threats with high local precision. In this presentation, preliminary results of hyperspectral images of distinctive vegetation cover and soil on different pasture test plots are shown. After an evaluation period, the whole processing chain will offer farmers a unique, near real- time, low cost solution for small to mid-sized agricultural plots in order to easily assess crop and soil quality and the estimation of harvest. SMART INSPECTORS remotely sensed data will form the basis for an input in a decision support system which aims to detect crop related issues in order to react quickly and efficiently, saving fertilizer, water or pesticides.

  6. Data-proximate Visualization via Unidata Cloud Technologies

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.; Weber, J.

    2016-12-01

    The rise in cloud computing, coupled with the growth of "Big Data", has lead to a migration away from local scientific data storage. The increasing size of remote scientific data sets increase, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service.The solution to this problem is a deceptively simple one; move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge now becomes creating tools which are cloud-ready.The solution to this challenge is provided by Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be.Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  7. Cloud-based data-proximate visualization and analysis

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2017-04-01

    The rise in cloud computing, coupled with the growth of "Big Data", has lead to a migration away from local scientific data storage. The increasing size of remote scientific data sets increase, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one; move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge now becomes creating tools which are cloud-ready. The solution to this challenge is provided by Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  8. A motorized ultrasound system for MRI-ultrasound fusion guided prostatectomy

    NASA Astrophysics Data System (ADS)

    Seifabadi, Reza; Xu, Sheng; Pinto, Peter; Wood, Bradford J.

    2016-03-01

    Purpose: This study presents MoTRUS, a motorized transrectal ultrasound system, to enable remote navigation of a transrectal ultrasound (TRUS) probe during da Vinci assisted prostatectomy. MoTRUS not only provides a stable platform to the ultrasound probe, but also allows the physician to navigate it remotely while sitting on the da Vinci console. This study also presents phantom feasibility study with the goal being intraoperative MRI-US image fusion capability to bring preoperative MR images to the operating room for the best visualization of the gland, boundaries, nerves, etc. Method: A two degree-of-freedom probe holder is developed to insert and rotate a bi-plane transrectal ultrasound transducer. A custom joystick is made to enable remote navigation of MoTRUS. Safety features have been considered to avoid inadvertent risks (if any) to the patient. Custom design software has been developed to fuse pre-operative MR images to intraoperative ultrasound images acquired by MoTRUS. Results: Remote TRUS probe navigation was evaluated on a patient after taking required consents during prostatectomy using MoTRUS. It took 10 min to setup the system in OR. MoTRUS provided similar capability in addition to remote navigation and stable imaging. No complications were observed. Image fusion was evaluated on a commercial prostate phantom. Electromagnetic tracking was used for the fusion. Conclusions: Motorized navigation of the TRUS probe during prostatectomy is safe and feasible. Remote navigation provides physician with a more precise and easier control of the ultrasound image while removing the burden of manual manipulation of the probe. Image fusion improved visualization of the prostate and boundaries in a phantom study.

  9. Remote telescope control of site testing with ASCOM

    NASA Astrophysics Data System (ADS)

    Ji, Kaifan; Liang, Bo; Peng, Yajie; Wang, Feng

    2012-04-01

    Remote telescope control is significant important for the astronomical site testing. Basing on ASCOM standard, a prototype of remote telescope control system has been implemented. In this paper, the details of the system design, both server end and client end, are introduced. We tested the prototype on a narrow-band dial-up networking and controlled a real remote telescope successfully. The result indicates that it is effective to control remote telescope and other devices with ASCOM.

  10. Remote sensing image denoising application by generalized morphological component analysis

    NASA Astrophysics Data System (ADS)

    Yu, Chong; Chen, Xiong

    2014-12-01

    In this paper, we introduced a remote sensing image denoising method based on generalized morphological component analysis (GMCA). This novel algorithm is the further extension of morphological component analysis (MCA) algorithm to the blind source separation framework. The iterative thresholding strategy adopted by GMCA algorithm firstly works on the most significant features in the image, and then progressively incorporates smaller features to finely tune the parameters of whole model. Mathematical analysis of the computational complexity of GMCA algorithm is provided. Several comparison experiments with state-of-the-art denoising algorithms are reported. In order to make quantitative assessment of algorithms in experiments, Peak Signal to Noise Ratio (PSNR) index and Structural Similarity (SSIM) index are calculated to assess the denoising effect from the gray-level fidelity aspect and the structure-level fidelity aspect, respectively. Quantitative analysis on experiment results, which is consistent with the visual effect illustrated by denoised images, has proven that the introduced GMCA algorithm possesses a marvelous remote sensing image denoising effectiveness and ability. It is even hard to distinguish the original noiseless image from the recovered image by adopting GMCA algorithm through visual effect.

  11. Assessment of visual landscape quality using IKONOS imagery.

    PubMed

    Ozkan, Ulas Yunus

    2014-07-01

    The assessment of visual landscape quality is of importance to the management of urban woodlands. Satellite remote sensing may be used for this purpose as a substitute for traditional survey techniques that are both labour-intensive and time-consuming. This study examines the association between the quality of the perceived visual landscape in urban woodlands and texture measures extracted from IKONOS satellite data, which features 4-m spatial resolution and four spectral bands. The study was conducted in the woodlands of Istanbul (the most important element of urban mosaic) lying along both shores of the Bosporus Strait. The visual quality assessment applied in this study is based on the perceptual approach and was performed via a survey of expressed preferences. For this purpose, representative photographs of real scenery were used to elicit observers' preferences. A slide show comprising 33 images was presented to a group of 153 volunteers (all undergraduate students), and they were asked to rate the visual quality of each on a 10-point scale (1 for very low visual quality, 10 for very high). Average visual quality scores were calculated for landscape. Texture measures were acquired using the two methods: pixel-based and object-based. Pixel-based texture measures were extracted from the first principle component (PC1) image. Object-based texture measures were extracted by using the original four bands. The association between image texture measures and perceived visual landscape quality was tested via Pearson's correlation coefficient. The analysis found a strong linear association between image texture measures and visual quality. The highest correlation coefficient was calculated between standard deviation of gray levels (SDGL) (one of the pixel-based texture measures) and visual quality (r = 0.82, P < 0.05). The results showed that perceived visual quality of urban woodland landscapes can be estimated by using texture measures extracted from satellite data in combination with appropriate modelling techniques.

  12. Remote sensing of nutrient deficiency in Lactuca sativa using neural networks for terrestrial and advanced life support applications

    NASA Astrophysics Data System (ADS)

    Sears, Edie Seldon

    2000-12-01

    A remote sensing study using reflectance and fluorescence spectra of hydroponically grown Lactuca sativa (lettuce) canopies was conducted. An optical receiver was designed and constructed to interface with a commercial fiber optic spectrometer for data acquisition. Optical parameters were varied to determine effects of field of view and distance to target on vegetation stress assessment over the test plant growth cycle. Feedforward backpropagation neural networks (NN) were implemented to predict the presence of canopy stress. Effects of spatial and spectral resolutions on stress predictions of the neural network were also examined. Visual inspection and fresh mass values failed to differentiate among controls, plants cultivated with 25% of the recommended concentration of phosphorous (P), and those cultivated with 25% nitrogen (N) based on fresh mass and visual inspection. The NN's were trained on input vectors created using reflectance and test day, fluorescence and test day, and reflectance, fluorescence, and test day. Four networks were created representing four levels of spectral resolution: 100-nm NN, 10-nm NN, 1-nm NN, and 0.1-nm NN. The 10-nm resolution was found to be sufficient for classifying extreme nitrogen deficiency in freestanding hydroponic lettuce. As a result of leaf angle and canopy structure broadband scattering intensity in the 700-nm to 1000-nm range was found to be the most useful portion of the spectrum in this study. More subtle effects of "greenness" and fluorescence emission were believed to be obscured by canopy structure and leaf orientation. As field of view was not as found to be as significant as originally believed, systems implementing higher repetitions over more uniformly oriented, i.e. smaller, flatter, target areas would provide for more discernible neural network input vectors. It is believed that this technique holds considerable promise for early detection of extreme nitrogen deficiency. Further research is recommended using stereoscopic digital cameras to quantify leaf area index, leaf shape, and leaf orientation as well as reflectance. Given this additional information fluorescence emission may also prove a more useful biological assay of freestanding vegetation.

  13. Optoelectronic microdevices for combined phototherapy

    NASA Astrophysics Data System (ADS)

    Zharov, Vladimir P.; Menyaev, Yulian A.; Hamaev, V. A.; Antropov, G. M.; Waner, Milton

    2000-03-01

    In photomedicine in some of cases radiation delivery to local zones through optical fibers can be changed for the direct placing of tiny optical sources like semiconductor microlasers or light diodes in required zones of ears, nostrils, larynx, nasopharynx cochlea or alimentary tract. Our study accentuates the creation of optoelectronic microdevices for local phototherapy and functional imaging by using reflected light. Phototherapeutic micromodule consist of the light source, microprocessor and miniature optics with different kind of power supply: from autonomous with built-in batteries to remote supply by using pulsed magnetic field and supersmall coils. The developed prototype photomodule has size (phi) 8X16 mm and work duration with built-in battery and light diode up several hours at the average power from several tenths of mW to few mW. Preliminary clinical tests developed physiotherapeutic micrimodules in stomatology for treating the inflammation and in otolaryngology for treating tonsillitis and otitis are presented. The developed implanted electro- optical sources with typical size (phi) 4X0,8 mm and with remote supply were used for optical stimulation of photosensitive retina structure and electrostimulation of visual nerve. In this scheme the superminiature coil with 30 electrical integrated levels was used. Such devices were implanted in eyes of 175 patients with different vision problems during clinical trials in Institute of Eye's Surgery in Moscow. For functional imaging of skin layered structure LED arrays coupled photodiodes arrays were developed. The possibilities of this device for study drug diffusion and visualization small veins are discussed.

  14. Plasmodium species differentiation by non-expert on-line volunteers for remote malaria field diagnosis.

    PubMed

    Ortiz-Ruiz, Alejandra; Postigo, María; Gil-Casanova, Sara; Cuadrado, Daniel; Bautista, José M; Rubio, José Miguel; Luengo-Oroz, Miguel; Linares, María

    2018-01-30

    Routine field diagnosis of malaria is a considerable challenge in rural and low resources endemic areas mainly due to lack of personnel, training and sample processing capacity. In addition, differential diagnosis of Plasmodium species has a high level of misdiagnosis. Real time remote microscopical diagnosis through on-line crowdsourcing platforms could be converted into an agile network to support diagnosis-based treatment and malaria control in low resources areas. This study explores whether accurate Plasmodium species identification-a critical step during the diagnosis protocol in order to choose the appropriate medication-is possible through the information provided by non-trained on-line volunteers. 88 volunteers have performed a series of questionnaires over 110 images to differentiate species (Plasmodium falciparum, Plasmodium ovale, Plasmodium vivax, Plasmodium malariae, Plasmodium knowlesi) and parasite staging from thin blood smear images digitalized with a smartphone camera adapted to the ocular of a conventional light microscope. Visual cues evaluated in the surveys include texture and colour, parasite shape and red blood size. On-line volunteers are able to discriminate Plasmodium species (P. falciparum, P. malariae, P. vivax, P. ovale, P. knowlesi) and stages in thin-blood smears according to visual cues observed on digitalized images of parasitized red blood cells. Friendly textual descriptions of the visual cues and specialized malaria terminology is key for volunteers learning and efficiency. On-line volunteers with short-training are able to differentiate malaria parasite species and parasite stages from digitalized thin smears based on simple visual cues (shape, size, texture and colour). While the accuracy of a single on-line expert is far from perfect, a single parasite classification obtained by combining the opinions of multiple on-line volunteers over the same smear, could improve accuracy and reliability of Plasmodium species identification in remote malaria diagnosis.

  15. Remote Handled WIPP Canisters at Los Alamos National Laboratory Characterized for Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, J.; Gonzales, W.

    2007-07-01

    The Los Alamos National Laboratory (LANL) is pursuing retrieval, transportation, and disposal of 16 remote handled transuranic waste canisters stored below ground in shafts since 1994. These canisters were retrievably stored in the shafts to await Nuclear Regulatory Commission certification of the Model Number RH-TRU 72B transportation cask and authorization of the Waste Isolation Pilot Plant (WIPP) to accept the canisters for disposal. Retrieval planning included radiological characterization and visual inspection of the canisters to confirm historical records, verify container integrity, determine proper personnel protection for the retrieval operations, provide radiological dose and exposure rate data for retrieval operations, andmore » to provide exterior radiological contamination data. The radiological characterization and visual inspection of the canisters was performed in May 2006. The effort required the development of remote techniques and equipment due to the potential for personnel exposure to radiological doses approaching 300 R/hr. Innovations included the use of two nested 1.5 meter (m) (5-feet [ft]) long concrete culvert pipes (1.1-m [42 inch (in.)] and 1.5-m [60-in] diameter, respectively) as radiological shielding and collapsible electrostatic dusting wands to collect radiological swipe samples from the annular space between the canister and shaft wall. Visual inspection indicated that the canisters are in good condition with little or no rust, the welded seams are intact, and ten of the canisters include hydrogen gas sampling equipment on the pintle that will have to be removed prior to retrieval. The visual inspection also provided six canister identification numbers that matched historical storage records. The exterior radiological data indicated alpha and beta contamination below LANL release criteria and radiological dose and exposure rates lower than expected based upon historical data and modeling of the canister contents. (authors)« less

  16. Real-time visual communication to aid disaster recovery in a multi-segment hybrid wireless networking system

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Wang, Qi; Grecos, Christos

    2012-06-01

    When natural disasters or other large-scale incidents occur, obtaining accurate and timely information on the developing situation is vital to effective disaster recovery operations. High-quality video streams and high-resolution images, if available in real time, would provide an invaluable source of current situation reports to the incident management team. Meanwhile, a disaster often causes significant damage to the communications infrastructure. Therefore, another essential requirement for disaster management is the ability to rapidly deploy a flexible incident area communication network. Such a network would facilitate the transmission of real-time video streams and still images from the disrupted area to remote command and control locations. In this paper, a comprehensive end-to-end video/image transmission system between an incident area and a remote control centre is proposed and implemented, and its performance is experimentally investigated. In this study a hybrid multi-segment communication network is designed that seamlessly integrates terrestrial wireless mesh networks (WMNs), distributed wireless visual sensor networks, an airborne platform with video camera balloons, and a Digital Video Broadcasting- Satellite (DVB-S) system. By carefully integrating all of these rapidly deployable, interworking and collaborative networking technologies, we can fully exploit the joint benefits provided by WMNs, WSNs, balloon camera networks and DVB-S for real-time video streaming and image delivery in emergency situations among the disaster hit area, the remote control centre and the rescue teams in the field. The whole proposed system is implemented in a proven simulator. Through extensive simulations, the real-time visual communication performance of this integrated system has been numerically evaluated, towards a more in-depth understanding in supporting high-quality visual communications in such a demanding context.

  17. A Community-Based IoT Personalized Wireless Healthcare Solution Trial.

    PubMed

    Catherwood, Philip A; Steele, David; Little, Mike; Mccomb, Stephen; Mclaughlin, James

    2018-01-01

    This paper presents an advanced Internet of Things point-of-care bio-fluid analyzer; a LoRa/Bluetooth-enabled electronic reader for biomedical strip-based diagnostics system for personalized monitoring. We undertake test simulations (technology trial without patient subjects) to demonstrate potential of long-range analysis, using a disposable test 'key' and companion Android app to form a diagnostic platform suitable for remote point-of-care screening for urinary tract infection (UTI). The 868 MHz LoRaWAN-enabled personalized monitor demonstrated sound potential with UTI test results being correctly diagnosed and transmitted to a remote secure cloud server in every case. Tests ranged over distances of 1.1-6.0 Km with radio path losses from 119-141 dB. All tests conducted were correctly and robustly received at the base station and relayed to the secure server for inspection. The UTI test strips were visually inspected for correct diagnosis based on color change and verified as 100% accurate. Results from testing across a number of regions indicate that such an Internet of Things medical solution is a robust and simple way to deliver next generation community-based smart diagnostics and disease management to best benefit patients and clinical staff alike. This significant step can be applied to any type of home or region, particularly those lacking suitable mobile signals, broadband connections, or even landlines. It brings subscription-free long-range bio-telemetry to healthcare providers and offers savings on regular clinician home visits or frequent clinic visits by the chronically ill. This paper highlights practical hurdles in establishing an Internet of Medical Things network, assisting informed deployment of similar future systems.

  18. A Community-Based IoT Personalized Wireless Healthcare Solution Trial

    PubMed Central

    Steele, David; Little, Mike; Mccomb, Stephen; Mclaughlin, James

    2018-01-01

    This paper presents an advanced Internet of Things point-of-care bio-fluid analyzer; a LoRa/Bluetooth-enabled electronic reader for biomedical strip-based diagnostics system for personalized monitoring. We undertake test simulations (technology trial without patient subjects) to demonstrate potential of long-range analysis, using a disposable test ‘key’ and companion Android app to form a diagnostic platform suitable for remote point-of-care screening for urinary tract infection (UTI). The 868 MHz LoRaWAN-enabled personalized monitor demonstrated sound potential with UTI test results being correctly diagnosed and transmitted to a remote secure cloud server in every case. Tests ranged over distances of 1.1–6.0 Km with radio path losses from 119–141 dB. All tests conducted were correctly and robustly received at the base station and relayed to the secure server for inspection. The UTI test strips were visually inspected for correct diagnosis based on color change and verified as 100% accurate. Results from testing across a number of regions indicate that such an Internet of Things medical solution is a robust and simple way to deliver next generation community-based smart diagnostics and disease management to best benefit patients and clinical staff alike. This significant step can be applied to any type of home or region, particularly those lacking suitable mobile signals, broadband connections, or even landlines. It brings subscription-free long-range bio-telemetry to healthcare providers and offers savings on regular clinician home visits or frequent clinic visits by the chronically ill. This paper highlights practical hurdles in establishing an Internet of Medical Things network, assisting informed deployment of similar future systems. PMID:29888145

  19. The Impact of Visual Disability on the Quality of Life of Older Persons in Rural Northeast Thailand

    ERIC Educational Resources Information Center

    La Grow, Steven; Sudnongbua, Supaporn; Boddy, Julie

    2011-01-01

    A high rate of self-reported visual disability was found among a sample of persons aged 60 and older in the course of a study that assessed the impact of feelings of abandonment among older persons in a remote rural area in northeast Thailand (Sudnongbua, La Grow, & Boddy, 2010). This study assessed the impact of self-reported visual…

  20. Adaptation to implied tilt: extensive spatial extrapolation of orientation gradients

    PubMed Central

    Roach, Neil W.; Webb, Ben S.

    2013-01-01

    To extract the global structure of an image, the visual system must integrate local orientation estimates across space. Progress is being made toward understanding this integration process, but very little is known about whether the presence of structure exerts a reciprocal influence on local orientation coding. We have previously shown that adaptation to patterns containing circular or radial structure induces tilt-aftereffects (TAEs), even in locations where the adapting pattern was occluded. These spatially “remote” TAEs have novel tuning properties and behave in a manner consistent with adaptation to the local orientation implied by the circular structure (but not physically present) at a given test location. Here, by manipulating the spatial distribution of local elements in noisy circular textures, we demonstrate that remote TAEs are driven by the extrapolation of orientation structure over remarkably large regions of visual space (more than 20°). We further show that these effects are not specific to adapting stimuli with polar orientation structure, but require a gradient of orientation change across space. Our results suggest that mechanisms of visual adaptation exploit orientation gradients to predict the local pattern content of unfilled regions of space. PMID:23882243

  1. Accessing and Utilizing Remote Sensing Data for Vectorborne Infectious Diseases Surveillance and Modeling

    NASA Technical Reports Server (NTRS)

    Kiang, Richard; Adimi, Farida; Kempler, Steven

    2008-01-01

    Background: The transmission of vectorborne infectious diseases is often influenced by environmental, meteorological and climatic parameters, because the vector life cycle depends on these factors. For example, the geophysical parameters relevant to malaria transmission include precipitation, surface temperature, humidity, elevation, and vegetation type. Because these parameters are routinely measured by satellites, remote sensing is an important technological tool for predicting, preventing, and containing a number of vectorborne infectious diseases, such as malaria, dengue, West Nile virus, etc. Methods: A variety of NASA remote sensing data can be used for modeling vectorborne infectious disease transmission. We will discuss both the well known and less known remote sensing data, including Landsat, AVHRR (Advanced Very High Resolution Radiometer), MODIS (Moderate Resolution Imaging Spectroradiometer), TRMM (Tropical Rainfall Measuring Mission), ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer), EO-1 (Earth Observing One) ALI (Advanced Land Imager), and SIESIP (Seasonal to Interannual Earth Science Information Partner) dataset. Giovanni is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center. It provides a simple and intuitive way to visualize, analyze, and access vast amounts of Earth science remote sensing data. After remote sensing data is obtained, a variety of techniques, including generalized linear models and artificial intelligence oriented methods, t 3 can be used to model the dependency of disease transmission on these parameters. Results: The processes of accessing, visualizing and utilizing precipitation data using Giovanni, and acquiring other data at additional websites are illustrated. Malaria incidence time series for some parts of Thailand and Indonesia are used to demonstrate that malaria incidences are reasonably well modeled with generalized linear models and artificial intelligence based techniques. Conclusions: Remote sensing data relevant to the transmission of vectorborne infectious diseases can be conveniently accessed at NASA and some other websites. These data are useful for vectorborne infectious disease surveillance and modeling.

  2. Global change research related to the Earth's energy and hydrologic cycle

    NASA Technical Reports Server (NTRS)

    Perkey, Donald J.

    1994-01-01

    The following are discussed: Geophysical Modeling and Processes; Land Surface Processes and Atmospheric Interactions; Remote Sensing Technology and Geophysical Retrievals; and Scientific Data Management and Visual Analysis.

  3. Remote sensing techniques for the detection of soil erosion and the identification of soil conservation practices

    NASA Technical Reports Server (NTRS)

    Pelletier, R. E.; Griffin, R. H.

    1985-01-01

    The following paper is a summary of a number of techniques initiated under the AgRISTARS (Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing) project for the detection of soil degradation caused by water erosion and the identification of soil conservation practices for resource inventories. Discussed are methods to utilize a geographic information system to determine potential soil erosion through a USLE (Universal Soil Loss Equation) model; application of the Kauth-Thomas Transform to detect present erosional status; and the identification of conservation practices through visual interpretation and a variety of enhancement procedures applied to digital remotely sensed data.

  4. Feasibility of remote administration of the Fundamentals of Laparoscopic Surgery (FLS) skills test.

    PubMed

    Okrainec, Allan; Vassiliou, Melina; Kapoor, Andrew; Pitzul, Kristen; Henao, Oscar; Kaneva, Pepa; Jackson, Timothy; Ritter, E Matt

    2013-11-01

    Fundamentals of Laparoscopic Surgery (FLS) certification testing currently is offered at accredited test centers or at select surgical conferences. Maintaining these test centers requires considerable investment in human and financial resources. Additionally, it can be challenging for individuals outside North America to become FLS certified. The objective of this pilot study was to assess the feasibility of remotely administering and scoring the FLS examination using live videoconferencing compared with standard onsite testing. This parallel mixed-methods study used both FLS scoring data and participant feedback to determine the barriers to feasibility of remote proctoring for the FLS examination. Participants were tested at two accredited FLS testing centers. An official FLS proctor administered and scored the FLS exam remotely while another onsite proctor provided a live score of participants' performance. Participant feedback was collected during testing. Interrater reliabilities of onsite and remote FLS scoring data were compared using intraclass correlation coefficients (ICCs). Participant feedback was analyzed using modified grounded theory to identify themes for barriers to feasibility. The scores of the remote and onsite proctors showed excellent interrater reliability in the total FLS (ICC 0.995, CI [0.985-0.998]). Several barriers led to critical errors in remote scoring, but most were accompanied by a solution incorporated into the study protocol. The most common barrier was the chain of custody for exam accessories. The results of this pilot study suggest that remote administration of the FLS has the potential to decrease costs without altering test-taker scores or exam validity. Further research is required to validate protocols for remote and onsite proctors and to direct execution of these protocols in a controlled environment identical to current FLS test administration.

  5. Study on identifying deciduous forest by the method of feature space transformation

    NASA Astrophysics Data System (ADS)

    Zhang, Xuexia; Wu, Pengfei

    2009-10-01

    The thematic remotely sensed information extraction is always one of puzzling nuts which the remote sensing science faces, so many remote sensing scientists devotes diligently to this domain research. The methods of thematic information extraction include two kinds of the visual interpretation and the computer interpretation, the developing direction of which is intellectualization and comprehensive modularization. The paper tries to develop the intelligent extraction method of feature space transformation for the deciduous forest thematic information extraction in Changping district of Beijing city. The whole Chinese-Brazil resources satellite images received in 2005 are used to extract the deciduous forest coverage area by feature space transformation method and linear spectral decomposing method, and the result from remote sensing is similar to woodland resource census data by Chinese forestry bureau in 2004.

  6. Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.

    PubMed

    Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar

    2012-01-01

    Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow certain interaction in the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer-pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.

  7. Web-GIS visualisation of permafrost-related Remote Sensing products for ESA GlobPermafrost

    NASA Astrophysics Data System (ADS)

    Haas, A.; Heim, B.; Schaefer-Neth, C.; Laboor, S.; Nitze, I.; Grosse, G.; Bartsch, A.; Kaab, A.; Strozzi, T.; Wiesmann, A.; Seifert, F. M.

    2016-12-01

    The ESA GlobPermafrost (www.globpermafrost.info) provides a remote sensing service for permafrost research and applications. The service comprises of data product generation for various sites and regions as well as specific infrastructure allowing overview and access to datasets. Based on an online user survey conducted within the project, the user community extensively applies GIS software to handle remote sensing-derived datasets and requires preview functionalities before accessing them. In response, we develop the Permafrost Information System PerSys which is conceptualized as an open access geospatial data dissemination and visualization portal. PerSys will allow visualisation of GlobPermafrost raster and vector products such as land cover classifications, Landsat multispectral index trend datasets, lake and wetland extents, InSAR-based land surface deformation maps, rock glacier velocity fields, spatially distributed permafrost model outputs, and land surface temperature datasets. The datasets will be published as WebGIS services relying on OGC-standardized Web Mapping Service (WMS) and Web Feature Service (WFS) technologies for data display and visualization. The WebGIS environment will be hosted at the AWI computing centre where a geodata infrastructure has been implemented comprising of ArcGIS for Server 10.4, PostgreSQL 9.2 and a browser-driven data viewer based on Leaflet (http://leafletjs.com). Independently, we will provide an `Access - Restricted Data Dissemination Service', which will be available to registered users for testing frequently updated versions of project datasets. PerSys will become a core project of the Arctic Permafrost Geospatial Centre (APGC) within the ERC-funded PETA-CARB project (www.awi.de/petacarb). The APGC Data Catalogue will contain all final products of GlobPermafrost, allow in-depth dataset search via keywords, spatial and temporal coverage, data type, etc., and will provide DOI-based links to the datasets archived in the long-term, open access PANGAEA data repository.

  8. The DELTA MONSTER: An RPV designed to investigate the aerodynamics of a delta wing platform

    NASA Technical Reports Server (NTRS)

    Connolly, Kristen; Flynn, Mike; Gallagher, Randy; Greek, Chris; Kozlowski, Marc; Mcdonald, Brian; Mckenna, Matt; Sellar, Rich; Shearon, Andy

    1989-01-01

    The mission requirements for the performance of aerodynamic tests on a delta wind planform posed some problems, these include aerodynamic interference; structural support; data acquisition and transmission instrumentation; aircraft stability and control; and propulsion implementation. To eliminate the problems of wall interference, free stream turbulence, and the difficulty of achieving dynamic similarity between the test and actual flight aircraft that are associated with aerodynamic testing in wind tunnels, the concept of the remotely piloted vehicle which can perform a basic aerodynamic study on a delta wing was the main objective for the Green Mission - the Delta Monster. The basic aerodynamic studies were performed on a delta wing with a sweep angle greater than 45 degrees. These tests were performed at various angles of attack and Reynolds numbers. The delta wing was instrumented to determine the primary leading edge vortex formation and location, using pressure measurements and/or flow visualization. A data acquisition system was provided to collect all necessary data.

  9. A preliminary test of the application of the Lightning Detection and Ranging System (LDAR) as a thunderstorm warning and location device for the FHA including a correlation with updrafts, turbulence, and radar precipitation echoes

    NASA Technical Reports Server (NTRS)

    Poehler, H. A.

    1978-01-01

    Results of a test of the use of a Lightning Detection and Ranging (LDAR) remote display in the Patrick AFB RAPCON facility are presented. Agreement between LDAR and radar precipitation echoes of the RAPCON radar was observed, as well as agreement between LDAR and pilot's visual observations of lightning flashes. A more precise comparison between LDAR and KSC based radars is achieved by the superposition of LDAR precipitation echoes. Airborne measurements of updrafts and turbulence by an armored T-28 aircraft flying through the thunderclouds are correlated with LDAR along the flight path. Calibration and measurements of the accuracy of the LDAR System are discussed, and the extended range of the system is illustrated.

  10. Field Test on the Feasibility of Remoting HF Antenna with Fiber Optics

    DTIC Science & Technology

    2008-07-31

    Naval Research Laboratory Washington, DC 20375-5320 NRL/MR/5652--08-9137 Field Test on the Feasibility of Remoting HF Antenna with Fiber Optics July...NUMBER (include area code) b. ABSTRACT c. THIS PAGE 18. NUMBER OF PAGES 17. LIMITATION OF ABSTRACT Field Test on the Feasibility of Remoting HF Antenna...optic link was employed to remote a high-frequency ( HF , 2-30 MHz) direction-finding (DF) array. The test link comprised a seven-element “L” array

  11. Intraoperative Cochlear Implant Device Testing Utilizing an Automated Remote System: A Prospective Pilot Study.

    PubMed

    Lohmann, Amanda R; Carlson, Matthew L; Sladen, Douglas P

    2018-03-01

    Intraoperative cochlear implant device testing provides valuable information regarding device integrity, electrode position, and may assist with determining initial stimulation settings. Manual intraoperative device testing during cochlear implantation requires the time and expertise of a trained audiologist. The purpose of the current study is to investigate the feasibility of using automated remote intraoperative cochlear implant reverse telemetry testing as an alternative to standard testing. Prospective pilot study evaluating intraoperative remote automated impedance and Automatic Neural Response Telemetry (AutoNRT) testing in 34 consecutive cochlear implant surgeries using the Intraoperative Remote Assistant (Cochlear Nucleus CR120). In all cases, remote intraoperative device testing was performed by trained operating room staff. A comparison was made to the "gold standard" of manual testing by an experienced cochlear implant audiologist. Electrode position and absence of tip fold-over was confirmed using plain film x-ray. Automated remote reverse telemetry testing was successfully completed in all patients. Intraoperative x-ray demonstrated normal electrode position without tip fold-over. Average impedance values were significantly higher using standard testing versus CR120 remote testing (standard mean 10.7 kΩ, SD 1.2 vs. CR120 mean 7.5 kΩ, SD 0.7, p < 0.001). There was strong agreement between standard manual testing and remote automated testing with regard to the presence of open or short circuits along the array. There were, however, two cases in which standard testing identified an open circuit, when CR120 testing showed the circuit to be closed. Neural responses were successfully obtained in all patients using both systems. There was no difference in basal electrode responses (standard mean 195.0 μV, SD 14.10 vs. CR120 194.5 μV, SD 14.23; p = 0.7814); however, more favorable (lower μV amplitude) results were obtained with the remote automated system in the apical 10 electrodes (standard 185.4 μV, SD 11.69 vs. CR120 177.0 μV, SD 11.57; p value < 0.001). These preliminary data demonstrate that intraoperative cochlear implant device testing using a remote automated system is feasible. This system may be useful for cochlear implant programs with limited audiology support or for programs looking to streamline intraoperative device testing protocols. Future studies with larger patient enrollment are required to validate these promising, but preliminary, findings.

  12. Studies to design and develop improved remote manipulator systems

    NASA Technical Reports Server (NTRS)

    Hill, J. W.; Sword, A. J.

    1973-01-01

    Remote manipulator control considered is based on several levels of automatic supervision which derives manipulator commands from an analysis of sensor states and task requirements. Principle sensors are manipulator joint position, tactile, and currents. The tactile sensor states can be displayed visually in perspective or replicated in the operator's control handle of perceived by the automatic supervisor. Studies are reported on control organization, operator performance and system performance measures. Unusual hardware and software details are described.

  13. New orientation and accessibility option for persons with visual impairment: transportation applications for remote infrared audible signage.

    PubMed

    Crandall, William; Bentzen, Billie Louise; Myers, Linda; Brabyn, John

    2001-05-01

    BACKGROUND: For a blind or visually impaired person, a vital prerequisite to accessing any feature of the built environment is being able to find this feature. Braille signs, even where available, do not replace the functions of print signage because they cannot be read from a distance. Remotely readable infrared signs utilise spoken infrared message transmissions to label key environmental features, so that a blind person with a suitable receiver can locate and identify them from a distance. METHODS: Three problems that are among the most challenging and dangerous faced by blind travellers are negotiating complex transit stations, locating bus stops and safely and efficiently crossing light-controlled intersections. We report the results of human factors studies using a remote infrared audible sign system (RIAS), Talking Signs(R), in these critical tasks, examining issues such as the amount of training needed to use the system, its impact on performance and safety, benefits for different population subgroups and user opinions of its value. RESULTS: Results are presented in the form of both objective performance measures and in subjects' ratings of the usefulness of the system in performing these tasks. Findings are that blind people can quickly and easily learn to use remote infrared audible signage effectively and that its use improves travel safety, efficiency and independence.? CONCLUSIONS: The technology provides equal access to a wide variety of public facilities.

  14. Remote Viewer for Maritime Robotics Software

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki; Wolf, Michael; Huntsberger, Terrance L.; Howard, Andrew B.

    2013-01-01

    This software is a viewer program for maritime robotics software that provides a 3D visualization of the boat pose, its position history, ENC (Electrical Nautical Chart) information, camera images, map overlay, and detected tracks.

  15. Toward interactive search in remote sensing imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, Reid B; Hush, Do; Harvey, Neal

    2010-01-01

    To move from data to information in almost all science and defense applications requires a human-in-the-loop to validate information products, resolve inconsistencies, and account for incomplete and potentially deceptive sources of information. This is a key motivation for visual analytics which aims to develop techniques that complement and empower human users. By contrast, the vast majority of algorithms developed in machine learning aim to replace human users in data exploitation. In this paper we describe a recently introduced machine learning problem, called rare category detection, which may be a better match to visual analytic environments. We describe a new designmore » criteria for this problem, and present comparisons to existing techniques with both synthetic and real-world datasets. We conclude by describing an application in broad-area search of remote sensing imagery.« less

  16. Remote vs. head-mounted eye-tracking: a comparison using radiologists reading mammograms

    NASA Astrophysics Data System (ADS)

    Mello-Thoms, Claudia; Gur, David

    2007-03-01

    Eye position monitoring has been used for decades in Radiology in order to determine how radiologists interpret medical images. Using these devices several discoveries about the perception/decision making process have been made, such as the importance of comparisons of perceived abnormalities with selected areas of the background, the likelihood that a true lesion will attract visual attention early in the reading process, and the finding that most misses attract prolonged visual dwell, often comparable to dwell in the location of reported lesions. However, eye position tracking is a cumbersome process, which often requires the observer to wear a helmet gear which contains the eye tracker per se and a magnetic head tracker, which allows for the computation of head position. Observers tend to complain of fatigue after wearing the gear for a prolonged time. Recently, with the advances made to remote eye-tracking, the use of head-mounted systems seemed destined to become a thing of the past. In this study we evaluated a remote eye tracking system, and compared it to a head-mounted system, as radiologists read a case set of one-view mammograms on a high-resolution display. We compared visual search parameters between the two systems, such as time to hit the location of the lesion for the first time, amount of dwell time in the location of the lesion, total time analyzing the image, etc. We also evaluated the observers' impressions of both systems, and what their perceptions were of the restrictions of each system.

  17. Leg ischaemia before circulatory arrest alters brain leucocyte count and respiratory chain redox state.

    PubMed

    Yannopoulos, Fredrik S; Arvola, Oiva; Haapanen, Henri; Herajärvi, Johanna; Miinalainen, Ilkka; Jensen, Hanna; Kiviluoma, Kai; Juvonen, Tatu

    2014-03-01

    Remote ischaemic preconditioning and its neuroprotective abilities are currently under investigation and the method has shown significant effects in several small and large animal studies. In our previous studies, leucocyte filtration during cardiopulmonary bypass reduced cerebrocortical adherent leucocyte count and mitigated cerebral damage after hypothermic circulatory arrest (HCA) in piglets. This study aimed to obtain and assess direct visual data of leucocyte behaviour in cerebral vessels after hypothermic circulatory arrest following remote ischaemic preconditioning. Twelve native stock piglets were randomized into a remote ischaemic preconditioning group (n = 6) and a control group (n = 6). The intervention group underwent hind-leg ischaemia, whereas the control group received a sham-treatment before a 60-min period of hypothermic circulatory arrest. An intravital microscope was used to obtain measurements from the cerebrocortical vessel in vivo. It included three sets of filters: a violet filter to visualize microvascular perfusion and vessel diameter, a green filter for visualization of rhodamine-labelled leucocytes and an ultraviolet filter for reduced nicotinamide adenine dinucleotide (NADH) analysis. The final magnification on the microscope was 400×. After the experiment, cerebral and cerebellar biopsies were collected and analysed with a transmission electron microscope by a blinded analyst. In the transmission electron microscope analysis, the entire intervention group had normal, unaffected rough endoplasmic reticula in their cerebellar tissue, whereas the control group had a mean score of 1.06 (standard deviation 0.41) (P = 0.026). The measured amount of adherent leucocytes was lower in the remote ischaemic preconditioning group. The difference was statistically significant at 5, 15 and 45 min after circulatory arrest. Statistically significant differences were also seen in the recovery phase at 90 and 120 min after reperfusion. Nicotinamide adenine dinucleotide autofluorescence had statistically significant differences at 10 min after cooling and at 120 and 180 min after hypothermic circulatory arrest. Remote ischaemic preconditioning seems to provide better mitochondrial respiratory chain function as indicated by the higher NADH content. It simultaneously provides a reduction of adherent leucocytes in cerebral vessels after hypothermic circulatory arrest. Additionally, it might provide some degree of cellular organ preservation as implied by the electron microscopy results.

  18. Identification and visualization of dominant patterns and anomalies in remotely sensed vegetation phenology using a parallel tool for principal components analysis

    Treesearch

    Richard Tran Mills; Jitendra Kumar; Forrest M. Hoffman; William W. Hargrove; Joseph P. Spruce; Steven P. Norman

    2013-01-01

    We investigated the use of principal components analysis (PCA) to visualize dominant patterns and identify anomalies in a multi-year land surface phenology data set (231 m × 231 m normalized difference vegetation index (NDVI) values derived from the Moderate Resolution Imaging Spectroradiometer (MODIS)) used for detecting threats to forest health in the conterminous...
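
    The record above applies PCA to a multi-year NDVI stack. Below is a minimal serial sketch of that idea via SVD (the authors used a parallel PCA tool; the array shapes and names here are assumptions), with anomalies showing up as outliers in the per-pixel component scores.

    ```python
    # Hedged sketch: PCA of a land-surface-phenology stack via SVD.
    import numpy as np

    def phenology_pca(ndvi, n_components=3):
        """ndvi: (n_pixels, n_timesteps) NDVI time series, one row per pixel."""
        X = ndvi - ndvi.mean(axis=0, keepdims=True)      # center each time step
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        scores = U[:, :n_components] * S[:n_components]  # per-pixel component scores
        explained = (S ** 2) / (S ** 2).sum()            # variance explained per PC
        return scores, Vt[:n_components], explained[:n_components]

    # Toy example: 10 000 pixels, 46 composites per year for 8 years
    ndvi = np.random.rand(10_000, 46 * 8).astype(np.float32)
    scores, loadings, explained = phenology_pca(ndvi)
    print(explained)   # anomalous pixels appear as outliers in the score maps
    ```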

  19. Integrated Web-Based Access to and use of Satellite Remote Sensing Data for Improved Decision Making in Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Teng, W.; Chiu, L.; Kempler, S.; Liu, Z.; Nadeau, D.; Rui, H.

    2006-12-01

    Using NASA satellite remote sensing data from multiple sources for hydrologic applications can be a daunting task and requires a detailed understanding of the data's internal structure and physical implementation. Gaining this understanding and applying it to data reduction is a time-consuming task that must be undertaken before the core investigation can begin. In order to facilitate such investigations, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure or "Giovanni," which supports a family of Web interfaces (instances) that allow users to perform interactive visualization and analysis online without downloading any data. Two such Giovanni instances are particularly relevant to hydrologic applications: the Tropical Rainfall Measuring Mission (TRMM) Online Visualization and Analysis System (TOVAS) and the Agricultural Online Visualization and Analysis System (AOVAS), both highly popular and widely used for a variety of applications, including those related to several NASA Applications of National Priority, such as Agricultural Efficiency, Disaster Management, Ecological Forecasting, Homeland Security, and Public Health. Dynamic, context-sensitive Web services provided by TOVAS and AOVAS enable users to seamlessly access NASA data from within, and deeply integrate the data into, their local client environments. One example is between TOVAS and Florida International University's TerraFly, a Web-enabled system that serves a broad segment of the research and applications community, by facilitating access to various textual, remotely sensed, and vector data. Another example is between AOVAS and the U.S. Department of Agriculture Foreign Agricultural Service (USDA FAS)'s Crop Explorer, the primary decision support tool used by FAS to monitor the production, supply, and demand of agricultural commodities worldwide. AOVAS is also part of GES DISC's Agricultural Information System (AIS), which can operationally provide satellite remote sensing data products (e.g., near-real-time rainfall) and analysis services to agricultural users. AIS enables the remote, interoperable access to distributed data, by using the GrADS-Data Server (GDS) and the Open Geospatial Consortium (OGC)-compliant MapServer. The latter allows the access of AIS data from any OGC-compliant client, such as the Earth-Sun System Gateway (ESG) or Google Earth. The Giovanni system is evolving towards a Service-Oriented Architecture and is highly customizable (e.g., adding new products or services), thus availing the hydrologic applications user community of Giovanni's simple-to-use and powerful capabilities to improve decision-making.

  20. Application-Controlled Demand Paging for Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Cox, Michael; Ellsworth, David; Kutler, Paul (Technical Monitor)

    1997-01-01

    In the area of scientific visualization, input data sets are often very large. In visualization of Computational Fluid Dynamics (CFD) in particular, input data sets today can surpass 100 Gbytes, and are expected to scale with the ability of supercomputers to generate them. Some visualization tools already partition large data sets into segments, and load appropriate segments as they are needed. However, this does not remove the problem for two reasons: 1) there are data sets for which even the individual segments are too large for the largest graphics workstations, 2) many practitioners do not have access to workstations with the memory capacity required to load even a segment, especially since the state-of-the-art visualization tools tend to be developed by researchers with much more powerful machines. When the size of the data that must be accessed is larger than the size of memory, some form of virtual memory is simply required. This may be by segmentation, paging, or by paged segments. In this paper we demonstrate that complete reliance on operating system virtual memory for out-of-core visualization leads to poor performance. We then describe a paged segment system that we have implemented, and explore the principles of memory management that can be employed by the application for out-of-core visualization. We show that application control over some of these can significantly improve performance. We show that sparse traversal can be exploited by loading only those data actually required. We show also that application control over data loading can be exploited by 1) loading data from alternative storage format (in particular 3-dimensional data stored in sub-cubes), 2) controlling the page size. Both of these techniques effectively reduce the total memory required by visualization at run-time. We also describe experiments we have done on remote out-of-core visualization (when pages are read by demand from remote disk) whose results are promising.
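
    The record above argues for application-controlled paging of out-of-core data rather than relying on operating system virtual memory. A minimal sketch of that idea is shown below: an LRU page cache over a large binary volume stored as flat float32 sub-cubes. The file layout, page size, and the streamline iterator in the usage comment are assumptions of this sketch, not the paper's actual implementation.

    ```python
    # Hedged sketch: an application-controlled LRU page cache for out-of-core data.
    from collections import OrderedDict
    import numpy as np

    class PageCache:
        def __init__(self, path, page_shape=(32, 32, 32), max_pages=256):
            self.path, self.page_shape, self.max_pages = path, page_shape, max_pages
            self.pages = OrderedDict()                      # page_id -> ndarray (LRU order)

        def _load(self, page_id):
            count = int(np.prod(self.page_shape))
            offset = page_id * count * np.dtype(np.float32).itemsize
            data = np.fromfile(self.path, dtype=np.float32, count=count, offset=offset)
            return data.reshape(self.page_shape)

        def get(self, page_id):
            if page_id in self.pages:
                self.pages.move_to_end(page_id)             # mark as recently used
            else:
                if len(self.pages) >= self.max_pages:
                    self.pages.popitem(last=False)          # evict least recently used
                self.pages[page_id] = self._load(page_id)
            return self.pages[page_id]

    # A sparse traversal then loads only the pages it actually needs, e.g.:
    #   cache = PageCache("/data/cfd_volume.raw")
    #   for pid in pages_crossed_by_streamline:   # hypothetical iterator
    #       block = cache.get(pid)
    ```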

  1. Experiences in teleoperation of land vehicles

    NASA Technical Reports Server (NTRS)

    Mcgovern, Douglas E.

    1989-01-01

    Teleoperation of land vehicles allows the removal of the operator from the vehicle to a remote location. This can greatly increase operator safety and comfort in applications such as security patrol or military combat. The cost includes system complexity and reduced system performance. All feedback on vehicle performance and on environmental conditions must pass through sensors, a communications channel, and displays. In particular, this requires vision to be transmitted by closed-circuit television with a consequent degradation of information content. Vehicular teleoperation, as a result, places severe demands on the operator. Teleoperated land vehicles have been built and tested by many organizations, including Sandia National Laboratories (SNL). The SNL fleet presently includes eight vehicles of varying capability. These vehicles have been operated using different types of controls, displays, and visual systems. Experimentation studying the effects of vision system characteristics on off-road, remote driving was performed for conditions of fixed camera versus steering-coupled camera and of color versus black and white video display. Additionally, much experience was gained through system demonstrations and hardware development trials. The preliminary experimental findings and the results of the accumulated operational experience are discussed.

  2. Network based sky Brightness Monitor

    NASA Astrophysics Data System (ADS)

    McKenna, Dan; Pulvermacher, R.; Davis, D. R.

    2009-01-01

    We have developed and are currently testing an autonomous 2-channel photometer designed to measure the night sky brightness in the visual wavelengths over a multi-year campaign. The photometer uses a robust silicon sensor filtered with Hoya CM500 glass. The sky brightness is measured every minute at two elevation angles, typically zenith and 20 degrees, to monitor brightness and transparency. The sky brightness monitor consists of two units, the remote photometer and a network interface. Currently these devices use 2.4 GHz transceivers with a free-space range of 100 meters. The remote unit is battery powered with daytime recharging using a solar panel. The network interface transmits the received data via standard POP email protocol. A second version is under development for radio-sensitive areas using an optical fiber for data transmission. We will present the current comparison with the National Park Service sky monitoring camera. We will also discuss the calibration methods used for standardization and temperature compensation. This system is expected to be deployed in the next year and be operated by the International Dark Sky Association SKYMONITOR project.
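
    The record mentions calibration and temperature compensation but does not give the constants or the exact correction used. As a hedged illustration only, a common way to reduce photometer counts to sky brightness is a Pogson-style zero point plus a linear temperature term; every constant below is a placeholder assumption.

    ```python
    # Hedged sketch: counts -> sky brightness (mag/arcsec^2) with placeholder
    # calibration constants; not the instrument's actual calibration.
    import math

    def sky_brightness(counts_per_s, temp_c, zero_point=20.0, k_temp=0.01, t_ref=25.0):
        # Pogson relation with a linear temperature-compensation term
        return zero_point - 2.5 * math.log10(counts_per_s) + k_temp * (temp_c - t_ref)

    print(sky_brightness(counts_per_s=150.0, temp_c=5.0))
    ```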

  3. Remote sensing applied to agriculture: Basic principles, methodology, and applications

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Mendonca, F. J.

    1981-01-01

    The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; reflectance, transmittance, and absorptance of plants; interactions of plants and soils with reflected energy; leaf morphology; and factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.

  4. Visualizing Airborne and Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Bierwirth, Victoria A.

    2011-01-01

    Remote sensing is a process able to provide information about Earth to better understand Earth's processes and assist in monitoring Earth's resources. The Cloud Absorption Radiometer (CAR) is one remote sensing instrument dedicated to the cause of collecting data on anthropogenic influences on Earth as well as assisting scientists in understanding land-surface and atmospheric interactions. Landsat is a satellite program dedicated to collecting repetitive coverage of the continental Earth surfaces in seven regions of the electromagnetic spectrum. Combining these two aircraft and satellite remote sensing instruments will provide a detailed and comprehensive data collection able to provide influential information and improve predictions of changes in the future. This project acquired, interpreted, and created composite images from satellite data acquired from Landsat 4-5 Thematic Mapper (TM) and Landsat 7 Enhanced Thematic Mapper Plus (ETM+). Landsat images were processed for areas covered by CAR during the Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS), Cloud and Land Surface Interaction Campaign (CLASIC), Intercontinental Chemical Transport Experiment-Phase B (INTEX-B), and Southern African Regional Science Initiative (SAFARI) 2000 missions. The acquisition of Landsat data will provide supplemental information to assist in visualizing and interpreting airborne and satellite imagery.
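
    The record describes building composite images from Landsat bands. A minimal sketch of the usual band-stacking step is shown below, assuming the bands are already loaded as 2-D reflectance arrays (GeoTIFF loading omitted); the 2-98 percentile stretch is an illustrative choice, not necessarily what this project used.

    ```python
    # Hedged sketch: a percentile-stretched RGB composite from three Landsat bands.
    import numpy as np

    def stretch(band, low=2, high=98):
        lo, hi = np.nanpercentile(band, [low, high])
        return np.clip((band - lo) / (hi - lo), 0.0, 1.0)

    def composite(red, green, blue):
        """Stack stretched bands into an (rows, cols, 3) RGB image in [0, 1]."""
        return np.dstack([stretch(red), stretch(green), stretch(blue)])

    # For Landsat TM/ETM+ true colour this would be bands 3, 2, 1;
    # for Landsat 8 OLI it would be bands 4, 3, 2.
    ```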

  5. Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software

    NASA Astrophysics Data System (ADS)

    Gowda, P. H.; Moorhead, J.; Brauer, D. K.

    2017-12-01

    Evapotranspiration (ET) is a major component of the hydrologic cycle. ET data are used for a variety of water management and research purposes such as irrigation scheduling, water and crop modeling, streamflow, water availability, and many more. Remote sensing products have been widely used to create spatially representative ET data sets which provide important information from field to regional scales. As UAV capabilities increase, remote sensing use is likely to also increase. For that purpose, scientists at the USDA-ARS research laboratory in Bushland, TX developed the Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software. The BEARS software is a Java based software that allows users to process remote sensing data to generate ET outputs using predefined models, or enter custom equations and models. The capability to define new equations and build new models expands the applicability of the BEARS software beyond ET mapping to any remote sensing application. The software also includes an image viewing tool that allows users to visualize outputs, as well as draw an area of interest using various shapes. This software is freely available from the USDA-ARS Conservation and Production Research Laboratory website.

  6. Remote video assessment for missile launch facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, G.G.; Stewart, W.A.

    1995-07-01

    The widely dispersed, unmanned launch facilities (LFs) for land-based ICBMs (intercontinental ballistic missiles) currently do not have visual assessment capability for existing intrusion alarms. The security response force currently must assess each alarm on-site. Remote assessment will enhance manpower, safety, and security efforts. Sandia National Laboratories was tasked by the USAF Electronic Systems Center to research, recommend, and demonstrate a cost-effective remote video assessment capability at missile LFs. The project's charter was to provide: system concepts; market survey analysis; technology search recommendations; and operational hardware demonstrations for remote video assessment from a missile LF to a remote security center via a cost-effective transmission medium and without using visible, on-site lighting. The technical challenges of this project were to: analyze various video transmission media and emphasize using the existing missile system copper line, which can be as long as 30 miles; accentuate an extremely low-cost system because of the many sites requiring system installation; integrate the video assessment system with the current LF alarm system; and provide video assessment at the remote sites with non-visible lighting.

  7. A Responsive Client for Distributed Visualization

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

    As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals to perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick client visualization scheme [1] to develop a client-centric distributed application that utilizes remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics for all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider range of WATT services, as well as performance bottlenecks. [1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV'06) (2006). [2] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005.

  8. Automated detection of snow avalanche deposits: segmentation and classification of optical remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Lato, M. J.; Frauenfelder, R.; Bühler, Y.

    2012-09-01

    Snow avalanches in mountainous areas pose a significant threat to infrastructure (roads, railways, energy transmission corridors), personal property (homes) and recreational areas, as well as to the lives of people living and moving in alpine terrain. The impacts of snow avalanches range from delays and financial loss through road and railway closures, destruction of property and infrastructure, to loss of life. Avalanche warnings today are mainly based on meteorological information, snow pack information, field observations, historically recorded avalanche events as well as experience and expert knowledge. The ability to automatically identify snow avalanches using Very High Resolution (VHR) optical remote sensing imagery has the potential to assist in the development of accurate, spatially widespread, detailed maps of zones prone to avalanches as well as to build up databases of past avalanche events in poorly accessible regions. This would provide decision makers with improved knowledge of the frequency and size distributions of avalanches in such areas. We used an object-oriented image interpretation approach, which employs segmentation and classification methodologies, to detect recent snow avalanche deposits within VHR panchromatic optical remote sensing imagery. This produces avalanche deposit maps, which can be integrated with other spatial mapping and terrain data. The object-oriented approach has been tested and validated against manually generated maps in which avalanches are visually recognized and digitized. The accuracies (both user's and producer's) are over 0.9, with errors of commission less than 0.05. Future research is directed to widespread testing of the algorithm on data generated by various sensors and improvement of the algorithm in high noise regions, as well as the mapping of avalanche paths alongside their deposits.
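
    As a toy stand-in for the segmentation-and-classification step described above (the authors used a full object-oriented image-analysis workflow, which the abstract does not spell out), the sketch below segments bright, rough-textured regions of a panchromatic scene and keeps sufficiently large objects; every threshold is an illustrative assumption.

    ```python
    # Hedged sketch: brightness + local-texture segmentation with size filtering.
    import numpy as np
    from scipy import ndimage

    def candidate_deposits(pan, bright_thresh=0.8, texture_thresh=0.05, min_pixels=50):
        texture = ndimage.generic_filter(pan, np.std, size=5)     # local roughness
        mask = (pan > bright_thresh) & (texture > texture_thresh)
        labels, n = ndimage.label(mask)                           # connected objects
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_pixels))
        return keep                                               # boolean deposit mask
    ```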

  9. Visual Prediction of Rover Slip: Learning Algorithms and Field Experiments

    DTIC Science & Technology

    2008-01-01

    Remote slip prediction will enable safe traversals on large slopes covered with sand, drift material or loose crater ejecta, and can support study of terrain shaped by aqueous processes, e.g., mineral-rich outcrops which imply exposure to water, putative lake formations or shorelines, and layered deposits. (Only fragments of the report documentation page and abstract were recovered for this record.)

  10. Visualization Software for VisIT Java Client

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Billings, Jay Jay; Smith, Robert W

    The VisIT Java Client (JVC) library is a lightweight thin client designed and written purely in native Java (the Python and JavaScript versions of the library follow the same concept). It communicates with any unmodified standalone version of VisIT, a high-performance parallel visualization toolkit, over traditional or web sockets, and dynamically determines the capabilities of the running VisIT instance, whether local or remote.

  11. A micro-vibration generated method for testing the imaging quality on ground of space remote sensing

    NASA Astrophysics Data System (ADS)

    Gu, Yingying; Wang, Li; Wu, Qingwen

    2018-03-01

    In this paper, a novel method is proposed which can simulate satellite platform micro-vibration and test the impact of satellite micro-vibration on the imaging quality of a space optical remote sensor on the ground. The method can generate the micro-vibration of a satellite platform in orbit in terms of vibrational degrees of freedom, spectrum, magnitude, and coupling path. Experiment results show that the relative error of acceleration control is within 7% at frequencies from 7 Hz to 40 Hz. Utilizing this method, a system-level test of the micro-vibration impact on the imaging quality of a space optical remote sensor can be realized. This method has important applications in testing the micro-vibration tolerance margin of an optical remote sensor, verifying the vibration isolation and suppression performance of an optical remote sensor, and exploring the principle of micro-vibration impact on the imaging quality of an optical remote sensor.

  12. New developments in super-resolution for GaoFen-4

    NASA Astrophysics Data System (ADS)

    Li, Feng; Fu, Jie; Xin, Lei; Liu, Yuhong; Liu, Zhijia

    2017-10-01

    In this paper, the application of super resolution (SR, restoring a high spatial resolution image from a series of low resolution images of the same scene) techniques to remote sensing images from GaoFen(GF)-4, the most advanced geostationary-orbit earth observing satellite in China, is investigated and tested. SR has been a hot research area for decades, but one of the barriers to applying SR in the remote sensing community is the time slot between low resolution (LR) image acquisitions. In general, the longer the time slot, the less reliable the reconstruction. GF-4 has the unique advantage of capturing a sequence of LR images of the same region in minutes, i.e. working as a staring camera from the point of view of SR. This is the first experiment applying super resolution to a sequence of low resolution images captured by GF-4 within a short time period. In this paper, we use Maximum a Posteriori (MAP) estimation to solve the ill-conditioned problem of SR. Both the wavelet transform and the curvelet transform are used to set up a sparse prior for remote sensing images. By combining several images of both the Beijing and Dunhuang regions captured by GF-4, our method can improve spatial resolution both visually and numerically. Experimental tests show that much detail that cannot be observed in the captured LR images can be seen in the super-resolved high resolution (HR) images. To aid the evaluation, Google Earth imagery can also be referenced. Moreover, our experimental tests also show that the higher the temporal resolution, the better the HR images can be resolved. The study illustrates that the application of SR to geostationary-orbit earth observation data is very feasible and worthwhile, and it holds potential for all other geostationary-orbit earth observing systems.
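
    The record combines a MAP formulation with a sparse wavelet/curvelet prior. Below is a heavily simplified sketch of that style of reconstruction (ISTA-like data-fidelity steps plus wavelet soft-thresholding via PyWavelets); it assumes the LR frames are already registered and related to the HR image by plain integer decimation with no blur, which is not the paper's full imaging model, and it omits the curvelet prior.

    ```python
    # Hedged sketch: simplified MAP-style super-resolution with a wavelet sparsity prior.
    import numpy as np
    import pywt

    def soft_threshold_wavelet(x, wavelet="db4", level=3, thresh=0.01):
        coeffs = pywt.wavedec2(x, wavelet, level=level)
        new = [coeffs[0]] + [tuple(pywt.threshold(c, thresh, mode="soft")
                                   for c in detail) for detail in coeffs[1:]]
        return pywt.waverec2(new, wavelet)

    def map_super_resolve(lr_frames, scale=2, n_iter=50, step=0.5):
        """lr_frames: list of registered (h, w) arrays; returns an (h*scale, w*scale) estimate."""
        h, w = lr_frames[0].shape
        x = np.kron(np.mean(lr_frames, axis=0), np.ones((scale, scale)))  # initial HR guess
        for _ in range(n_iter):
            grad = np.zeros_like(x)
            for y in lr_frames:
                resid = x[::scale, ::scale] - y          # D x - y_k (decimation residual)
                up = np.zeros_like(x)
                up[::scale, ::scale] = resid             # adjoint of decimation
                grad += up
            x = x - step * grad / len(lr_frames)         # data-fidelity step
            x = soft_threshold_wavelet(x)[:h * scale, :w * scale]  # sparsity (prior) step
        return x
    ```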

  13. Programming (Tips) for Physicists & Engineers

    ScienceCinema

    Ozcan, Erkcan

    2018-02-19

    Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  14. Programming (Tips) for Physicists & Engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozcan, Erkcan

    2010-07-13

    Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  15. Photo-realistic Terrain Modeling and Visualization for Mars Exploration Rover Science Operations

    NASA Technical Reports Server (NTRS)

    Edwards, Laurence; Sims, Michael; Kunz, Clayton; Lees, David; Bowman, Judd

    2005-01-01

    Modern NASA planetary exploration missions employ complex systems of hardware and software managed by large teams of engineers and scientists in order to study remote environments. The most complex and successful of these recent projects is the Mars Exploration Rover (MER) mission. The Computational Sciences Division at NASA Ames Research Center delivered a 3D visualization program, Viz, to the MER mission that provides an immersive, interactive environment for science analysis of the remote planetary surface. In addition, Ames provided the Athena Science Team with high-quality terrain reconstructions generated with the Ames Stereo-pipeline. The on-site support team for these software systems responded to unanticipated opportunities to generate 3D terrain models during the primary MER mission. This paper describes Viz, the Stereo-pipeline, and the experiences of the on-site team supporting the scientists at JPL during the primary MER mission.

  16. Cross-media color reproduction using the frequency-based spatial gamut mapping algorithm based on human color vision

    NASA Astrophysics Data System (ADS)

    Wu, Guangyuan; Niu, Shijun; Li, Xiaozhou; Hu, Guichun

    2018-04-01

    Due to the increasing globalization of the printing industry, remote proofing will become an inevitable development trend. Cross-media color reproduction using remote proofing technologies involves different color gamuts, which usually leads to the problem of gamut incompatibility. In this paper, to achieve equivalent color reproduction between a monitor and a printer, a frequency-based spatial gamut mapping algorithm is proposed to decrease the loss of visual color information. The design of the algorithm is based on the contrast sensitivity functions (CSF); it exploits a CSF spatial filter to preserve luminance at the high spatial frequencies and chrominance at the low frequencies. First we show a general framework for applying the CSF spatial filter to retain relevant visual information. Then we compare the proposed framework with the HPMINDE and CUSP algorithms and with Bala's algorithm. The psychophysical experimental results indicated the good performance of the proposed algorithm.
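
    The frequency split described above (keep luminance detail at high spatial frequencies, keep chrominance only at low frequencies) can be sketched as below. This is an illustration only: it uses a Gaussian low-pass as a crude stand-in for a true CSF-derived filter, assumes the image is already in a luminance-chrominance space (L, a, b planes), and takes `clip_to_gamut` as a caller-supplied placeholder for whatever point-wise gamut mapping is applied.

    ```python
    # Hedged sketch: CSF-inspired frequency split for spatial gamut mapping.
    from scipy.ndimage import gaussian_filter

    def csf_guided_map(L, a, b, clip_to_gamut, sigma=3.0):
        L_low = gaussian_filter(L, sigma)
        L_high = L - L_low                         # luminance detail to preserve
        a_low = gaussian_filter(a, sigma)          # keep only low-frequency chroma
        b_low = gaussian_filter(b, sigma)
        Lm, am, bm = clip_to_gamut(L_low, a_low, b_low)   # map the smooth base image
        return Lm + L_high, am, bm                 # re-add the luminance detail
    ```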

  17. Finding the optical axis of a distant object using an optical alignment system based on a holographic marker

    NASA Astrophysics Data System (ADS)

    Zhuk, D. I.; Denisyuk, I. Yu.; Gutner, I. E.

    2015-07-01

    A way to construct a holographic indicator of the position of the central axis of a distant object, based on recording a transmission hologram in a layer of photosensitive material and forming a remote real image in front of a light source, is considered. A light source with a holographically formed marker is designed for visual guidance along the object axis; it can be used to simplify aircraft landing on a glide path, preliminary visual alignment of large coaxial parts of various machines, etc. Specific features of the scheme of recording a holographic marker and the reconstruction of its image are considered. The possibility of forming a remote holographic image marker, which can be aligned with a simultaneously operating reference laser system for determining the direction to an object and its optical axis, has been demonstrated experimentally.

  18. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  19. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  20. Virtual Interactive Presence in Global Surgical Education: International Collaboration Through Augmented Reality.

    PubMed

    Davis, Matthew Christopher; Can, Dang D; Pindrik, Jonathan; Rocque, Brandon G; Johnston, James M

    2016-02-01

    Technology allowing a remote, experienced surgeon to provide real-time guidance to local surgeons has great potential for training and capacity building in medical centers worldwide. Virtual interactive presence and augmented reality (VIPAR), an iPad-based tool, allows surgeons to provide long-distance, virtual assistance wherever a wireless internet connection is available. Local and remote surgeons view a composite image of video feeds at each station, allowing for intraoperative telecollaboration in real time. Local and remote stations were established in Ho Chi Minh City, Vietnam, and Birmingham, Alabama, as part of ongoing neurosurgical collaboration. Endoscopic third ventriculostomy with choroid plexus coagulation with VIPAR was used for subjective and objective evaluation of system performance. VIPAR allowed both surgeons to engage in complex visual and verbal communication during the procedure. Analysis of 5 video clips revealed video delay of 237 milliseconds (range, 93-391 milliseconds) relative to the audio signal. Excellent image resolution allowed the remote neurosurgeon to visualize all critical anatomy. The remote neurosurgeon could gesture to structures with no detectable difference in accuracy between stations, allowing for submillimeter precision. Fifteen endoscopic third ventriculostomy with choroid plexus coagulation procedures have been performed with the use of VIPAR between Vietnam and the United States, with no significant complications. 80% of these patients remain shunt-free. Evolving technologies that allow long-distance, intraoperative guidance, and knowledge transfer hold great potential for highly efficient international neurosurgical education. VIPAR is one example of an inexpensive, scalable platform for increasing global neurosurgical capacity. Efforts to create a network of Vietnamese neurosurgeons who use VIPAR for collaboration are underway. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Neural networks for satellite remote sensing and robotic sensor interpretation

    NASA Astrophysics Data System (ADS)

    Martens, Siegfried

    Remote sensing of forests and robotic sensor fusion can be viewed, in part, as supervised learning problems, mapping from sensory input to perceptual output. This dissertation develops ARTMAP neural networks for real-time category learning, pattern recognition, and prediction tailored to remote sensing and robotics applications. Three studies are presented. The first two use ARTMAP to create maps from remotely sensed data, while the third uses an ARTMAP system for sensor fusion on a mobile robot. The first study uses ARTMAP to predict vegetation mixtures in the Plumas National Forest based on spectral data from the Landsat Thematic Mapper satellite. While most previous ARTMAP systems have predicted discrete output classes, this project develops new capabilities for multi-valued prediction. On the mixture prediction task, the new network is shown to perform better than maximum likelihood and linear mixture models. The second remote sensing study uses an ARTMAP classification system to evaluate the relative importance of spectral and terrain data for map-making. This project has produced a large-scale map of remotely sensed vegetation in the Sierra National Forest. Network predictions are validated with ground truth data, and maps produced using the ARTMAP system are compared to a map produced by human experts. The ARTMAP Sierra map was generated in an afternoon, while the labor intensive expert method required nearly a year to perform the same task. The robotics research uses an ARTMAP system to integrate visual information and ultrasonic sensory information on a B14 mobile robot. The goal is to produce a more accurate measure of distance than is provided by the raw sensors. ARTMAP effectively combines sensory sources both within and between modalities. The improved distance percept is used to produce occupancy grid visualizations of the robot's environment. The maps produced point to specific problems of raw sensory information processing and demonstrate the benefits of using a neural network system for sensor fusion.

  2. Make it fun for everyone: visualization techniques in geoscience

    NASA Astrophysics Data System (ADS)

    Portnov, A.; Sojtaric, M.

    2017-12-01

    We live on a planet that mostly consists of oceans, but most people cannot picture what the surface and subsurface of the ocean floor look like. Marine geophysics has traditionally been difficult to explain to the general public, as most of what we do happens beyond the visual realm of an average audience. However, recent advances in 3D visualization of scientific data provide one of the tools we can employ to better explain complex systems through gripping visual content. Coupled with a narrative approach, this type of visualization can open up a whole new and relatively little known world of science to the general public. Up-to-date remote-sensing methods provide unique data on the seabed surface and subsurface all over the planet. Modern software can present these data in a spectacular way and with great scientific accuracy, making them attractive both for specialists and non-specialists in geoscience. As an example, we present several visualizations which, in a simple way, tell stories of research in remote parts of the world, such as the Arctic regions and the deep ocean in the Gulf of Mexico. Diverse datasets (multibeam echosounding, hydrographic surveys, and seismic and borehole data) are put together to build a precisely geo-referenced environment, showing the complexity of geological processes on our planet. Some of the data were collected 10-15 years ago, but have acquired new life with the help of new data visualization techniques. Every digital object with assigned coordinates, including 2D pictures and 3D models, may become part of this virtual geologic environment, limiting the potential of geo-visualization only by the imagination of a scientist. The presented videos have a clear scientific focus on marine geology and geophysics, since the data were collected by several research and petroleum organizations specialized in this field. The stories that we tell in this way may, for example, provide the public with further insight into the complexities surrounding natural subsea gas storage and release.

  3. Video-based eye tracking for neuropsychiatric assessment.

    PubMed

    Adhikari, Sam; Stark, David E

    2017-01-01

    This paper presents a video-based eye-tracking method, ideally deployed via a mobile device or laptop-based webcam, as a tool for measuring brain function. Eye movements and pupillary motility are tightly regulated by brain circuits, are subtly perturbed by many disease states, and are measurable using video-based methods. Quantitative measurement of eye movement by readily available webcams may enable early detection and diagnosis, as well as remote/serial monitoring, of neurological and neuropsychiatric disorders. We successfully extracted computational and semantic features for 14 testing sessions, comprising 42 individual video blocks and approximately 17,000 image frames generated across several days of testing. Here, we demonstrate the feasibility of collecting video-based eye-tracking data from a standard webcam in order to assess psychomotor function. Furthermore, we were able to demonstrate through systematic analysis of this data set that eye-tracking features (in particular, radial and tangential variance on a circular visual-tracking paradigm) predict performance on well-validated psychomotor tests. © 2017 New York Academy of Sciences.
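
    The record names "radial and tangential variance on a circular visual-tracking paradigm" as predictive features. A minimal sketch of how such features could be computed from 2-D gaze samples is shown below; estimating the circle centre as the mean gaze position, and the specific definition of tangential drift, are assumptions of this illustration rather than the authors' exact feature extraction.

    ```python
    # Hedged sketch: radial and tangential variance of gaze during circular tracking.
    import numpy as np

    def circular_tracking_features(gaze_xy):
        """gaze_xy: (n_samples, 2) gaze positions in screen coordinates."""
        centred = gaze_xy - gaze_xy.mean(axis=0)
        radius = np.linalg.norm(centred, axis=1)           # distance from estimated centre
        angle = np.arctan2(centred[:, 1], centred[:, 0])   # angular position on the circle
        radial_var = np.var(radius)                        # wobble toward/away from centre
        unwrapped = np.unwrap(angle)                       # monotonic sweep angle
        drift = unwrapped - np.linspace(unwrapped[0], unwrapped[-1], len(unwrapped))
        tangential_var = np.var(drift * radius.mean())     # arc-length deviation from a uniform sweep
        return radial_var, tangential_var
    ```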

  4. Remote distractor effects and saccadic inhibition: spatial and temporal modulation.

    PubMed

    Walker, Robin; Benson, Valerie

    2013-09-12

    The onset of a visual distractor remote from a saccade target is known to increase saccade latency (the remote distractor effect [RDE]). In addition, distractors may also selectively inhibit saccades that would be initiated about 90 ms after distractor onset (termed saccadic inhibition [SI]). Recently, it has been proposed that the transitory inhibition of saccades (SI) may underlie the increase in mean latency (RDE). In a first experiment, the distractor eccentricity was manipulated, and a robust RDE that was strongly modulated by distractor eccentricity was observed. However, the underlying latency distributions did not reveal clear evidence of SI. A second experiment manipulated distractor spatial location and the timing of the distractor onset in relation to the target. An RDE was again observed with remote distractors away from the target axis and under conditions with early-onset distractors that would be unlikely to produce SI, whereas later distractor onsets produced an RDE along with some evidence of an SI effect. A third experiment using a mixed block of target-distractor stimulus-onset asynchronies (SOAs) revealed an RDE that varied with both distractor eccentricity and SOA and changes to latency distributions consistent with the timing of SI. We argue that the notion that SI underpins the RDE is similar to the earlier argument that express saccades underlie the fixation offset (gap) effect and that changes in mean latency and to the shape of the underlying latency distributions following a visual onset may involve more than one inhibitory process.

  5. Water environmental management with the aid of remote sensing and GIS technology

    NASA Astrophysics Data System (ADS)

    Chen, Xiaoling; Yuan, Zhongzhi; Li, Yok-Sheung; Song, Hong; Hou, Yingzi; Xu, Zhanhua; Liu, Honghua; Wai, Onyx W.

    2005-01-01

    The water environment is associated with many disciplinary fields, including science and management, which makes it difficult to study. Timely observation, data acquisition and analysis of the water environment are very important for decision makers, who play an important role in maintaining sustainable development. This study focused on developing a platform for water environment management based on remote sensing and GIS technology, whose main target is to provide the necessary information on the water environment through spatial analysis and visual display in a suitable way. The work focused on three points. The first is related to technical issues of spatial data organization and communication with a combination of GIS and statistical software; a data-related model was proposed to solve the data communication between the mentioned systems. The second is spatio-temporal analysis based on remote sensing and GIS; the water quality parameters of suspended sediment concentration and BOD5 were specifically analyzed in this case, and the results quantitatively suggested a clear spatial influence of land-source pollution. The third is 3D visualization of surface features based on RS and GIS technology. The Pearl River estuary and Hong Kong's coastal waters in the South China Sea were taken as the case study. The software ArcGIS was taken as the basic platform to develop the water environmental management system. Water quality sampling data from 76 monitoring stations in coastal water bodies and remotely sensed images were used in this study.

  6. 3-dimensional telepresence system for a robotic environment

    DOEpatents

    Anderson, Matthew O.; McKay, Mark D.

    2000-01-01

    A telepresence system includes a camera pair remotely controlled by a control module affixed to an operator. The camera pair provides for three dimensional viewing and the control module, affixed to the operator, affords hands-free operation of the camera pair. In one embodiment, the control module is affixed to the head of the operator and an initial position is established. A triangulating device is provided to track the head movement of the operator relative to the initial position. A processor module receives input from the triangulating device to determine where the operator has moved relative to the initial position and moves the camera pair in response thereto. The movement of the camera pair is predetermined by a software map having a plurality of operation zones. Each zone therein corresponds to unique camera movement parameters such as speed of movement. Speed parameters include constant speed, or increasing or decreasing. Other parameters include pan, tilt, slide, raise or lowering of the cameras. Other user interface devices are provided to improve the three dimensional control capabilities of an operator in a local operating environment. Such other devices include a pair of visual display glasses, a microphone and a remote actuator. The pair of visual display glasses are provided to facilitate three dimensional viewing, hence depth perception. The microphone affords hands-free camera movement by utilizing voice commands. The actuator allows the operator to remotely control various robotic mechanisms in the remote operating environment.
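
    The patent record above describes a "software map" of operation zones that converts head displacement from the initial position into camera motion at different speeds. A minimal sketch of that mapping idea is shown below; the zone boundaries, rates and the single-axis (pan) simplification are illustrative assumptions, not the patent's values.

    ```python
    # Hedged sketch: head offset -> camera pan rate through a few operation zones.
    def pan_rate(head_offset_deg):
        a = abs(head_offset_deg)
        sign = 1.0 if head_offset_deg >= 0 else -1.0
        if a < 2.0:                  # dead zone: ignore small head jitter
            return 0.0
        if a < 10.0:                 # constant-speed zone
            return sign * 5.0        # degrees of camera pan per second
        return sign * min(5.0 + 2.0 * (a - 10.0), 30.0)   # increasing-speed zone, capped

    for offset in (1.0, 5.0, -15.0, 40.0):
        print(offset, pan_rate(offset))
    ```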

  7. Absolute Depth Sensitivity in Cat Primary Visual Cortex under Natural Viewing Conditions.

    PubMed

    Pigarev, Ivan N; Levichkina, Ekaterina V

    2016-01-01

    Mechanisms of 3D perception, investigated in many laboratories, have defined depth either relative to the fixation plane or to other objects in the visual scene. It is obvious that for efficient perception of the 3D world, additional mechanisms of depth constancy could operate in the visual system to provide information about absolute distance. Neurons with properties reflecting some features of depth constancy have been described in the parietal and extrastriate occipital cortical areas. It has also been shown that, for some neurons in the visual area V1, responses to stimuli of constant angular size differ at close and remote distances. The present study was designed to investigate whether, in natural free gaze viewing conditions, neurons tuned to absolute depths can be found in the primary visual cortex (area V1). Single-unit extracellular activity was recorded from the visual cortex of waking cats sitting on a trolley in front of a large screen. The trolley was slowly approaching the visual scene, which consisted of stationary sinusoidal gratings of optimal orientation rear-projected over the whole surface of the screen. Each neuron was tested with two gratings, with spatial frequency of one grating being twice as high as that of the other. Assuming that a cell is tuned to a spatial frequency, its maximum response to the grating with a spatial frequency twice as high should be shifted to a distance half way closer to the screen in order to attain the same size of retinal projection. For hypothetical neurons selective to absolute depth, location of the maximum response should remain at the same distance irrespective of the type of stimulus. It was found that about 20% of neurons in our experimental paradigm demonstrated sensitivity to particular distances independently of the spatial frequencies of the gratings. We interpret these findings as an indication of the use of absolute depth information in the primary visual cortex.
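
    The stimulus logic above (a grating with twice the screen spatial frequency produces the same retinal projection at half the distance) follows from simple small-angle geometry. The short worked example below is only an illustration of that geometry; the numbers are arbitrary.

    ```python
    # Hedged sketch: under the small-angle approximation, a grating of f cycles/metre
    # on the screen viewed from d metres subtends about f * d * pi / 180 cycles per
    # degree, so doubling f halves the distance at which a given retinal frequency occurs.
    import math

    def cycles_per_degree(f_screen, distance_m):
        return f_screen * distance_m * math.pi / 180.0

    f = 10.0                                   # cycles per metre on the screen
    print(cycles_per_degree(f, 2.0))           # grating 1 viewed at 2 m
    print(cycles_per_degree(2 * f, 1.0))       # grating 2 (twice the frequency) at 1 m: same value
    ```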

  8. A Nonlinear Model for Interactive Data Analysis and Visualization and an Implementation Using Progressive Computation for Massive Remote Climate Data Ensembles

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Liu, S.; Scorzelli, G.; Lee, J. W.; Bremer, P. T.; Summa, B.; Pascucci, V.

    2017-12-01

    The creation, distribution, analysis, and visualization of large spatiotemporal datasets is a growing challenge for the study of climate and weather phenomena in which increasingly massive domains are utilized to resolve finer features, resulting in datasets that are simply too large to be effectively shared. Existing workflows typically consist of pipelines of independent processes that preclude many possible optimizations. As data sizes increase, these pipelines are difficult or impossible to execute interactively and instead simply run as large offline batch processes. Rather than limiting our conceptualization of such systems to pipelines (or dataflows), we propose a new model for interactive data analysis and visualization systems in which we comprehensively consider the processes involved from data inception through analysis and visualization in order to describe systems composed of these processes in a manner that facilitates interactive implementations of the entire system rather than of only a particular component. We demonstrate the application of this new model with the implementation of an interactive system that supports progressive execution of arbitrary user scripts for the analysis and visualization of massive, disparately located climate data ensembles. It is currently in operation as part of the Earth System Grid Federation server running at Lawrence Livermore National Lab, and accessible through both web-based and desktop clients. Our system facilitates interactive analysis and visualization of massive remote datasets up to petabytes in size, such as the 3.5 PB 7km NASA GEOS-5 Nature Run simulation, previously only possible offline or at reduced resolution. To support the community, we have enabled general distribution of our application using public frameworks including Docker and Anaconda.

  9. [Constructing images and territories: thinking on the visuality and materiality of remote sensing].

    PubMed

    Monteiro, Marko

    2015-01-01

    This article offers a reflection on the question of the image in science, thinking about how visual practices contribute towards the construction of knowledge and territories. The growing centrality of the visual in current scientific practices shows the need for reflection that goes beyond the image. The object of discussion will be the scientific images used in the monitoring and visualization of territory. The article looks into the relations between visuality and a number of other factors: the researchers that construct it; the infrastructure involved in the construction; and the institutions and policies that monitor the territory. It is argued that such image-relations do not just visualize but help to construct the territory based on specific forms. Exploring this process makes it possible to develop a more complex understanding of the forms through which sciences and technology help to construct realities.

  10. Lifting Scheme DWT Implementation in a Wireless Vision Sensor Network

    NASA Astrophysics Data System (ADS)

    Ong, Jia Jan; Ang, L.-M.; Seng, K. P.

    This paper presents the practical implementation of a Wireless Visual Sensor Network (WVSN) with DWT processing on the visual nodes. A WVSN consists of visual nodes that capture video and transmit it to the base station without processing. Limited network bandwidth restricts real-time video streaming from remote visual nodes over wireless communication. Three layers of DWT filters are implemented to process the image captured from the camera. Once all the wavelet coefficients have been produced, it is possible to transmit only the low-frequency band coefficients and obtain an approximate image at the base station. This reduces the amount of power required for transmission. When necessary, transmitting all the wavelet coefficients reproduces the full image detail, similar to the image captured at the visual node. The visual node combines a CMOS camera, a Xilinx Spartan-3L FPGA and a wireless ZigBee® network based on the Ember EM250 chip.
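
    For context, the lifting scheme mentioned in the title computes the DWT with a few cheap predict/update steps, which is why it suits resource-limited visual nodes. The sketch below shows one level of the integer Le Gall 5/3 lifting transform in 1-D; the abstract does not state which wavelet kernel the nodes implement, so the 5/3 choice (and the even-length, symmetric-extension boundary handling) is an assumption of this illustration.

    ```python
    # Hedged sketch: one lifting level of the reversible 5/3 DWT (JPEG 2000 style).
    import numpy as np

    def lift_53_1d(x):
        """One lifting level; len(x) is assumed even, symmetric boundary extension."""
        x = np.asarray(x, dtype=np.int64)
        even, odd = x[0::2], x[1::2]
        # Predict step: high-pass (detail) coefficients
        d = odd - ((even + np.append(even[1:], even[-1])) >> 1)
        # Update step: low-pass (approximation) coefficients
        s = even + ((np.append(d[0], d[:-1]) + d + 2) >> 2)
        return s, d

    # For an image, apply along rows then columns, and repeat on the LL band for each
    # decomposition level; transmitting only the final LL band gives the approximate
    # preview image described in the record above.
    ```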

  11. AccessScope project: Accessible light microscope for users with upper limb mobility or visual impairments.

    PubMed

    Mansoor, Awais; Ahmed, Wamiq M; Samarapungavan, Ala; Cirillo, John; Schwarte, David; Robinson, J Paul; Duerstock, Bradley S

    2010-01-01

    A web-based application was developed to remotely view slide specimens and control all functions of a research-level light microscopy workstation, called AccessScope. Students and scientists with upper limb mobility and visual impairments are often unable to use a light microscope by themselves and must depend on others in its operation. Users with upper limb mobility impairments and low vision were recruited to assist in the design process of the AccessScope personal computer (PC) user interface. Participants with these disabilities were evaluated in their ability to use AccessScope to perform microscopical tasks. AccessScope usage was compared with inspecting prescanned slide images by grading participants' identification and understanding of histological features and knowledge of microscope operation. With AccessScope subjects were able to independently perform common light microscopy functions through an Internet browser by employing different PC pointing devices or accessibility software according to individual abilities. Subjects answered more histology and microscope usage questions correctly after first participating in an AccessScope test session. AccessScope allowed users with upper limb or visual impairments to successfully perform light microscopy without assistance. This unprecedented capability is crucial for students and scientists with disabilities to perform laboratory coursework or microscope-based research and pursue science, technology, engineering, and mathematics fields.

  12. Web tools for effective retrieval, visualization, and evaluation of cardiology medical images and records

    NASA Astrophysics Data System (ADS)

    Masseroli, Marco; Pinciroli, Francesco

    2000-12-01

    To provide easy retrieval, integration and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser that provides a friendly medical user interface to perform queries on patient and medical test data and to integrate and properly visualize the various query results. A set of tools based on the Java Advanced Imaging API enables processing and analysis of the retrieved cardiology images and quantification of their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage in a centralized form and to provide at each site where an intranet or internet connection is available. By giving healthcare providers effective tools for comprehensively querying, visualizing and evaluating cardiology medical images and records in all locations where they may need them (i.e. emergency rooms, operating theaters, wards, or even outpatient clinics), the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.

  13. Using NASA's Giovanni Web Portal to Access and Visualize Satellite-Based Earth Science Data in the Classroom

    NASA Astrophysics Data System (ADS)

    Lloyd, S. A.; Acker, J. G.; Prados, A. I.; Leptoukh, G. G.

    2008-12-01

    One of the biggest obstacles for the average Earth science student today is locating and obtaining satellite-based remote sensing datasets in a format that is accessible and optimal for their data analysis needs. At the Goddard Earth Sciences Data and Information Services Center (GES-DISC) alone, on the order of hundreds of Terabytes of data are available for distribution to scientists, students and the general public. The single biggest and time-consuming hurdle for most students when they begin their study of the various datasets is how to slog through this mountain of data to arrive at a properly sub-setted and manageable dataset to answer their science question(s). The GES DISC provides a number of tools for data access and visualization, including the Google-like Mirador search engine and the powerful GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) web interface. Giovanni provides a simple way to visualize, analyze and access vast amounts of satellite-based Earth science data. Giovanni's features and practical examples of its use will be demonstrated, with an emphasis on how satellite remote sensing can help students understand recent events in the atmosphere and biosphere. Giovanni is actually a series of sixteen similar web-based data interfaces, each of which covers a single satellite dataset (such as TRMM, TOMS, OMI, AIRS, MLS, HALOE, etc.) or a group of related datasets (such as MODIS and MISR for aerosols, SeaWIFS and MODIS for ocean color, and the suite of A-Train observations co-located along the CloudSat orbital path). Recently, ground-based datasets have been included in Giovanni, including the Northern Eurasian Earth Science Partnership Initiative (NEESPI), and EPA fine particulate matter (PM2.5) for air quality. Model data such as the Goddard GOCART model and MERRA meteorological reanalyses (in process) are being increasingly incorporated into Giovanni to facilitate model-data intercomparison. A full suite of data analysis and visualization tools is also available within Giovanni. The GES DISC is currently developing a systematic series of training modules for Earth science satellite data, associated with our development of additional datasets and data visualization tools for Giovanni. Training sessions will include an overview of the Earth science datasets archived at Goddard, an overview of terms and techniques associated with satellite remote sensing, dataset-specific issues, an overview of Giovanni functionality, and a series of examples of how data can be readily accessed and visualized.

  14. Preliminary study of near surface detections at geothermal field using optic and SAR imageries

    NASA Astrophysics Data System (ADS)

    Kurniawahidayati, Beta; Agoes Nugroho, Indra; Syahputra Mulyana, Reza; Saepuloh, Asep

    2017-12-01

    Current remote sensing technologies show that surface manifestations of a geothermal system can be detected with optical and SAR remote sensing, but assessing targets beneath the near-surface layer with these surficial methods requires further study. This study presents preliminary results of using optical and SAR remote sensing imagery to detect near-surface geothermal manifestations at and around Mt. Papandayan, West Java, Indonesia. The data used in this study were Landsat-8 OLI/TIRS imagery for delineating the geothermal manifestation prospect area and Advanced Land Observing Satellite (ALOS) Phased Array type L-band Synthetic Aperture Radar (PALSAR) level 1.1 data for extracting lineaments and their density. We assume that the lineaments correlate with near-surface structures because of the long L-band wavelength of about 23.6 cm. Near-surface manifestation prospect areas were delineated through visual comparison of the Landsat 8 RGB True Colour Composite of bands 4, 3, 2 (TCC), the False Colour Composite of bands 5, 6, 7 (FCC), and the lineament density map derived from ALOS PALSAR. The visual properties of ground objects were distinguished from the interaction of electromagnetic radiation with the objects, i.e. whether they reflect, scatter, absorb, or emit radiation, depending on their molecular composition and their macroscopic scale and geometry. The TCC and FCC composites produced 6 and 7 surface manifestation zones, respectively, according to their visual classification. The classified images were then compared to a Normalized Difference Vegetation Index (NDVI) to assess the influence of surface vegetation on the images, and geothermal areas were classified based on the NDVI vegetation index. The TCC image is more sensitive to vegetation than the FCC image; the latter composite produced a better result for visually identifying geothermal manifestations, as shown by the more detailed zones it detected. According to the lineament density analysis, the high-density area is located at the peak of Papandayan and overlaps zones 1 and 2 of the FCC. Comparing with the extracted lineament density, we interpret that the near-surface manifestation is located in zones 1 and 2 of the FCC image.
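    NDVI itself is a simple band ratio, (NIR - Red) / (NIR + Red); as a minimal illustrative sketch (assuming the Landsat-8 OLI band 5 near-infrared and band 4 red reflectances have already been loaded as NumPy arrays, a step not shown here), it could be computed as follows.

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-10):
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / (nir + red + eps)

    # Synthetic reflectance arrays standing in for Landsat-8 OLI band 5 (NIR)
    # and band 4 (red); values near +1 indicate dense vegetation.
    nir_band = np.array([[0.45, 0.30], [0.12, 0.50]])
    red_band = np.array([[0.10, 0.08], [0.11, 0.05]])
    print(ndvi(nir_band, red_band))
    ```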

  15. Reduced Distractibility in a Remote Culture

    PubMed Central

    de Fockert, Jan W.; Caparos, Serge; Linnell, Karina J.; Davidoff, Jules

    2011-01-01

    Background In visual processing, there are marked cultural differences in the tendency to adopt either a global or local processing style. A remote culture (the Himba) has recently been reported to have a greater local bias in visual processing than Westerners. Here we give the first evidence that a greater, and remarkable, attentional selectivity provides the basis for this local bias. Methodology/Principal Findings In Experiment 1, Eriksen-type flanker interference was measured in the Himba and in Western controls. In both groups, responses to the direction of a task-relevant target arrow were affected by the compatibility of task-irrelevant distractor arrows. However, the Himba showed a marked reduction in overall flanker interference compared to Westerners. The smaller interference effect in the Himba occurred despite their overall slower performance than Westerners, and was evident even at a low level of perceptual load of the displays. In Experiment 2, the attentional selectivity of the Himba was further demonstrated by showing that their attention was not even captured by a moving singleton distractor. Conclusions/Significance We argue that the reduced distractibility in the Himba is clearly consistent with their tendency to prioritize the analysis of local details in visual processing. PMID:22046275

  16. Remote Control and Monitoring of VLBI Experiments by Smartphones

    NASA Astrophysics Data System (ADS)

    Ruztort, C. H.; Hase, H.; Zapata, O.; Pedreros, F.

    2012-12-01

    For the remote control and monitoring of VLBI operations, we developed software optimized for smartphones. This is a new tool based on a client-server architecture with a Web interface optimized for smartphone screens and cellphone networks. The server uses variables of the Field System and its station-specific parameters stored in shared memory. The client, running on the smartphone through a Web interface, analyzes and visualizes the current status of the radio telescope, receiver, schedule, and recorder. In addition, it allows commands to be sent remotely to the Field System computer and displays the log entries. The user has full access to the entire operation process, which is important in emergency cases. The software also integrates a webcam interface.
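    The abstract does not describe the server's interface in detail; the following is only a minimal sketch of a smartphone-facing status endpoint under assumed names (the Flask framework, the /status route, the port, and every field in the stub dictionary are illustrative choices, not the authors' implementation, and the Field System shared-memory read is replaced by a stub).

    ```python
    # Minimal sketch of a status endpoint a smartphone client could poll.
    from flask import Flask, jsonify

    app = Flask(__name__)

    def read_station_status():
        # Stub: in the real system these values would come from the
        # Field System shared memory; all names here are hypothetical.
        return {
            "schedule": "r1234.snp",
            "source": "0059+581",
            "antenna": {"az_deg": 132.4, "el_deg": 47.1, "tracking": True},
            "recorder": {"state": "recording", "rate_gbps": 2.0},
        }

    @app.route("/status")
    def status():
        return jsonify(read_station_status())

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)  # port chosen for illustration only
    ```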

  17. Integrated Remote Sensing Modalities for Classification at a Legacy Test Site

    NASA Astrophysics Data System (ADS)

    Lee, D. J.; Anderson, D.; Craven, J.

    2016-12-01

    Detecting, locating, and characterizing suspected underground nuclear test sites is of interest to the worldwide nonproliferation monitoring community. Remote sensing provides both cultural and surface geological information over a large search area in a non-intrusive manner. We have characterized a legacy nuclear test site at the Nevada National Security Site (NNSS) using an aerial system based on RGB imagery, light detection and ranging, and hyperspectral imaging. We integrate these different remote sensing modalities to perform pattern recognition and classification tasks on the test site. These tasks include detecting cultural artifacts and exotic materials. We evaluate if the integration of different remote sensing modalities improves classification performance.

  18. Mapping Surface Water DOC in the Northern Gulf of Mexico Using CDOM Absorption Coefficients and Remote Sensing Imagery

    NASA Astrophysics Data System (ADS)

    Kelly, B.; Chelsky, A.; Bulygina, E.; Roberts, B. J.

    2017-12-01

    Remote sensing techniques have become valuable tools for researchers, providing the capability to measure and visualize important parameters without the need for time- or resource-intensive sampling trips. Relationships between dissolved organic carbon (DOC), colored dissolved organic matter (CDOM), and spectral data have been used to remotely sense DOC concentrations in riverine systems; however, this approach has not been applied to the northern Gulf of Mexico (GoM) and needs to be tested to determine how accurate these relationships are in riverine-dominated shelf systems. In April, July, and October 2017 we sampled surface water from 80+ sites over an area of 100,000 km2 along the Louisiana-Texas shelf in the northern GoM. DOC concentrations were measured on filtered water samples with a Shimadzu TOC-VCSH analyzer using standard techniques. Additionally, DOC concentrations were estimated from CDOM absorption coefficients of the filtered water samples, measured on a UV-Vis spectrophotometer, using a modification of the methods of Fichot and Benner (2011). These values were regressed against Landsat visible-band spectral data for the same locations to establish a relationship between the spectral data and the CDOM absorption coefficients. This allowed us to spatially map CDOM absorption coefficients in the Gulf of Mexico using the Landsat spectral data in GIS. We then used a multiple linear regression model to derive DOC concentrations from the CDOM absorption coefficients and applied those to our map. This study provides an evaluation of the viability of scaling up CDOM absorption coefficient and remote-sensing derived estimates of DOC concentrations to the scale of the LA-TX shelf ecosystem.
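    As a rough sketch of the two-step statistical workflow described above (visible-band reflectance to CDOM absorption, then CDOM absorption to DOC), assuming matched station values are already in arrays; the numbers, variable names, use of a single absorption coefficient, and choice of ordinary least squares are illustrative placeholders, not the authors' exact procedure (which follows Fichot and Benner, 2011).

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Matched observations (illustrative placeholders): Landsat visible-band
    # reflectances at sampled stations, measured CDOM absorption coefficients,
    # and laboratory DOC concentrations.
    landsat_vis = np.array([[0.021, 0.034, 0.052],   # e.g. bands 2, 3, 4 reflectance
                            [0.018, 0.029, 0.047],
                            [0.025, 0.040, 0.061],
                            [0.015, 0.024, 0.039]])
    a_cdom = np.array([0.85, 0.72, 1.10, 0.55])       # CDOM absorption, m^-1
    doc = np.array([210.0, 185.0, 260.0, 150.0])      # DOC, umol C L^-1

    # Step 1: relate spectral data to CDOM absorption so the absorption
    # coefficient can be mapped across the shelf from imagery alone.
    spec_to_cdom = LinearRegression().fit(landsat_vis, a_cdom)

    # Step 2: regression from CDOM absorption to DOC concentration.
    cdom_to_doc = LinearRegression().fit(a_cdom.reshape(-1, 1), doc)

    # Apply both steps to a new pixel's reflectance vector.
    new_pixel = np.array([[0.020, 0.031, 0.050]])
    est_cdom = spec_to_cdom.predict(new_pixel)
    est_doc = cdom_to_doc.predict(est_cdom.reshape(-1, 1))
    print(est_cdom, est_doc)
    ```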

  19. Remote high-definition rotating video enables fast spatial survey of marine underwater macrofauna and habitats.

    PubMed

    Pelletier, Dominique; Leleu, Kévin; Mallet, Delphine; Mou-Tham, Gérard; Hervé, Gilles; Boureau, Matthieu; Guilpart, Nicolas

    2012-01-01

    Observing spatial and temporal variations of marine biodiversity with non-destructive techniques is central to understanding ecosystem resilience and to monitoring and assessing conservation strategies, e.g. Marine Protected Areas. Observations are generally obtained through Underwater Visual Censuses (UVC) conducted by divers. The problems inherent to the presence of divers have been discussed in several papers. Video techniques are increasingly used for observing underwater macrofauna and habitat. Most video techniques that do not require the presence of a diver use baited remote systems. In this paper, we present an original video technique that relies on an unbaited remote rotating system including a high-definition camera. The system is set on the sea floor to record images, which are then analysed at the office to quantify biotic and abiotic sea-bottom cover and to identify and count fish species and other species such as marine turtles. The technique was extensively tested in a highly diversified coral reef ecosystem in the South Lagoon of New Caledonia, using a protocol covering both protected and unprotected areas in the major lagoon habitats. The technique made it possible to detect and identify a large number of species, in particular fished species, which were not disturbed by the system. Habitat could easily be investigated through the images, and a large number of observations could be carried out per day at sea. This study showed the strong potential of this non-obtrusive technique for observing both macrofauna and habitat. It offers a unique spatial coverage and can be implemented at sea at a reasonable cost by non-expert staff. As such, this technique is particularly interesting for investigating and monitoring coastal biodiversity in the light of current conservation challenges and increasing monitoring needs.

  20. Imaging Fluorescent Combustion Species in Gas Turbine Flame Tubes: On Complexities in Real Systems

    NASA Technical Reports Server (NTRS)

    Hicks, Y. R.; Locke, R. J.; Anderson, R. C.; Zaller, M.; Schock, H. J.

    1997-01-01

    Planar laser-induced fluorescence (PLIF) is used to visualize the flame structure via OH, NO, and fuel imaging in kerosene-burning gas turbine combustor flame tubes. When compared to simple gaseous hydrocarbon flames and hydrogen flames, flame tube testing complexities include spectral interferences from large fuel fragments, unknown turbulence interactions, high pressure operation, and the concomitant need for windows and remote operation. Complications of these and other factors as they apply to image analysis are considered. Because both OH and gas turbine engine fuels (commercial and military) can be excited and detected using OH transition lines, a narrowband and a broadband detection scheme are compared and the benefits and drawbacks of each method are examined.

  1. Desert Research and Technology Studies (RATS) Local and Remote Test Sites

    NASA Technical Reports Server (NTRS)

    Janoiko, Barbara; Kosmo, Joseph; Eppler, Dean

    2007-01-01

    Desert RATS (Research and Technology Studies) is a combined group of inter-NASA center scientists and engineers, collaborating with representatives of industry and academia, for the purpose of conducting remote field exercises. These exercises provide the capability to validate experimental hardware and software, to evaluate and develop mission operational techniques, and to identify and establish technical requirements applicable for future planetary exploration. D-RATS completed its ninth year of field testing in September 2006. Dry run test activities prior to testing at designated remote field site locations are initially conducted at the Johnson Space Center (JSC) Remote Field Demonstration Test Site. This is a multi-acre external test site located at JSC with detailed representative terrain features simulating both lunar and Mars surface characteristics. The majority of the remote field tests have been subsequently conducted in various high desert areas adjacent to Flagstaff, Arizona. Both the local JSC and remote field test sites have terrain conditions that are representative of both the Moon and Mars, such as strewn rock and volcanic ash fields, meteorite crater ejecta blankets, rolling plains, hills, gullies, slopes, and outcrops. Flagstaff is the preferred remote test site location for many reasons. First, there are nine potential test sites with representative terrain features within a 75-mile radius. Second, Flagstaff is the location of the United States Geological Survey (USGS) Astrogeology Branch, which historically supported Apollo astronaut geologic training and currently supports and provides host accommodations to the D-RATS team. Finally, given the importance of logistics in providing the necessary level of support, the Flagstaff area offers substantial logistics support and lodging accommodations for team members during long hours of field operations.

  2. AirSTAR Hardware and Software Design for Beyond Visual Range Flight Research

    NASA Technical Reports Server (NTRS)

    Laughter, Sean; Cox, David

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Airborne Subscale Transport Aircraft Research (AirSTAR) Unmanned Aerial System (UAS) is a facility developed to study the flight dynamics of vehicles in emergency conditions, in support of aviation safety research. The system was upgraded to have its operational range significantly expanded, going beyond the line of sight of a ground-based pilot. A redesign of the airborne flight hardware was undertaken, as well as significant changes to the software base, in order to provide appropriate autonomous behavior in response to a number of potential failures and hazards. Ground hardware and system monitors were also upgraded to include redundant communication links, including ADS-B based position displays and an independent flight termination system. The design included both custom and commercially available avionics, combined to allow flexibility in flight experiment design while still benefiting from tested configurations in reversionary flight modes. A similar hierarchy was employed in the software architecture, to allow research codes to be tested, with a fallback to more thoroughly validated flight controls. As a remotely piloted facility, ground systems were also developed to ensure the flight modes and system state were communicated to ground operations personnel in real-time. Presented in this paper is a general overview of the concept of operations for beyond visual range flight, and a detailed review of the airborne hardware and software design. This discussion is held in the context of the safety and procedural requirements that drove many of the design decisions for the AirSTAR UAS Beyond Visual Range capability.

  3. 3D display considerations for rugged airborne environments

    NASA Astrophysics Data System (ADS)

    Barnidge, Tracy J.; Tchon, Joseph L.

    2015-05-01

    The KC-46 is the next generation, multi-role, aerial refueling tanker aircraft being developed by Boeing for the United States Air Force. Rockwell Collins has developed the Remote Vision System (RVS) that supports aerial refueling operations under a variety of conditions. The system utilizes large-area, high-resolution 3D displays linked with remote sensors to enhance the operator's visual acuity for precise aerial refueling control. This paper reviews the design considerations, trade-offs, and other factors related to the selection and ruggedization of the 3D display technology for this military application.

  4. Irrigated rice area estimation using remote sensing techniques: Project's proposal and preliminary results. [Rio Grande do Sul, Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Deassuncao, G. V.; Moreira, M. A.; Novaes, R. A.

    1984-01-01

    The development of a methodology for annual estimates of irrigated rice crop in the State of Rio Grande do Sul, Brazil, using remote sensing techniques is proposed. The project involves interpretation, digital analysis, and sampling techniques of LANDSAT imagery. Results are discussed from a preliminary phase for identifying and evaluating irrigated rice crop areas in four counties of the State, for the crop year 1982/1983. This first phase involved just visual interpretation techniques of MSS/LANDSAT images.

  5. Feasibility of Using Remotely Sensed Data to Aid in Long-Term Monitoring of Biodiversity

    NASA Technical Reports Server (NTRS)

    Carroll, Mark L.; Brown, Molly E.; Elders, Akiko; Johnson, Kiersten

    2014-01-01

    Remote sensing is defined as making observations of an event or phenomenon without physically sampling it. Typically this is done with instruments and sensors mounted on anything from poles extended over a cornfield, to airplanes, to satellites orbiting the Earth. The sensors have characteristics that allow them to detect and record information regarding the emission and reflectance of electromagnetic energy from a surface or object. That information can then be represented visually on a screen or paper map or used in data analysis to inform decision-making.

  6. The DAST-1 remotely piloted research vehicle development and initial flight testing

    NASA Technical Reports Server (NTRS)

    Kotsabasis, A.

    1981-01-01

    The development and initial flight testing of the DAST (drones for aerodynamic and structural testing) remotely piloted research vehicle, fitted with the first aeroelastic research wing ARW-I are presented. The ARW-I is a swept supercritical wing, designed to exhibit flutter within the vehicle's flight envelope. An active flutter suppression system (FSS) designed to increase the ARW-I flutter boundary speed by 20 percent is described. The development of the FSS was based on prediction techniques of structural and unsteady aerodynamic characteristics. A description of the supporting ground facilities and aircraft systems involved in the remotely piloted research vehicle (RPRV) flight test technique is given. The design, specification, and testing of the remotely augmented vehicle system are presented. A summary of the preflight and flight test procedures associated with the RPRV operation is given. An evaluation of the blue streak test flight and the first and second ARW-I test flights is presented.

  7. Methods and potentials for using satellite image classification in school lessons

    NASA Astrophysics Data System (ADS)

    Voss, Kerstin; Goetzke, Roland; Hodam, Henryk

    2011-11-01

    The FIS project - FIS stands for Fernerkundung in Schulen (Remote Sensing in Schools) - aims at a better integration of the topic "satellite remote sensing" in school lessons. Accordingly, the overarching objective is to teach pupils basic knowledge of remote sensing and its fields of application. Despite the growing significance of digital geomedia, the topic "remote sensing" is not broadly supported in schools. Often, the topic is reduced to a short reflection on satellite images and used only as additional illustration of issues relevant to the curriculum. Without addressing how the image data are actually handled, this can hardly contribute to the improvement of the pupils' methodical competences. Because remote sensing covers more than simple visual interpretation of satellite images, it is necessary to integrate remote sensing methods like preprocessing, classification and change detection. Dealing with these topics often fails because of confusing background information and the lack of easy-to-use software. Based on these insights, the FIS project created several simple analysis tools for remote sensing in school lessons, which enable teachers as well as pupils to be introduced to the topic in a structured way. The functionality and the fields of application of these analysis tools will be presented in detail, illustrated by three different tools for satellite image classification.
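    The abstract does not say which algorithms the FIS classroom tools implement; as one generic illustration of unsupervised satellite image classification (k-means clustering of per-pixel band values, chosen here purely as an example and not necessarily the FIS approach), a minimal sketch could look like this.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Illustrative stand-in for a small multispectral scene: (rows, cols, bands).
    rng = np.random.default_rng(0)
    scene = rng.random((50, 50, 4))

    # Reshape to (pixels, bands) and cluster into a handful of land-cover classes.
    pixels = scene.reshape(-1, scene.shape[-1])
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)

    # Back to image shape: each pixel now carries a class index 0-4.
    class_map = labels.reshape(scene.shape[:2])
    print(np.bincount(labels))  # pixel count per class
    ```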

  8. Zooming into creativity: individual differences in attentional global-local biases are linked to creative thinking.

    PubMed

    Zmigrod, Sharon; Zmigrod, Leor; Hommel, Bernhard

    2015-01-01

    While recent studies have investigated how processes underlying human creativity are affected by particular visual-attentional states, we tested the impact of more stable attention-related preferences. These were assessed by means of Navon's global-local task, in which participants respond to the global or local features of large letters constructed from smaller letters. Three standard measures were derived from this task: the sizes of the global precedence effect, the global interference effect (i.e., the impact of incongruent letters at the global level on local processing), and the local interference effect (i.e., the impact of incongruent letters at the local level on global processing). These measures were correlated with performance in a convergent-thinking creativity task (the Remote Associates Task), a divergent-thinking creativity task (the Alternate Uses Task), and a measure of fluid intelligence (Raven's matrices). Flexibility in divergent thinking was predicted by the local interference effect while convergent thinking was predicted by intelligence only. We conclude that a stronger attentional bias to visual information about the "bigger picture" promotes cognitive flexibility in searching for multiple solutions.

  9. Sensitivity quantification of remote detection NMR and MRI

    NASA Astrophysics Data System (ADS)

    Granwehr, J.; Seeley, J. A.

    2006-04-01

    A sensitivity analysis is presented of the remote detection NMR technique, which facilitates the spatial separation of encoding and detection of spin magnetization. Three different cases are considered: remote detection of a transient signal that must be encoded point-by-point like a free induction decay, remote detection of an experiment where the transient dimension is reduced to one data point like phase encoding in an imaging experiment, and time-of-flight (TOF) flow visualization. For all cases, the sensitivity enhancement is proportional to the relative sensitivity between the remote detector and the circuit that is used for encoding. It is shown for the case of an encoded transient signal that the sensitivity does not scale unfavorably with the number of encoded points compared to direct detection. Remote enhancement scales as the square root of the ratio of corresponding relaxation times in the two detection environments. Thus, remote detection especially increases the sensitivity of imaging experiments of porous materials with large susceptibility gradients, which cause a rapid dephasing of transverse spin magnetization. Finally, TOF remote detection, in which the detection volume is smaller than the encoded fluid volume, allows partial images corresponding to different time intervals between encoding and detection to be recorded. These partial images, which contain information about the fluid displacement, can be recorded, in an ideal case, with the same sensitivity as the full image detected in a single step with a larger coil.
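    Read together, the two scalings stated above can be summarized compactly; the formula below is an editorial shorthand rather than notation from the paper, with S denoting detector sensitivity and T the relevant relaxation time in the encoding (enc) and remote detection (det) environments.

    ```latex
    \varepsilon_{\mathrm{remote}} \;\propto\; \frac{S_{\mathrm{det}}}{S_{\mathrm{enc}}}\,\sqrt{\frac{T_{\mathrm{det}}}{T_{\mathrm{enc}}}}
    ```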

  10. Robot Tracer with Visual Camera

    NASA Astrophysics Data System (ADS)

    Jabbar Lubis, Abdul; Dwi Lestari, Yuyun; Dafitri, Haida; Azanuddin

    2017-12-01

    A robot is a versatile tool that can take over human work functions and can be reprogrammed according to user needs. Wireless networks can be used for remote monitoring, making it possible to build a robot whose movement can be monitored against a blueprint and whose chosen path can be traced, with the monitoring data sent over the wireless network. For visual feedback, the robot uses a high-resolution camera so that the operator can control the robot and observe the surrounding circumstances.

  11. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.

    2017-12-01

    The rise of cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific datasets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream and outline how to use Cloudstream to run and access an existing desktop application in the cloud.

  12. Visualizing along-strike change in deformation style using analog modeling and digital visualization software

    NASA Astrophysics Data System (ADS)

    Burberry, C. M.

    2012-12-01

    It is a well-known phenomenon that deformation style varies in space, both along the strike of a deformed belt and along the strike of individual structures within that belt. This variation in deformation style is traditionally visualized with a series of closely spaced 2D cross-sections. However, the use of 2D section lines implies plane strain along those lines, and the true 3D nature of the deformation is not necessarily captured. By using a combination of remotely sensed data, analog modeling of field datasets and this remote data, and numerical and digital visualization of the finished model, a 3D understanding and restoration of the deformation style within the region can be achieved. The workflow used for this study begins by considering the variation in deformation style that can be observed from satellite images and combining these data with traditional field data, in order to understand the deformation in the region under consideration. The conceptual model developed at this stage is then modeled using a sand and silicone modeling system, where the kinematics and dynamics of the deformation processes can be examined. A series of closely spaced cross-sections, as well as 3D images of the deformation, are created from the analog model and input into a digital visualization and modeling system for restoration. In this fashion, a valid 3D model is created in which the internal structure of the deformed system can be visualized and mined for information. The region used in the study is the Sawtooth Range, Montana. The region forms part of the Montana Disturbed Belt in the Front Ranges of the Rocky Mountains, along strike from the Alberta Syncline in the Canadian Rocky Mountains. Interpretation of satellite data indicates that the deformation-front structures include both folds and thrust structures. The thrust structures vary from hinterland-verging triangle zones to foreland-verging imbricate thrusts along strike, and the folds also vary in geometry along strike. The analog models, constrained by data from exploration wells, indicate that this change in geometry is related to a change in mechanical stratigraphy along the strike of the belt. Results from the kinematic and dynamic analysis of the digital model will also be presented. Additional implications of such a workflow and visualization system include the possibility of creating and viewing multiple cross-sections, including sections created at oblique angles to the original model. This allows the analysis of the non-plane-strain component of the models and thus a more complete analysis, understanding and visualization of the deformed region. This workflow and visualization system is applicable to any region where traditional field methods must be coupled with remote data, intensely processed depth data, or analog modeling systems in order to generate valid geologic or geophysical models.

  13. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the world, which necessitate scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectral analysis, parameter retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  14. Remote operation: a selective review of research into visual depth perception.

    PubMed

    Reinhardt-Rutland, A H

    1996-07-01

    Some perceptual motor operations are performed remotely; examples include the handling of life-threatening materials and surgical procedures. A camera conveys the site of operation to a TV monitor, so depth perception relies mainly on pictorial information, perhaps with enhancement of the occlusion cue by motion. However, motion information such as motion parallax is not likely to be important. The effectiveness of pictorial information is diminished by monocular and binocular information conveying flatness of the screen and by difficulties in scaling: Only a degree of relative depth can be conveyed. Furthermore, pictorial information can mislead. Depth perception is probably adequate in remote operation, if target objects are well separated, with well-defined edges and familiar shapes. Stereoscopic viewing systems are being developed to introduce binocular information to remote operation. However, stereoscopic viewing is problematic because binocular disparity conflicts with convergence and monocular information. An alternative strategy to improve precision in remote operation may be to rely on individuals who lack binocular function: There is redundancy in depth information, and such individuals seem to compensate for the lack of binocular function.

  15. Remote sensing: a tool for park planning and management

    USGS Publications Warehouse

    Draeger, William C.; Pettinger, Lawrence R.

    1981-01-01

    Remote sensing may be defined as the science of imaging or measuring objects from a distance. More commonly, however, the term is used in reference to the acquisition and use of photographs, photo-like images, and other data acquired from aircraft and satellites. Thus, remote sensing includes the use of such diverse materials as photographs taken by hand from a light aircraft, conventional aerial photographs obtained with a precision mapping camera, satellite images acquired with sophisticated scanning devices, radar images, and magnetic and gravimetric data that may not even be in image form. Remotely sensed images may be color or black and white, can vary in scale from those that cover only a few hectares of the earth's surface to those that cover tens of thousands of square kilometers, and they may be interpreted visually or with the assistance of computer systems. This article attempts to describe several of the commonly available types of remotely sensed data, to discuss approaches to data analysis, and to demonstrate (with image examples) typical applications that might interest managers of parks and natural areas.

  16. Remote Sensing Image Fusion Method Based on Nonsubsampled Shearlet Transform and Sparse Representation

    NASA Astrophysics Data System (ADS)

    Moonon, Altan-Ulzii; Hu, Jianwen; Li, Shutao

    2015-12-01

    Remote sensing image fusion is an important preprocessing technique in remote sensing image processing. In this paper, a remote sensing image fusion method based on the nonsubsampled shearlet transform (NSST) with sparse representation (SR) is proposed. First, the low-resolution multispectral (MS) image is upsampled and its color space is transformed from Red-Green-Blue (RGB) to Intensity-Hue-Saturation (IHS). Then, the high-resolution panchromatic (PAN) image and the intensity component of the MS image are decomposed by NSST into high- and low-frequency coefficients. The low-frequency coefficients of the PAN image and the intensity component are fused by SR with a learned dictionary, while the high-frequency coefficients are fused by a local-energy-based fusion rule. Finally, the fused result is obtained by performing the inverse NSST and the inverse IHS transform. Experimental results on IKONOS and QuickBird imagery demonstrate that the proposed method provides better spectral quality and superior spatial information in the fused image than other remote sensing image fusion methods, in both visual terms and objective evaluation.
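    The NSST decomposition and sparse-representation fusion stages are too involved to reproduce here, but the surrounding colour-space step is standard; the sketch below shows only a mean-based IHS intensity extraction and substitution (for which forward IHS, intensity replacement, and inverse IHS reduce to adding the intensity difference to each band), with the PAN band standing in for the NSST/SR-fused intensity. All names are illustrative, not the paper's code.

    ```python
    import numpy as np

    def ihs_intensity(ms):
        """Mean-based intensity component of an RGB multispectral image."""
        return ms.mean(axis=-1)

    def substitute_intensity(ms, new_intensity):
        """Generalized IHS step: replace the intensity component and invert.

        Adding (new_I - old_I) to every band is algebraically equivalent to
        forward IHS, intensity substitution, and inverse IHS for the
        mean-based intensity used here.
        """
        detail = new_intensity - ihs_intensity(ms)
        return np.clip(ms + detail[..., None], 0.0, 1.0)

    # Toy data: a 4x4 upsampled MS image and a co-registered PAN image.
    rng = np.random.default_rng(1)
    ms_up = rng.random((4, 4, 3))
    pan = rng.random((4, 4))

    # In the paper the new intensity comes from NSST decomposition plus SR and
    # local-energy fusion of PAN with the MS intensity; here the PAN band
    # itself stands in for that fused result.
    fused = substitute_intensity(ms_up, pan)
    print(fused.shape)
    ```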

  17. Remote Monitoring of Soil Water Content, Temperature, and Heat Flow Using Low-Cost Cellular (3G) IoT Technology

    NASA Astrophysics Data System (ADS)

    Ham, J. M.

    2016-12-01

    New microprocessor boards, open-source sensors, and cloud infrastructure developed for the Internet of Things (IoT) can be used to create low-cost monitoring systems for environmental research. This project describes two applications in soil science and hydrology: 1) remote monitoring of the soil temperature regime near oil and gas operations to detect the thermal signature associated with natural source-zone degradation of hydrocarbon contaminants in the vadose zone, and 2) remote monitoring of soil water content near the surface as part of a global citizen science network. In both cases, prototype data collection systems were built around the cellular (2G/3G) "Electron" microcontroller (www.particle.io). This device allows connectivity to the cloud using a low-cost global SIM and data plan. The systems have cellular connectivity in over 100 countries and data can be logged to the cloud for storage. Users can view data in real time over any internet connection or via their smartphone. For both projects, data logging, storage, and visualization were done using IoT services like ThingSpeak (thingspeak.com). The soil thermal monitoring system was tested on experimental plots in Colorado, USA, to evaluate the accuracy and reliability of different temperature sensors and 3D-printed housings. The soil water experiment included a comparison of open-source capacitance-based sensors with commercial versions. The results demonstrate the power of leveraging IoT technology for field research.
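    As a sketch of the cloud-logging step only: ThingSpeak channels are updated by sending a channel write key and numbered field values to its update endpoint. The key, field assignments, and function below are placeholders, and the Electron itself would do the equivalent from its firmware over cellular rather than from Python.

    ```python
    import requests

    THINGSPEAK_URL = "https://api.thingspeak.com/update"
    WRITE_API_KEY = "YOUR_WRITE_KEY"  # placeholder channel write key

    def log_reading(soil_temp_c, soil_vwc):
        """Post one soil temperature / water-content sample to a ThingSpeak channel."""
        payload = {
            "api_key": WRITE_API_KEY,
            "field1": soil_temp_c,   # field mapping chosen here for illustration
            "field2": soil_vwc,
        }
        r = requests.post(THINGSPEAK_URL, data=payload, timeout=10)
        r.raise_for_status()
        return int(r.text)  # ThingSpeak returns the new entry ID (0 on failure)

    if __name__ == "__main__":
        print(log_reading(soil_temp_c=17.4, soil_vwc=0.21))
    ```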

  18. A Virtual Instrument Panel and Serial Interface for the Parr 1672 Thermometer

    ERIC Educational Resources Information Center

    Salter, Gail; Range, Kevin; Salter, Carl

    2005-01-01

    The features of a Visual Basic program that implements a virtual instrument panel and serial interface for the Parr 1672 thermometer are described. The program permits remote control of the calorimetry experiment and also provides control of the flow of data and of file storage.

  19. Telepathology. Long-distance diagnosis.

    PubMed

    Weinstein, R S; Bloom, K J; Rozek, L S

    1989-04-01

    Telepathology is defined as the practice of pathology at a distance, by visualizing an image on a video monitor rather than viewing a specimen directly through a microscope. Components of a telepathology system include the following: (1) a workstation equipped with a high-resolution video camera attached to a remote-controlled light microscope; (2) a pathologist workstation incorporating controls for manipulating the robotic microscope as well as a high-resolution video monitor; and (3) a telecommunications link. Progress has been made in designing and constructing telepathology workstations and fully motorized, computer-controlled light microscopes suitable for telepathology. In addition, components such as video signal digital encoders and decoders that produce remarkably stable, high-color fidelity, and high-resolution images have been incorporated into the workstations. Resolution requirements for the video microscopy component of telepathology have been formally examined in receiver operator characteristic (ROC) curve analyses. Test-of-concept demonstrations have been completed with the use of geostationary satellites as the broadband communication linkages for 750-line resolution video. Potential benefits of telepathology include providing a means of conveniently delivering pathology services in real-time to remote sites or underserviced areas, time-sharing of pathologists' services by multiple institutions, and increasing accessibility to specialty pathologists.

  20. A Novel Ship-Tracking Method for GF-4 Satellite Sequential Images.

    PubMed

    Yao, Libo; Liu, Yong; He, You

    2018-06-22

    The geostationary remote sensing satellite has the capability of wide scanning, persistent observation and operational response, and has tremendous potential for maritime target surveillance. The GF-4 satellite is the first geostationary orbit (GEO) optical remote sensing satellite with medium resolution in China. In this paper, a novel ship-tracking method for GF-4 satellite sequential imagery is proposed. The algorithm has three stages. First, a local visual saliency map based on the local peak signal-to-noise ratio (PSNR) is used to detect ships in a single frame of the GF-4 sequential images. Second, accurate positioning of each potential target is achieved through a dynamic correction using the rational polynomial coefficients (RPCs) and automatic identification system (AIS) data of ships. Finally, an improved multiple hypothesis tracking (MHT) algorithm with amplitude information is used to track ships by further removing false targets, and to estimate the ships' motion parameters. The algorithm has been tested using GF-4 sequential images and AIS data. The results of the experiment demonstrate that the algorithm achieves good tracking performance on GF-4 satellite sequential images and estimates the motion information of ships accurately.
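    The paper's exact local-PSNR definition is not given in the abstract; the sketch below uses one plausible, clearly assumed formulation (local peak intensity against local mean-squared deviation in a sliding window) purely to illustrate the idea of a PSNR-based saliency map for small bright targets on a dark sea background.

    ```python
    import numpy as np
    from scipy.ndimage import maximum_filter, uniform_filter

    def local_psnr_map(img, win=15, eps=1e-12):
        """Assumed local-PSNR saliency: compare the local peak intensity with the
        local mean-squared deviation in a win x win window (illustrative only)."""
        img = img.astype(np.float64)
        local_peak = maximum_filter(img, size=win)
        local_mean = uniform_filter(img, size=win)
        local_msd = np.maximum(uniform_filter(img**2, size=win) - local_mean**2, 0.0)
        return 10.0 * np.log10((local_peak**2 + eps) / (local_msd + eps))

    # Toy frame: dark sea background with two bright "ship" pixels.
    frame = 0.05 * np.random.default_rng(2).random((64, 64))
    frame[20, 30] = 1.0
    frame[45, 10] = 0.9

    saliency = local_psnr_map(frame)
    candidates = np.argwhere(saliency > np.percentile(saliency, 99.5))
    print(len(candidates), candidates[:5])  # candidate pixels cluster near the ships
    ```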

  1. Subsonic stability and control derivatives for an unpowered, remotely piloted 3/8-scale F-15 airplane model obtained from flight test

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.; Shafer, M. F.

    1976-01-01

    In response to the interest in airplane configuration characteristics at high angles of attack, an unpowered remotely piloted 3/8-scale F-15 airplane model was flight tested. The subsonic stability and control characteristics of this airplane model over an angle of attack range of -20 to 53 deg are documented. The remotely piloted technique for obtaining flight test data was found to provide adequate stability and control derivatives. The remotely piloted technique provided an opportunity to test the aircraft mathematical model in an angle of attack regime not previously examined in flight test. The variation of most of the derivative estimates with angle of attack was found to be consistent, particularly when the data were supplemented by uncertainty levels.

  2. Alaska Testbed for the Fusion of Citizen Science and Remote Sensing of Sea Ice and Snow

    NASA Astrophysics Data System (ADS)

    Walsh, J. E.; Sparrow, E.; Lee, O. A.; Brook, M.; Brubaker, M.; Casas, J.

    2017-12-01

    Citizen science, remote sensing and related environmental information sources for the Alaskan Arctic are synthesized with the objectives of (a) placing local observations into a broader geospatial framework and (b) enabling the use of local observations to evaluate sea ice, snow and land surface products obtained from remote sensing. In its initial phase, the project instituted a coordinated set of community-based observations of sea ice and snow in three coastal communities in western and northern Alaska: Nome, Point Hope and Barrow. Satellite maps of sea ice concentration have been consolidated with the in situ reports, leading to a three-part depiction of surface conditions at each site: narrative reports, surface-based photos, and satellite products. The project has developed a prototype visualization package, enabling users to select a location and date for which the three information sources can be viewed. Visual comparisons of the satellite products and the local reports show generally consistent depictions of the sea ice concentrations in the vicinity of the coastlines, although the satellite products are generally biased low, especially in coastal regions where shorefast ice persists after the appearance of open water farther offshore. A preliminary comparison of the local snow reports and the MODIS daily North American snow cover images indicates that areas of snow persisted in the satellite images beyond the date of snow disappearance reported by the observers. The "in-town" location of most of the snow reports is a factor that must be addressed in further reporting and remote sensing comparisons.

  3. Towards real-time remote processing of laparoscopic video

    NASA Astrophysics Data System (ADS)

    Ronaghi, Zahra; Duffy, Edward B.; Kwartowitz, David M.

    2015-03-01

    Laparoscopic surgery is a minimally invasive surgical technique in which surgeons insert a small video camera into the patient's body to visualize internal organs, along with small tools to perform surgical procedures. However, the benefit of small incisions has a drawback of limited visualization of subsurface tissues, which can lead to navigational challenges in the delivery of therapy. Image-guided surgery (IGS) uses images to map subsurface structures and can reduce the limitations of laparoscopic surgery. One particular laparoscopic camera system of interest is the vision system of the daVinci-Si robotic surgical system (Intuitive Surgical, Sunnyvale, CA, USA). The video streams generate approximately 360 megabytes of data per second, demonstrating a trend towards increased data sizes in medicine, primarily due to higher-resolution video cameras and imaging equipment. Processing this data on a bedside PC has become challenging and a high-performance computing (HPC) environment may not always be available at the point of care. To process this data on remote HPC clusters at the typical 30 frames per second (fps) rate, it is required that each 11.9 MB video frame be processed by a server and returned within 1/30th of a second. The ability to acquire, process and visualize data in real-time is essential for performance of complex tasks as well as minimizing risk to the patient. As a result, utilizing high-speed networks to access computing clusters will lead to real-time medical image processing and improve surgical experiences by providing real-time augmented laparoscopic data. We aim to develop a medical video processing system using an OpenFlow software defined network that is capable of connecting to multiple remote medical facilities and HPC servers.
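    The per-frame figures quoted above follow directly from the stream rate and frame rate; a quick arithmetic check (values taken from the paragraph above):

    ```python
    # Back-of-the-envelope check of the per-frame processing budget.
    stream_rate_mb_per_s = 360      # approximate daVinci-Si video data rate
    fps = 30

    frame_size_mb = stream_rate_mb_per_s / fps   # ~12 MB, consistent with the ~11.9 MB figure
    frame_deadline_ms = 1000.0 / fps             # ~33.3 ms round trip per frame
    print(frame_size_mb, frame_deadline_ms)
    ```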

  4. Remote-controlled pan, tilt, zoom cameras at Kilauea and Mauna Loa Volcanoes, Hawai'i

    USGS Publications Warehouse

    Hoblitt, Richard P.; Orr, Tim R.; Castella, Frederic; Cervelli, Peter F.

    2008-01-01

    Lists of important volcano-monitoring disciplines usually include seismology, geodesy, and gas geochemistry. Visual monitoring - the essence of volcanology - is usually not mentioned. Yet, observations of the outward appearance of a volcano provide data that are equally as important as those provided by the other disciplines. The eye was almost certainly the first volcano-monitoring tool used by early man. Early volcanology was mostly descriptive and was based on careful visual observations of volcanoes. There is still no substitute for the eye of an experienced volcanologist. Today, scientific instruments replace or augment our senses as monitoring tools because instruments are faster and more sensitive, work tirelessly day and night, keep better records, operate in hazardous environments, do not generate lawsuits when damaged or destroyed, and in most cases are cheaper. Furthermore, instruments are capable of detecting phenomena that are outside the reach of our senses. The human eye is now augmented by the camera. Sequences of timed images provide a record of visual phenomena that occur on and above the surface of volcanoes. Photographic monitoring is a fundamental monitoring tool; image sequences can often provide the basis for interpreting other data streams. Monitoring data are most useful when they are generated and are available for analysis in real-time or near real-time. This report describes the current (as of 2006) system for real-time photograph acquisition and transmission from remote sites on Kilauea and Mauna Loa volcanoes to the U.S. Geological Survey Hawaiian Volcano Observatory (HVO). It also describes how the photographs are archived and analyzed. In addition to providing system documentation for HVO, we hope that the report will prove useful as a practical guide to the construction of a high-bandwidth network for the telemetry of real-time data from remote locations.

  5. Large-mirror testing facility at the National Optical Astronomy Observatories

    NASA Astrophysics Data System (ADS)

    Coudé du Foresto, V.; Fox, J.; Poczulp, G. A.; Richardson, J.; Roddier, Claude; Roddier, Francois; Barr, L. D.

    1991-09-01

    A method for testing the surfaces of large mirrors has been developed that can be used even when vibration and thermal turbulence in the light path cannot be eliminated. The full aperture of the mirror under test is examined by means of a scatterplate interferometer, which has the property of being a quasi-common-path method, although any means of obtaining interference fringes can be used. By operating the test equipment remotely, the optician does not introduce unnecessary vibrations or heat into the testing area. The typical test is done with a camera exposure of about a millisecond to 'freeze' the fringe pattern on the detector. Averaging up to 10 separate exposures effectively eliminates the turbulence effects. From the intensity information, a phase map of the wavefront reflected from the surface is obtained using a phase-unwrapping technique. The method provides the optician with complete numerical information and visual plots for the surface under test and for the diffracted image that surface will produce, to an accuracy of 0.01 micron measured peak-to-valley. The method has been used extensively for a variety of tests of a 1.8-m-diameter borosilicate-glass honeycomb mirror, where it was shown to have a sensitivity equal to that of a Foucault test.
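    The fringe-analysis step that converts averaged fringe intensities into a wrapped phase map is specific to the interferometer and is not reproduced here; as an illustrative sketch of the averaging and unwrapping ideas only (synthetic data, with the averaging done on complex phasors rather than on raw fringe intensities), one might write:

    ```python
    import numpy as np
    from skimage.restoration import unwrap_phase

    # Toy wrapped-phase maps standing in for the result of fringe analysis of
    # several short-exposure interferograms.
    rng = np.random.default_rng(3)
    y, x = np.mgrid[0:128, 0:128]
    true_phase = 6.0 * np.pi * (x / 128.0) ** 2          # smooth test wavefront spanning >2*pi
    exposures = [np.angle(np.exp(1j * (true_phase + 0.3 * rng.standard_normal((128, 128)))))
                 for _ in range(10)]

    # Average the exposures in the complex domain to suppress turbulence noise,
    # then unwrap the mean wrapped phase to recover a continuous surface map.
    mean_wrapped = np.angle(np.mean(np.exp(1j * np.array(exposures)), axis=0))
    surface_phase = unwrap_phase(mean_wrapped)
    print(float(np.ptp(surface_phase)))   # peak-to-valley of the recovered wavefront (radians)
    ```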

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marrinan, Thomas; Leigh, Jason; Renambot, Luc

    Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted in order to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. It is believed that this research will lead to better utilization of large-scale tiled display walls for distributed group work.

  7. Development and implementation of software systems for imaging spectroscopy

    USGS Publications Warehouse

    Boardman, J.W.; Clark, R.N.; Mazer, A.S.; Biehl, L.L.; Kruse, F.A.; Torson, J.; Staenz, K.

    2006-01-01

    Specialized software systems have played a crucial role throughout the twenty-five year course of the development of the new technology of imaging spectroscopy, or hyperspectral remote sensing. By their very nature, hyperspectral data place unique and demanding requirements on the computer software used to visualize, analyze, process and interpret them. Often described as a marriage of the two technologies of reflectance spectroscopy and airborne/spaceborne remote sensing, imaging spectroscopy, in fact, produces data sets with unique qualities, unlike previous remote sensing or spectrometer data. Because of these unique spatial and spectral properties hyperspectral data are not readily processed or exploited with legacy software systems inherited from either of the two parent fields of study. This paper provides brief reviews of seven important software systems developed specifically for imaging spectroscopy.

  8. Private business: the uptake of confidential HIV testing in remote aboriginal communities on the Anangu Pitjantjatjara Lands.

    PubMed

    Miller, P J; Torzillo, P J

    1998-10-01

    Despite a concentration of risk factors for HIV transmission, many remote Aboriginal communities in central Australia have a low uptake of HIV testing. We studied the uptake of HIV testing in six clinics in remote Aboriginal communities following the introduction of voluntary confidential testing to assess the impact of the intervention and to determine if the program was reaching people most at risk of HIV infection and transmission. The study was conducted by Nganampa Health Council, an Aboriginal-controlled health service on the Anangu Pitjantjatjara Lands in the far north-west of South Australia. Since the introduction of confidential coded testing in August 1994 the number of HIV tests provided through the remote clinics has increased from 83 tests/year to 592 tests/year. In the 12-month audit period (August 1, 1995, to July 31, 1996) 62.7% of women aged 20-24 years, 44.6% of people aged 12-40 years and 24% of the total population had an HIV test. Fifty per cent of tests were accounted for by the 15-25 year age groups and 60% of tests related to an STD consult. This study shows that a high uptake of HIV testing in high-risk groups can be achieved in remote Aboriginal communities where a high level of confidentiality is maintained.

  9. Remote Laboratory and Animal Behaviour: An Interactive Open Field System

    ERIC Educational Resources Information Center

    Fiore, Lorenzo; Ratti, Giovannino

    2007-01-01

    Remote laboratories can provide distant learners with practical acquisitions which would otherwise remain precluded. Our proposal here is a remote laboratory on a behavioural test (open field test), with the aim of introducing learners to the observation and analysis of stereotyped behaviour in animals. A real-time video of a mouse in an…

  10. A Comparative Study of Online Remote Proctored versus Onsite Proctored High-Stakes Exams

    ERIC Educational Resources Information Center

    Weiner, John A.; Hurtz, Gregory M.

    2017-01-01

    Advances in technology have spurred innovations in secure assessment delivery. One such innovation, remote online proctoring, has become increasingly sophisticated and is gaining wider consideration for high-stakes testing. However, there is an absence of published research examining remote online proctoring and its effects on test scores and the…

  11. Remotely Piloted Aircraft Systems as a Rhinoceros Anti-Poaching Tool in Africa

    PubMed Central

    Mulero-Pázmány, Margarita; Stolper, Roel; van Essen, L. D.; Negro, Juan J.; Sassen, Tyrell

    2014-01-01

    Over the last years there has been a massive increase in rhinoceros poaching incidents, with more than two individuals killed per day in South Africa in the first months of 2013. Immediate actions are needed to preserve current populations and the agents involved in their protection are demanding new technologies to increase their efficiency in the field. We assessed the use of remotely piloted aircraft systems (RPAS) to monitor for poaching activities. We performed 20 flights with 3 types of cameras: visual photo, HD video and thermal video, to test the ability of the systems to detect (a) rhinoceros, (b) people acting as poachers and (c) to do fence surveillance. The study area consisted of several large game farms in KwaZulu-Natal province, South Africa. The targets were better detected at the lowest altitudes, but to operate the plane safely and in a discreet way, altitudes between 100 and 180 m were the most convenient. Open areas facilitated target detection, while forest habitats complicated it. Detectability using visual cameras was higher in the morning and at midday, but the thermal camera provided the best images in the morning and at night. Considering not only the technical capabilities of the systems but also the poachers' modus operandi and the current control methods, we propose RPAS usage as a tool for surveillance of sensitive areas, for supporting field anti-poaching operations, as a deterrent tool for poachers and as a complementary method for rhinoceros ecology research. Here, we demonstrate that low-cost RPAS can be useful for rhinoceros stakeholders for field control procedures. There are, however, important practical limitations that should be considered for their successful and realistic integration in the anti-poaching battle. PMID:24416177

  12. Remotely piloted aircraft systems as a rhinoceros anti-poaching tool in Africa.

    PubMed

    Mulero-Pázmány, Margarita; Stolper, Roel; van Essen, L D; Negro, Juan J; Sassen, Tyrell

    2014-01-01

    Over the last years there has been a massive increase in rhinoceros poaching incidents, with more than two individuals killed per day in South Africa in the first months of 2013. Immediate actions are needed to preserve current populations and the agents involved in their protection are demanding new technologies to increase their efficiency in the field. We assessed the use of remotely piloted aircraft systems (RPAS) to monitor for poaching activities. We performed 20 flights with 3 types of cameras: visual photo, HD video and thermal video, to test the ability of the systems to detect (a) rhinoceros, (b) people acting as poachers and (c) to do fence surveillance. The study area consisted of several large game farms in KwaZulu-Natal province, South Africa. The targets were better detected at the lowest altitudes, but to operate the plane safely and in a discreet way, altitudes between 100 and 180 m were the most convenient. Open areas facilitated target detection, while forest habitats complicated it. Detectability using visual cameras was higher in the morning and at midday, but the thermal camera provided the best images in the morning and at night. Considering not only the technical capabilities of the systems but also the poachers' modus operandi and the current control methods, we propose RPAS usage as a tool for surveillance of sensitive areas, for supporting field anti-poaching operations, as a deterrent tool for poachers and as a complementary method for rhinoceros ecology research. Here, we demonstrate that low-cost RPAS can be useful for rhinoceros stakeholders for field control procedures. There are, however, important practical limitations that should be considered for their successful and realistic integration in the anti-poaching battle.

  13. Marshall Space Flight Center Telescience Resource Kit

    NASA Technical Reports Server (NTRS)

    Wade, Gina

    2016-01-01

    Telescience Resource Kit (TReK) is a suite of software applications that can be used to monitor and control assets in space or on the ground. The Telescience Resource Kit was originally developed for the International Space Station program. Since then it has been used to support a variety of NASA programs and projects, including the WB-57 Ascent Vehicle Experiment (WAVE) project, the Fast Affordable Science and Technology Satellite (FASTSAT) project, and the Constellation Program. The Payload Operations Center (POC), also known as the Payload Operations Integration Center (POIC), provides the capability for payload users to operate their payloads from their home sites. In this environment, TReK provides local ground support system services and an interface for utilizing remote services provided by the POC. TReK provides ground system services for local and remote payload user sites, including International Partner sites, Telescience Support Centers, and U.S. Investigator sites in over 40 locations worldwide. Its general capabilities include: support for various data interfaces such as User Datagram Protocol, Transmission Control Protocol, and serial interfaces; data services to retrieve, process, record, play back, forward, and display data (ground-based data or telemetry data); command services to create, modify, send, and track commands; command management, whereby one TReK system can be configured to serve as a command server/filter for other TReK systems; databases used to store telemetry and command definition information; an Application Programming Interface (API), an ANSI C interface compatible with commercial products such as Visual C++, Visual Basic, LabVIEW, and Borland C++, which provides a bridge for users to develop software to access and extend TReK services; and support for development, test, simulation, training, and flight environments, including standalone training simulators.
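
    As a rough illustration of the kind of UDP data interface mentioned above, a minimal Python sketch of a telemetry listener follows. This is not the TReK API (none of its calls are reproduced here); the port number and packet layout are assumptions.

```python
# Minimal sketch of a UDP telemetry listener, illustrating the kind of
# "data interface" described above. This is NOT the TReK API; the port
# number and packet layout (a 4-byte counter plus one float) are assumptions.
import socket
import struct

TELEMETRY_PORT = 5555  # hypothetical port


def listen(max_packets=10):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", TELEMETRY_PORT))
    try:
        for _ in range(max_packets):
            data, addr = sock.recvfrom(1024)
            # Assumed layout: unsigned 32-bit packet counter followed by one float.
            counter, value = struct.unpack("!If", data[:8])
            print(f"packet {counter} from {addr[0]}: value={value:.3f}")
    finally:
        sock.close()


if __name__ == "__main__":
    listen()
```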

  14. Mapping and Visualization of The Deepwater Horizon Oil Spill Using Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Ferreira Pichardo, E.

    2017-12-01

    Satellites are man-made objects orbiting the Earth and are essential for Earth observation, i.e. the monitoring and gathering of data about the Earth's vital systems. Environmental satellites are used for atmospheric research, weather forecasting and warning, as well as monitoring extreme weather events. These satellites are categorized into geosynchronous and low Earth (polar) orbiting satellites. Visualizing satellite data is critical to understanding the Earth's systems and changes to our environment. The objective of this research is to examine satellite-based remotely sensed data that needs to be processed and rendered in the form of maps or other forms of visualization in order to understand and interpret the satellites' observations and to monitor the status, changes and evolution of the mega-disaster Deepwater Horizon spill that occurred on April 20, 2010 in the Gulf of Mexico. In this project, we will use an array of tools and programs such as Python, CSPP and Linux. Also, we will use data from the National Oceanic and Atmospheric Administration (NOAA) polar-orbiting satellites Terra (Earth Observing System AM-1, EOS AM-1) and Aqua (EOS PM-1) to investigate the mega-disaster. Each of these satellites carries a variety of instruments, and we will use the data obtained from the remote sensor Moderate-Resolution Imaging Spectroradiometer (MODIS). Ultimately, this study shows the importance of mapping and visualizing data such as satellite data (MODIS) to understand the extent of the environmental impacts of disasters such as the Deepwater Horizon oil spill.
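
    As a sketch of the kind of Python-based processing step alluded to above, the following opens a MODIS granule stored as NetCDF and renders one band as a map image. The file name and variable name are hypothetical, and the study's actual pipeline used CSPP and related tools.

```python
# Minimal sketch of rendering one MODIS variable as an image. The file name and
# variable name are assumptions; the study's actual pipeline used CSPP/Linux tools.
from netCDF4 import Dataset          # pip install netCDF4
import matplotlib.pyplot as plt

GRANULE = "MODIS_granule_example.nc"  # hypothetical file
VARIABLE = "sur_refl_b01"             # hypothetical band name

with Dataset(GRANULE) as nc:
    band = nc.variables[VARIABLE][:]  # assumed to be a 2-D array of reflectances

plt.imshow(band, cmap="viridis")
plt.colorbar(label=VARIABLE)
plt.title("Example MODIS band rendering")
plt.savefig("modis_band.png", dpi=150)
```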

  15. Evaluation of bridge decks using non-destructive evaluation (NDE) at near highway speeds for effective asset management - pilot project.

    DOT National Transportation Integrated Search

    2016-09-29

    This project piloted the findings from an initial research and development project pertaining to the detection, quantification, and visualization of bridge deck distresses through the use of remote sensing techniques, specifically combining optic...

  16. ENVIRONMENTAL SYSTEMS MANAGEMENT AS APPLIED TO WATERSHEDS, UTILIZING REMOTE SENSING, DECISION SUPPORT AND VISUALIZATION

    EPA Science Inventory

    Environmental Systems Management as a conceptual framework and as a set of interdisciplinary analytical approaches will be described within the context of sustainable watershed management, within divergent complex ecosystems. A specific subset of integrated tools is deployed to...

  17. VisiOmatic: Celestial image viewer

    NASA Astrophysics Data System (ADS)

    Bertin, Emmanuel; Marmo, Chiara; Pillay, Ruven

    2014-08-01

    VisiOmatic is a web client for IIPImage (ascl:1408.009) and is used to visualize and navigate through large science images from remote locations. It requires STIFF (ascl:1110.006), is based on the Leaflet JavaScript library, and works on both touch-based and mouse-based devices.

  18. Xi-cam: Flexible High Throughput Data Processing for GISAXS

    NASA Astrophysics Data System (ADS)

    Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sarje, Abinav; Krishnan, Hari; Pellouchoud, Lenson; Ren, Fang; Fournier, Amanda; Jiang, Zhang; Tassone, Christopher; Mehta, Apurva; Sethian, James; Hexemer, Alexander

    With increasing capabilities and data demands at GISAXS beamlines, supporting software is under development to handle larger data rates, volumes, and processing needs. We aim to provide a flexible and extensible approach to GISAXS data treatment as a solution to these rising needs. Xi-cam is the CAMERA platform for data management, analysis, and visualization. The core of Xi-cam is an extensible plugin-based GUI platform which provides users an interactive interface to processing algorithms. Plugins are available for SAXS/GISAXS data and data-series visualization, as well as forward modeling and simulation through HipGISAXS. With Xi-cam's advanced mode, data processing steps are designed as a graph-based workflow, which can be executed locally or remotely. Remote execution utilizes HPC or de-localized resources, allowing for effective reduction of high-throughput data. Xi-cam is open-source and cross-platform. The processing algorithms in Xi-cam include parallel CPU and GPU processing optimizations, also taking advantage of external processing packages such as pyFAI. Xi-cam is available for download online.
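
    The graph-based workflow idea can be illustrated with a small generic sketch; this is not Xi-cam's plugin API, and the step names and toy functions are placeholders. Each processing step is a node whose input is the output of its parent, and steps run in topological order.

```python
# Generic sketch of a graph-based processing workflow (not Xi-cam's API):
# nodes are functions, edges say whose output feeds whom, and execution is in
# topological order so each step sees its parent's result.
from graphlib import TopologicalSorter  # standard library, Python 3.9+


def load(_):      return [1.0, 2.0, 3.0]          # stand-in for reading a frame
def normalize(x): return [v / max(x) for v in x]  # stand-in for a reduction step
def integrate(x): return sum(x)                   # stand-in for azimuthal integration


workflow = {            # edges: step -> set of steps it depends on
    "load": set(),
    "normalize": {"load"},
    "integrate": {"normalize"},
}
steps = {"load": load, "normalize": normalize, "integrate": integrate}

results = {}
for name in TopologicalSorter(workflow).static_order():
    parents = [results[p] for p in workflow[name]]
    results[name] = steps[name](parents[0] if parents else None)

print(results["integrate"])   # -> 2.0
```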

  19. Web Service Model for Plasma Simulations with Automatic Post Processing and Generation of Visual Diagnostics*

    NASA Astrophysics Data System (ADS)

    Exby, J.; Busby, R.; Dimitrov, D. A.; Bruhwiler, D.; Cary, J. R.

    2003-10-01

    We present our design and initial implementation of a web service model for running particle-in-cell (PIC) codes remotely from a web browser interface. PIC codes have grown significantly in complexity and now often require parallel execution on multiprocessor computers, which in turn requires sophisticated post-processing and data analysis. A significant amount of time and effort is required for a physicist to develop all the necessary skills, at the expense of actually doing research. Moreover, parameter studies with a computationally intensive code justify the systematic management of results with an efficient way to communicate them among a group of remotely located collaborators. Our initial implementation uses the OOPIC Pro code [1], Linux, Apache, MySQL, Python, and PHP. The Interactive Data Language is used for visualization. [1] D.L. Bruhwiler et al., Phys. Rev. ST-AB 4, 101302 (2001). * This work is supported by DOE grant # DE-FG02-03ER83857 and by Tech-X Corp. ** Also University of Colorado.
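
    The submit-and-track pattern that such a web service relies on can be sketched generically in Python; the table layout and field names below are illustrative assumptions (SQLite stands in for the MySQL backend), not the actual implementation described above.

```python
# Generic sketch of a job-submission backend for remote simulation runs:
# job parameters are stored with a status field; a worker process would later
# run the PIC code and attach post-processed plots. SQLite stands in for MySQL,
# and all names here are illustrative assumptions.
import json
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY, params TEXT, status TEXT)")


def submit_job(params):
    """Record a simulation request and return its job id."""
    job_id = uuid.uuid4().hex
    conn.execute("INSERT INTO jobs VALUES (?, ?, ?)",
                 (job_id, json.dumps(params), "queued"))
    conn.commit()
    return job_id


def job_status(job_id):
    row = conn.execute("SELECT status FROM jobs WHERE id = ?", (job_id,)).fetchone()
    return row[0] if row else "unknown"


jid = submit_job({"grid": [128, 128], "particles_per_cell": 16})
print(jid, job_status(jid))   # -> <hex id> queued
```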

  20. Multisource data fusion for documenting archaeological sites

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir; Chibunichev, Alexander; Zhuravlev, Denis

    2017-10-01

    The quality of archaeological site documentation is of great importance for preserving and investigating cultural heritage. The progress in developing new techniques and systems for data acquisition and processing creates an excellent basis for achieving a new quality of archaeological site documentation and visualization. Archaeological data have some specific features which have to be taken into account when they are acquired, processed and managed. First of all, it is necessary to gather as much information as possible about the findings, with no loss of information and no damage to the artifacts. Remote sensing technologies are the most suitable and powerful means of satisfying this requirement. An approach to archaeological data acquisition and fusion based on remote sensing is proposed. It combines a set of photogrammetric techniques for obtaining geometrical and visual information at different scales and levels of detail, and a pipeline for archaeological data documentation, structuring, fusion, and analysis. The proposed approach was applied to documenting the work of the Bosporus archaeological expedition of the Russian State Historical Museum.

  1. Application of NASA Giovanni to Coastal Zone Remote Sensing Research

    NASA Technical Reports Server (NTRS)

    Acker, James; Leptoukh, Gregory; Kempler, Steven; Berrick, Stephen; Rui, Hualan; Shen, Suhung

    2007-01-01

    The Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) provides rapid access to, and enables effective utilization of, remotely-sensed data that are applicable to investigations of coastal environmental processes. Data sets in Giovanni include precipitation data from the Tropical Rainfall Measuring Mission (TRMM), particularly useful for coastal storm investigations; ocean color radiometry data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and the Moderate Resolution Imaging Spectroradiometer (MODIS), useful for water quality evaluation, phytoplankton blooms, and terrestrial-marine interactions; and atmospheric data from MODIS and the Atmospheric Infrared Sounder (AIRS), providing the capability to characterize atmospheric variables. Giovanni provides a simple interface allowing discovery and analysis of environmental data sets with accompanying graphic visualizations. Examples of Giovanni investigations of the coastal zone include hurricane and storm impacts, hydrologically-induced phytoplankton blooms, chlorophyll trend analysis, and dust storm characterization. New and near-future capabilities of Giovanni will be described.

  2. Application of NASA Giovanni to Coastal Zone Remote Sensing Search

    NASA Technical Reports Server (NTRS)

    Acker, James; Leptoukh, Gregory; Kempler, Steven; Berrick, Stephen; Rui, Hualan; Shen, Suhung

    2007-01-01

    The Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization ANd aNalysis Infrastructure (Giovanni) provides rapid access to, and enables effective utilization of, remotely-sensed data that are applicable to investigations of coastal environmental processes. Data sets in Giovanni include precipitation data from the Tropical Rainfall Measuring Mission (TRMM), particularly useful for coastal storm investigations; ocean color radiometry data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) and the Moderate Resolution Imaging Spectroradiometer (MODIS), useful for water quality evaluation, phytoplankton blooms, and terrestrial-marine interactions; and atmospheric data from MODIS and the Atmospheric Infrared Sounder (AIRS), providing the capability to characterize atmospheric variables. Giovanni provides a simple interface allowing discovery and analysis of environmental data sets with accompanying graphic visualizations. Examples of Giovanni investigations of the coastal zone include hurricane and storm impacts, hydrologically-induced phytoplankton blooms, chlorophyll trend analysis, and dust storm characterization. New and near-future capabilities of Giovanni will be described.

  3. Exploring Remote Sensing Products Online with Giovanni for Studying Urbanization

    NASA Technical Reports Server (NTRS)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina; Kempler, Steve

    2012-01-01

    Recently, a large number of MODIS land products at multiple spatial resolutions have been integrated into the online system Giovanni to support studies on land cover and land use changes, focused on the Northern Eurasia and Monsoon Asia regions. Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) is a Web-based application developed by the NASA Goddard Earth Sciences Data and Information Services Center (GES-DISC) that provides a simple and intuitive way to visualize, analyze, and access Earth science remotely-sensed and modeled data. The customized Giovanni Web portals (Giovanni-NEESPI and Giovanni-MAIRS) were created to integrate land, atmospheric, cryospheric, and social products, enabling researchers to do quick exploration and basic analyses of land surface changes and their relationships to climate at global and regional scales. This presentation documents the MODIS land surface products in the Giovanni system. As examples, images and statistical analysis results on land surface and local climate changes associated with urbanization over the Yangtze River Delta region, China, obtained using data in Giovanni, are shown.

  4. Synergistic relationships among remote-sensing and geophysical media: Geological and hydrological applications

    NASA Technical Reports Server (NTRS)

    Goebel, J. E.; Walton, M.; Batten, L. G. (Principal Investigator)

    1980-01-01

    The synergistic relationships among LANDSAT imagery, Skylab photographs, and aerial photographs were useful for establishing areas of near surface bedrock. Lineaments were located on LANDSAT imagery and aerial photographs during 1978 and near surface water tables were to be located during 1980. Both of these subjects can be identified by remote sensing methods more reliably than individual outcrops, which are small and occur in a wide variety of environments with a wide range of responses. Bedrock outcrops themselves could not be resolved by any of the data sources used, nor did any combination of data sources specifically identify rock at the ground surface. The data sources could not simply be combined mathematically to produce a visual image of probable areas of near surface bedrock. Outcrops and near surface bedrock had to be verified visually at the site. Despite these drawbacks, a procedure for locating areas of near surface bedrock within which actual surface outcrops may occur was developed.

  5. Interactive visual optimization and analysis for RFID benchmarking.

    PubMed

    Wu, Yingcai; Chung, Ka-Kei; Qu, Huamin; Yuan, Xiaoru; Cheung, S C

    2009-01-01

    Radio frequency identification (RFID) is a powerful automatic remote identification technique that has wide applications. To facilitate RFID deployment, an RFID benchmarking instrument called aGate has been invented to identify the strengths and weaknesses of different RFID technologies in various environments. However, the data acquired by aGate are usually complex time varying multidimensional 3D volumetric data, which are extremely challenging for engineers to analyze. In this paper, we introduce a set of visualization techniques, namely, parallel coordinate plots, orientation plots, a visual history mechanism, and a 3D spatial viewer, to help RFID engineers analyze benchmark data visually and intuitively. With the techniques, we further introduce two workflow procedures (a visual optimization procedure for finding the optimum reader antenna configuration and a visual analysis procedure for comparing the performance and identifying the flaws of RFID devices) for the RFID benchmarking, with focus on the performance analysis of the aGate system. The usefulness and usability of the system are demonstrated in the user evaluation.
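
    As a toy illustration of one of the visualization techniques named above, the short Python sketch below draws a parallel coordinate plot of made-up benchmark readings; the column names and values are invented, and the real aGate data are far richer (time-varying, multidimensional 3D volumes).

```python
# Toy parallel coordinate plot of invented RFID benchmark readings.
# In practice the numeric columns would be normalized to comparable scales.
import matplotlib.pyplot as plt
import pandas as pd
from pandas.plotting import parallel_coordinates

df = pd.DataFrame({
    "read_rate":  [0.95, 0.88, 0.70, 0.92, 0.65],
    "signal_dbm": [-45, -52, -60, -48, -63],
    "distance_m": [1.0, 1.5, 2.5, 1.2, 3.0],
    "antenna":    ["A", "A", "B", "B", "B"],   # class column used for coloring
})
parallel_coordinates(df, class_column="antenna")
plt.title("Example parallel coordinate view of RFID benchmark readings")
plt.savefig("rfid_parallel_coordinates.png", dpi=150)
```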

  6. The functional neuroanatomy of object agnosia: a case study.

    PubMed

    Konen, Christina S; Behrmann, Marlene; Nishimura, Mayu; Kastner, Sabine

    2011-07-14

    Cortical reorganization of visual and object representations following neural injury was examined using fMRI and behavioral investigations. We probed the visual responsivity of the ventral visual cortex of an agnosic patient who was impaired at object recognition following a lesion to the right lateral fusiform gyrus. In both hemispheres, retinotopic mapping revealed typical topographic organization and visual activation of early visual cortex. However, visual responses, as well as object-related and object-selective responses, were reduced in regions immediately surrounding the lesion in the right hemisphere and also, surprisingly, in corresponding locations in the structurally intact left hemisphere. In contrast, hV4 of the right hemisphere showed expanded response properties. These findings indicate that the right lateral fusiform gyrus is critically involved in object recognition and that an impairment to this region has widespread consequences for remote parts of cortex. Finally, functional neural plasticity is possible even when a cortical lesion is sustained in adulthood. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Hypersonic Research Vehicle (HRV) real-time flight test support feasibility and requirements study. Part 2: Remote computation support for flight systems functions

    NASA Technical Reports Server (NTRS)

    Rediess, Herman A.; Hewett, M. D.

    1991-01-01

    The requirements for the use of remote computation to support HRV flight testing are assessed. First, remote computational requirements were developed to support functions that will eventually be performed onboard operational vehicles of this type. These are functions that either cannot be performed onboard in the time frame of the initial HRV flight test programs, because airborne computer technology will not be sufficiently advanced to support the required computational loads, or that it is not desirable to perform onboard in the flight test program for other reasons. Second, remote computational support either required or highly desirable for conducting the flight testing itself is addressed. The use of an Automated Flight Management System, which is described in conceptual detail, is proposed. Third, autonomous operations are discussed and, finally, unmanned operations.

  8. Wind turbine remote control using Android devices

    NASA Astrophysics Data System (ADS)

    Rat, C. L.; Panoiu, M.

    2018-01-01

    This paper describes the remote control of a wind turbine system over the internet using an Android device, namely a tablet or a smartphone. The wind turbine workstation runs a LabVIEW program which monitors the entire wind turbine energy conversion system (WECS). The Android device connects to the LabVIEW application, working as a remote interface to the wind turbine. The communication between the devices needs to be secured because it takes place over the internet. Hence, the data are encrypted before being sent through the network. The aim was the design of remote control software capable of visualizing real-time wind turbine data through a secure connection. Since the WECS is fully automated and no full-time human operator exists, unattended access to the turbine workstation is needed. Therefore the device must not require any confirmation or permission from the computer operator in order to control it. Another condition is that the Android application must not require root access.
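
    The encrypt-before-sending pattern described above can be sketched in a few lines of Python using the cryptography package; this is only an illustration of the general idea, not the authors' LabVIEW/Android implementation, and the key handling and payload fields are assumptions.

```python
# Illustrative sketch of encrypting turbine readings before they travel over
# the internet. Written in Python with the `cryptography` package; the actual
# system used LabVIEW on the workstation and an Android client, and the payload
# fields below are invented.
import json
from cryptography.fernet import Fernet  # pip install cryptography

# In practice the key would be provisioned securely to both endpoints.
key = Fernet.generate_key()
cipher = Fernet(key)

reading = {"wind_speed_mps": 11.4, "rotor_rpm": 17.2, "power_kw": 845.0}
token = cipher.encrypt(json.dumps(reading).encode())   # what travels over the network

# On the client side (simulated locally here), the same key decrypts the payload.
recovered = json.loads(cipher.decrypt(token).decode())
print(recovered["power_kw"])   # -> 845.0
```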

  9. Remote Control and Data Acquisition: A Case Study

    NASA Technical Reports Server (NTRS)

    DeGennaro, Alfred J.; Wilkinson, R. Allen

    2000-01-01

    This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the needs. This work has identified six key factors intrinsic to the development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments, a simple full-screen text-based experiment configuration and control user interface, months of continuous experiment run-times, on the order of 1% CPU load for the condensed matter physics experiment described here, very little imposition of software tool choices on remote users, and total remote control from anywhere in the world over the Internet, or from home on a 56 kb/s modem, as if the user were sitting in the laboratory. This work yielded a set of simple, robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform, simple interface.

  10. A study on haptic collaborative game in shared virtual environment

    NASA Astrophysics Data System (ADS)

    Lu, Keke; Liu, Guanyang; Liu, Lingzhi

    2013-03-01

    A study of a collaborative game in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which players located at remote sites played the game together. In contrast to traditional networked multiplayer games, the players receive both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only and visual-haptic feedback. The goal of the experiment was to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance enhancement in a collaborative game in a shared virtual environment. The outcomes of this research can have a powerful impact on networked computer games.

  11. Applications of remote sensor data to geologic and economic analysis on the Bonanza Test Site, Colorado

    NASA Technical Reports Server (NTRS)

    Reeves, R. G. (Compiler)

    1972-01-01

    Recent studies conducted in the Bonanza Test Site, Colorado, area indicated that: (1) more geologic structural information is available from remote sensing data than from conventional techniques; (2) greater accuracy results from using remote sensing data; (3) all major structural features were detected; (4) of all structural interpretations, about 75% were correct; and (5) interpretation of remote sensing data will not supplant field work, but it enables field work to be done much more efficiently.

  12. Examining a knowledge domain: Interactive visualization of the Geographic Information Science and Technology Body of Knowledge

    NASA Astrophysics Data System (ADS)

    Stowell, Marilyn Ruth

    This research compared the effectiveness and performance of interactive visualizations of the GIS&T Body of Knowledge (BoK). The visualizations were created using Processing, and display the structure and content of the Body of Knowledge using various spatial layout methods: the Indented List, Tree Graph, treemap and Similarity Graph. The first three methods utilize the existing hierarchical structure of the BoK text, while the fourth method (Similarity Graph) serves as a jumping-off point for exploring content-based visualizations of the BoK. The following questions have guided the framework of this research: (1) Which of the spatial layouts is most effective for completing tasks related to the GIS&T BoK overall? How do they compare to each other in terms of performance? (2) Is one spatial layout significantly more or less effective than others for completing a particular cognitive task? (3) Is the user able to utilize the BoK as a basemap or reference system and make inferences based on BoK scorecard overlays? (4) Which design aspects of the interface assist in carrying out the survey objectives? Which design aspects of the application detract from fulfilling the objectives? To answer these questions, human subjects were recruited to participate in a survey, during which they were assigned a random spatial layout and were asked questions about the BoK based on their interaction with the visualization tool. Seventy-five users were tested, 25 for each spatial layout. Statistical analysis revealed that there were no statistically significant differences between means for overall accuracy when comparing the three visualizations. In looking at individual questions, the Tree Graph and Indented List yielded statistically significantly higher scores for questions regarding the structure of the Body of Knowledge, as compared to the treemap. There was a significant strong positive correlation between the time taken to complete the survey and the final survey score. This correlation was particularly strong with the treemap, possibly confirming the steeper learning curve of the more complex layout. Users were asked for feedback on the perceived "ease" of using the interface, and though few users said the interface was easy to use, there was a positive correlation between perceived "ease" and overall score. Qualitative feedback revealed that the external controls on the interface were not inviting to use, and the interface overall was not intuitive. Additional human subjects were recruited from the professional GIS community to participate in testing remotely. These results were not statistically significant due to the small sample size, but helped to verify the feedback and results from the controlled testing.

  13. Remote visual detection of impacts on the lunar surface

    NASA Technical Reports Server (NTRS)

    Melosh, H. Jay; Artemjeva, N. A.; Golub, A. P.; Nemchinov, I. V.; Shuvalov, V. V.; Trubetskaya, I. A.

    1993-01-01

    We propose a novel method of remotely observing impacts on the airless Moon that may extend the present database on meteoroids down to 1 m in diameter. Meteoroids or comets of radius approximately 1-100 m are burnt away or dispersed in the atmospheres of the Earth and Venus. However, when such objects strike the Moon they deposit their energy in a small initial volume, forming a plasma plume whose visible and infrared radiation may be detectable from the Earth. We consider impacts of model SiO2 projectiles on the surface of an SiO2 model Moon.

  14. Remote sensing of the atmosphere from environmental satellites

    NASA Technical Reports Server (NTRS)

    Allison, L. J.; Wexler, R.; Laughlin, C. R.; Bandeen, W. R.

    1977-01-01

    Various applications of satellite remote sensing of the earth are reviewed, including (1) the use of meteorological satellites to obtain photographic and radiometric data for determining weather conditions; (2) determination of the earth radiation budget from measurements of reflected solar radiation and emitted long wave terrestrial radiation; (3) the use of microwave imagery for measuring ice and snow cover; (4) LANDSAT visual and near infrared observation of floods and crop growth; and (5) the use of the Nimbus 4 backscatter ultraviolet instrument to measure total ozone and vertical ozone distribution. Plans for future activities are also discussed.

  15. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration among geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration of, and interaction with, 3D models and geographical information system (GIS) data in this collaborative environment.

  16. Development of a Three Dimensional Wireless Sensor Network for Terrain-Climate Research in Remote Mountainous Environments

    NASA Astrophysics Data System (ADS)

    Kavanagh, K.; Davis, A.; Gessler, P.; Hess, H.; Holden, Z.; Link, T. E.; Newingham, B. A.; Smith, A. M.; Robinson, P.

    2011-12-01

    Developing sensor networks that are robust enough to perform in the world's remote regions is critical since these regions serve as important benchmarks compared to human-dominated areas. Paradoxically, the factors that make these remote, natural sites challenging for sensor networking are often what make them indispensable for climate change research. We aim to overcome these challenges by developing a three-dimensional sensor network arrayed across a topoclimatic gradient (1100-1800 meters) in a wilderness area in central Idaho. Development of this sensor array builds upon advances in sensing, networking, and power supply technologies coupled with experiences of the multidisciplinary investigators in conducting research in remote mountainous locations. The proposed gradient monitoring network will provide near real-time data from a three-dimensional (3-D) array of sensors measuring biophysical parameters used in ecosystem process models. The network will monitor atmospheric carbon dioxide concentration, humidity, air and soil temperature, soil water content, precipitation, incoming and outgoing shortwave and longwave radiation, snow depth, wind speed and direction, tree stem growth and leaf wetness at time intervals ranging from seconds to days. The long-term goal of this project is to realize a transformative integration of smart sensor networks adaptively communicating data in real-time to ultimately achieve a 3-D visualization of ecosystem processes within remote mountainous regions. Process models will be the interface between the visualization platforms and the sensor network. This will allow us to better predict how non-human dominated terrestrial and aquatic ecosystems function and respond to climate dynamics. Access to the data will be ensured as part of the Northwest Knowledge Network being developed at the University of Idaho, through ongoing Idaho NSF-funded cyber infrastructure initiatives, and existing data management systems funded by NSF, such as the CUAHSI Hydrologic Information System (HIS). These efforts will enhance cross-disciplinary understanding of natural and anthropogenic influences on ecosystem function and ultimately inform decision-making.

  17. Virtual interactive presence and augmented reality (VIPAR) for remote surgical assistance.

    PubMed

    Shenai, Mahesh B; Dillavou, Marcus; Shum, Corey; Ross, Douglas; Tubbs, Richard S; Shih, Alan; Guthrie, Barton L

    2011-03-01

    Surgery is a highly technical field that combines continuous decision-making with the coordination of spatiovisual tasks. We designed a virtual interactive presence and augmented reality (VIPAR) platform that allows a remote surgeon to deliver real-time virtual assistance to a local surgeon, over a standard Internet connection. The VIPAR system consisted of a "local" and a "remote" station, each situated over a surgical field and a blue screen, respectively. Each station was equipped with a digital viewpiece, composed of 2 cameras for stereoscopic capture, and a high-definition viewer displaying a virtual field. The virtual field was created by digitally compositing selected elements within the remote field into the local field. The viewpieces were controlled by workstations mutually connected by the Internet, allowing virtual remote interaction in real time. Digital renderings derived from volumetric MRI were added to the virtual field to augment the surgeon's reality. For demonstration, a fixed-formalin cadaver head and neck were obtained, and a carotid endarterectomy (CEA) and pterional craniotomy were performed under the VIPAR system. The VIPAR system allowed for real-time, virtual interaction between a local (resident) and remote (attending) surgeon. In both carotid and pterional dissections, major anatomic structures were visualized and identified. Virtual interaction permitted remote instruction for the local surgeon, and MRI augmentation provided spatial guidance to both surgeons. Camera resolution, color contrast, time lag, and depth perception were identified as technical issues requiring further optimization. Virtual interactive presence and augmented reality provide a novel platform for remote surgical assistance, with multiple applications in surgical training and remote expert assistance.
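
    The digital compositing step described above can be pictured with a crude chroma-key sketch in Python/NumPy: pixels of the remote frame that are not blue-screen background replace the corresponding pixels of the local frame. The frame contents, sizes and threshold below are made up for illustration; the real system composites stereoscopic HD video in real time.

```python
# Crude chroma-key compositing sketch: remote foreground over the local field.
# All values here are synthetic stand-ins for the actual video frames.
import numpy as np

h, w = 480, 640
local = np.random.randint(0, 255, (h, w, 3), dtype=np.uint8)   # local surgical field
remote = np.zeros((h, w, 3), dtype=np.uint8)
remote[..., 2] = 255                           # mostly blue-screen background
remote[200:280, 300:380] = (200, 180, 160)     # the remote surgeon's "hand"

# A pixel counts as background if blue dominates red and green strongly.
background = (remote[..., 2].astype(int) - remote[..., :2].max(axis=-1)) > 100
composite = np.where(background[..., None], local, remote)
print(composite.shape)   # (480, 640, 3)
```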

  18. 46 CFR 154.1340 - Temperature measuring devices.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) Each device must actuate an audible and visual alarm at the cargo control station and a remote group... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... cargo control station. (c) Except for independent tanks type C, each cargo containment system for a...

  19. 46 CFR 154.1340 - Temperature measuring devices.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Each device must actuate an audible and visual alarm at the cargo control station and a remote group... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... cargo control station. (c) Except for independent tanks type C, each cargo containment system for a...

  20. 46 CFR 154.1340 - Temperature measuring devices.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Each device must actuate an audible and visual alarm at the cargo control station and a remote group... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... cargo control station. (c) Except for independent tanks type C, each cargo containment system for a...

  1. 46 CFR 154.1340 - Temperature measuring devices.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Each device must actuate an audible and visual alarm at the cargo control station and a remote group... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... cargo control station. (c) Except for independent tanks type C, each cargo containment system for a...

  2. 46 CFR 154.1340 - Temperature measuring devices.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Each device must actuate an audible and visual alarm at the cargo control station and a remote group... STANDARDS FOR SELF-PROPELLED VESSELS CARRYING BULK LIQUEFIED GASES Design, Construction and Equipment... cargo control station. (c) Except for independent tanks type C, each cargo containment system for a...

  3. Evaluation of force-torque displays for use with space station telerobotic activities

    NASA Technical Reports Server (NTRS)

    Hendrich, Robert C.; Bierschwale, John M.; Manahan, Meera K.; Stuart, Mark A.; Legendre, A. Jay

    1992-01-01

    Recent experiments which addressed Space Station remote manipulation tasks found that tactile force feedback (reflecting forces and torques encountered at the end-effector through the manipulator hand controller) does not improve performance significantly. Subjective responses from astronaut and non-astronaut test subjects indicated that force information, provided visually, could be useful. No research exists which specifically investigates methods of presenting force-torque information visually. This experiment was designed to evaluate seven different visual force-torque displays which were found in an informal telephone survey. The displays were prototyped in the HyperCard programming environment. In a within-subjects experiment, 14 subjects nullified forces and torques presented statically, using response buttons located at the bottom of the screen. Dependent measures included questionnaire data, errors, and response time. Subjective data generally demonstrate that subjects rated variations of pseudo-perspective displays consistently better than bar graph and digital displays. Subjects commented that the bar graph and digital displays could be used, but were not compatible with using hand controllers. Quantitative data show similar trends to the subjective data, except that the bar graph and digital displays both provided good performance, perhaps due to the mapping of response buttons to display elements. Results indicate that, for this set of displays, the pseudo-perspective displays generally represent a more intuitive format for presenting force-torque information.

  4. Development of a Smart Mobile Data Module for Fetal Monitoring in E-Healthcare.

    PubMed

    Houzé de l'Aulnoit, Agathe; Boudet, Samuel; Génin, Michaël; Gautier, Pierre-François; Schiro, Jessica; Houzé de l'Aulnoit, Denis; Beuscart, Régis

    2018-03-23

    The fetal heart rate (FHR) is a marker of fetal well-being in utero (when monitoring maternal and/or fetal pathologies) and during labor. Here, we developed a smart mobile data module for the remote acquisition and transmission (via a Wi-Fi or 4G connection) of FHR recordings, together with a web-based viewer for displaying the FHR datasets on a computer, smartphone or tablet. In order to define the features required by users, we modelled the fetal monitoring procedure (in home and hospital settings) via semi-structured interviews with midwives and obstetricians. Using this information, we developed a mobile data transfer module based on a Raspberry Pi. When connected to a standalone fetal monitor, the module acquires the FHR signal and sends it (via a Wi-Fi or a 3G/4G mobile internet connection) to a secure server within our hospital information system. The archived, digitized signal data are linked to the patient's electronic medical records. An HTML5/JavaScript web viewer converts the digitized FHR data into easily readable and interpretable graphs for viewing on a computer (running Windows, Linux or MacOS) or a mobile device (running Android, iOS or Windows Phone OS). The data can be viewed in real time or offline. The application includes the tools required for correct interpretation of the data (signal loss calculation, scale adjustment, and precise measurements of the signal's characteristics). We performed a proof-of-concept case study of the transmission, reception and visualization of FHR data for a pregnant woman at 30 weeks of amenorrhea. She was hospitalized in the pregnancy assessment unit and FHR data were acquired three times a day with a Philips Avalon® FM30 fetal monitor. The prototype (Raspberry Pi) was connected to the fetal monitor's RS232 port. The emission and reception of prerecorded signals were tested; the web server correctly received the signals, and the FHR recording was visualized in real time on a computer, a tablet and smartphones (running Android and iOS) via the web viewer. This process did not perturb the hospital's computer network. There was no data delay or loss during a 60-min test. The web viewer was tested successfully in the various usage situations. The system was as user-friendly as expected, and enabled rapid, secure archiving. We have developed a system for the acquisition, transmission, recording and visualization of FHR data. Healthcare professionals can view the FHR data remotely on their computer, tablet or smartphone. Integration of FHR data into a hospital information system enables optimal, secure, long-term data archiving.
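
    The acquire-and-forward step performed by the Raspberry Pi module can be sketched in Python as below. The serial device name, baud rate, frame format and upload URL are assumptions for illustration only; the actual module speaks the fetal monitor's RS232 protocol and posts to the hospital's secure server.

```python
# Minimal sketch of the acquire-and-forward pattern (serial port in, HTTPS out).
# Port name, baud rate, frame format and URL are all assumptions.
import requests    # pip install requests
import serial      # pip install pyserial

PORT = "/dev/ttyUSB0"                          # hypothetical serial device
URL = "https://hospital.example/fhr/upload"    # hypothetical secure endpoint

with serial.Serial(PORT, baudrate=9600, timeout=5) as link:
    line = link.readline().decode(errors="ignore").strip()  # one FHR sample frame
    if line:
        # Forward the raw frame; a real client would batch, authenticate and retry.
        requests.post(URL, json={"frame": line}, timeout=10)
```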

  5. Community-based clinical education increases motivation of medical students to medicine of remote area: comparison between lecture and practice.

    PubMed

    Tani, Kenji; Yamaguchi, Harutaka; Tada, Saaya; Kondo, Saki; Tabata, Ryo; Yuasa, Shino; Kawaminami, Shingo; Nakanishi, Yoshinori; Ito, Jun; Shimizu, Nobuhiko; Obata, Fumiaki; Shin, Teruki; Bando, Hiroyasu; Kohno, Mitsuhiro

    2014-01-01

    In this study, we administered a questionnaire to medical students to evaluate the effect of community-based clinical education on their attitudes toward community medicine and medicine in remote areas. Questionnaires were given 4 times to all students from the first to the sixth year. Of 95 students, the 65 (68.4%) who completed all questionnaires were included in this study. The intensity of the students' attitudes was estimated using a visual analogue scale. The intensity of interest, sense of fulfillment and passion for medicine in remote areas was significantly increased after the community-based practice. On the other hand, the level of understanding of medicine in remote areas was increased by the lecture, not by the practice. The intensity of the desire both to become a generalist and to become a specialist increased significantly as the grade went up. Most sixth-year students desired to have the abilities of a generalist and a specialist simultaneously. This study shows that community-based practice is more meaningful than lectures in increasing motivation for medicine in remote areas, and suggests that it is important to prepare more courses in which students experience community medicine in order to increase the number of physicians who desire to work in remote areas.

  6. Visual acuity testing in diabetic subjects: the decimal progression chart versus the Freiburg visual acuity test.

    PubMed

    Loumann Knudsen, Lars

    2003-08-01

    To study reproducibility and biological variation of visual acuity in diabetic maculopathy, using two different visual acuity tests, the decimal progression chart and the Freiburg visual acuity test. Twenty-two eyes in 11 diabetic subjects were examined several times within a 12-month period using both visual acuity tests. The most commonly used visual acuity test in Denmark (the decimal progression chart) was compared to the Freiburg visual acuity test (automated testing) in a paired study. Correlation analysis revealed agreement between the two methods (r² = 0.79; slope = 0.82; y-axis intercept = 0.01). The mean visual acuity was found to be 15% higher (P < 0.0001) with the decimal progression chart than with the Freiburg visual acuity test. The reproducibility was the same in both tests (coefficient of variation: 12% for each test); however, the variation within the 12-month examination period differed significantly: the coefficient of variation was 17% using the decimal progression chart and 35% with the Freiburg visual acuity test. The reproducibility of the two visual acuity tests is comparable under optimal testing conditions in diabetic subjects with macular oedema. However, it appears that the Freiburg visual acuity test is significantly better for detection of biological variation.
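
    The paired statistics reported above (r², slope, intercept, coefficient of variation) are straightforward to reproduce for any two acuity series; the Python sketch below uses made-up values, not the study's data.

```python
# Paired comparison of two made-up acuity series: regression slope/intercept,
# r-squared, and per-test coefficient of variation. Values are illustrative only.
import numpy as np

decimal_chart = np.array([0.50, 0.60, 0.40, 0.80, 0.70, 0.30])
freiburg      = np.array([0.45, 0.55, 0.35, 0.70, 0.60, 0.28])

slope, intercept = np.polyfit(freiburg, decimal_chart, 1)
r2 = np.corrcoef(freiburg, decimal_chart)[0, 1] ** 2


def cv(x):
    """Coefficient of variation, in percent."""
    return 100 * x.std(ddof=1) / x.mean()


print(f"slope={slope:.2f} intercept={intercept:.2f} r2={r2:.2f}")
print(f"CV decimal chart={cv(decimal_chart):.1f}%  CV Freiburg={cv(freiburg):.1f}%")
```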

  7. A Novel Unsupervised Segmentation Quality Evaluation Method for Remote Sensing Images

    PubMed Central

    Tang, Yunwei; Jing, Linhai; Ding, Haifeng

    2017-01-01

    The segmentation of a high spatial resolution remote sensing image is a critical step in geographic object-based image analysis (GEOBIA). Evaluating the performance of segmentation without ground truth data, i.e., unsupervised evaluation, is important for the comparison of segmentation algorithms and the automatic selection of optimal parameters. This unsupervised strategy currently faces several challenges in practice, such as difficulties in designing effective indicators and limitations of the spectral values in the feature representation. This study proposes a novel unsupervised evaluation method to quantitatively measure the quality of segmentation results to overcome these problems. In this method, multiple spectral and spatial features of images are first extracted simultaneously and then integrated into a feature set to improve the quality of the feature representation of ground objects. The indicators designed for spatial stratified heterogeneity and spatial autocorrelation are included to estimate the properties of the segments in this integrated feature set. These two indicators are then combined into a global assessment metric as the final quality score. The trade-offs of the combined indicators are accounted for using a strategy based on the Mahalanobis distance, which can be exhibited geometrically. The method is tested on two segmentation algorithms and three testing images. The proposed method is compared with two existing unsupervised methods and a supervised method to confirm its capabilities. Through comparison and visual analysis, the results verified the effectiveness of the proposed method and demonstrated the reliability and improvements of this method with respect to other methods. PMID:29064416
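
    The combination step, in which two indicators are merged into one quality score through a Mahalanobis distance, can be sketched as follows; the indicator values here are random stand-ins, and the "distance to an ideal point" convention is an assumption for illustration rather than the authors' exact metric.

```python
# Hedged sketch of combining two per-segmentation indicators (stand-ins for
# spatial stratified heterogeneity and spatial autocorrelation) into one score
# via Mahalanobis distance from an ideal point. Illustration only.
import numpy as np

rng = np.random.default_rng(0)
indicators = rng.random((50, 2))   # 50 candidate segmentations x 2 indicators

cov_inv = np.linalg.inv(np.cov(indicators, rowvar=False))
ideal = indicators.max(axis=0)     # assume "larger is better" for both indicators


def mahalanobis_score(x):
    d = x - ideal
    return float(np.sqrt(d @ cov_inv @ d))   # smaller distance = closer to ideal


scores = np.array([mahalanobis_score(x) for x in indicators])
best = int(scores.argmin())
print(f"best candidate: {best}, score {scores[best]:.3f}")
```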

  8. Compact probing system using remote imaging for industrial plant maintenance

    NASA Astrophysics Data System (ADS)

    Ito, F.; Nishimura, A.

    2014-03-01

    Laser-induced breakdown spectroscopy (LIBS) and endoscope observation were combined to design a remote probing device. We use this probing device to inspect cracks in the inner wall of a heat exchanger. Crack inspection requires speed first, and then accuracy. Once eddy current testing (ECT) finds a crack with a certain signal level, another method should confirm it visually. We propose magnetic particle testing (MT) using specially fabricated Magnetic Particle Micro Capsules (MPMC). For LIBS, a multichannel spectrometer and a Q-switched YAG laser were used. The irradiation spot was 270 μm and the pulse energy was 2 mJ, corresponding to 2.2-5 MW/cm2. A composite-type optical fiber was used to deliver both the laser energy and the optical image. Samples were prepared by heating a zirconium alloy plate with underwater arc welding in order to simulate severe accidents at nuclear power plants. A black oxide layer covered the weld surface and white particles floated on the water surface. The laser-induced breakdown plasma emission was collected into the spectrometer using this optical fiber combined with telescopic optics. As a result, we were able to perform spectroscopic measurement and observation simultaneously. For MT, the MPMC that gather in the defective area are observed with the same fiber. The MPMC emit light under illumination with UV light from the optical fiber, and the size of a defect is estimated from the amount of emission. Such technology will be useful for the inspection and repair of reactor piping.

  9. Simple, almost anywhere, with almost anyone: remote low-cost telementored resuscitative lung ultrasound.

    PubMed

    McBeth, Paul B; Crawford, Innes; Blaivas, Michael; Hamilton, Trevor; Musselwhite, Kimberly; Panebianco, Nova; Melniker, Lawrence; Ball, Chad G; Gargani, Luna; Gherdovich, Carlotta; Kirkpatrick, Andrew W

    2011-12-01

    Apnea (APN) and pneumothorax (PTX) are common immediately life-threatening conditions. Ultrasound is a portable tool that captures anatomy and physiology as digital information, allowing it to be readily transferred by electronic means. Both APN and PTX are simply ruled out by visualizing respiratory motion at the visceral-parietal pleural interface, known as lung sliding (LS), corroborated by either the M-mode or color-power Doppler depiction of LS. We thus assessed how economically and practically this information could be obtained remotely over a cellular network. Ultrasound images were obtained on handheld ultrasound machines and streamed to a standard free internet service (Skype) using an iPhone. Remote expert sonographers directed remote providers (with variable to no ultrasound experience) to obtain images by viewing the transmitted ultrasound signal and by viewing the remote examiner over a head-mounted webcam. Examinations were conducted between a series of remote sites and a base station. Remote sites included two remote on-mountain sites, a small airplane in flight, and a Calgary household, with base sites located in Pisa, Rome, Philadelphia, and Calgary. In all lung fields (20/20) on all occasions, LS could easily and quickly be seen. LS was easily corroborated and documented through capture of color-power Doppler and M-mode images. Other ultrasound applications such as the Focused Assessment with Sonography for Trauma examination, vascular anatomy, and a fetal wellness assessment were also demonstrated. The emergent exclusion of APN-PTX can be immediately accomplished by a remote expert economically linked to almost any responder over cellular networks. Further work should explore the range of other physiologic functions and anatomy that could be remotely assessed in this way.

  10. Novel application of a Wii remote to measure spasticity with the pendulum test: Proof of concept

    PubMed Central

    Yeh, Chien-Hung; Hung, Chi-Yao; Wang, Yung-Hung; Hsu, Wei-Tai; Chang, Yi-Chung; Yeh, Jia-Rong; Lee, Po-Lei; Hu, Kun; Kang, Jiunn-Horng; Lo, Men-Tzung

    2016-01-01

    Background The pendulum test is a standard clinical test for quantifying the severity of spasticity. In the test, an electrogoniometer is typically used to measure the knee angular motion. The device is costly and difficult to set up, such that the pendulum test is normally time consuming. Objective The goal of this study is to determine whether a Nintendo Wii remote can replace the electrogoniometer for reliable assessment of the angular motion of the knee in the pendulum test. Methods The pendulum test was performed in three control participants and 13 hemiplegic stroke patients using both a Wii remote and an electrogoniometer. The correlation coefficient and the Bland-Altman difference plot were used to compare the results obtained from the two devices. The Wilcoxon signed-rank test was used to compare the difference between the hemiplegia-affected and nonaffected sides in the hemiplegic stroke patients. Results There was a fair to strong correlation between measurements from the Wii remote and the electrogoniometer (0.513 < R² < 0.800). Small but consistent differences between the Wii remote and the electrogoniometer were identified from the Bland-Altman difference plot. Within the hemiplegic stroke patients, both devices successfully distinguished the hemiplegia-affected (spastic) side from the nonaffected (nonspastic) side (both with p < .0001). In addition, the intraclass correlation coefficient, standard error of measurement, and minimum detectable differences were highly consistent for both devices. Conclusion Our findings suggest that the Wii remote may serve as a convenient and cost-efficient tool for the assessment of spasticity. PMID:26669955
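
    For readers unfamiliar with the Bland-Altman approach used above, the short Python sketch below computes the bias and 95% limits of agreement for two synthetic series of knee-angle amplitudes; the values are made up, not the study's measurements.

```python
# Bland-Altman comparison of two synthetic knee-angle series (degrees, made up).
import numpy as np

wii   = np.array([62.1, 55.3, 70.8, 48.9, 66.4, 59.7])
gonio = np.array([60.5, 54.0, 69.1, 47.2, 65.0, 58.1])

diff = wii - gonio
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits of agreement

print(f"bias = {bias:.2f} deg")
print(f"95% limits of agreement: {bias - loa:.2f} to {bias + loa:.2f} deg")
```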

  11. Novel application of a Wii remote to measure spasticity with the pendulum test: Proof of concept.

    PubMed

    Yeh, Chien-Hung; Hung, Chi-Yao; Wang, Yung-Hung; Hsu, Wei-Tai; Chang, Yi-Chung; Yeh, Jia-Rong; Lee, Po-Lei; Hu, Kun; Kang, Jiunn-Horng; Lo, Men-Tzung

    2016-01-01

    The pendulum test is a standard clinical test for quantifying the severity of spasticity. In the test, an electrogoniometer is typically used to measure the knee angular motion. The device is costly and difficult to set up, such that the pendulum test is normally time consuming. The goal of this study is to determine whether a Nintendo Wii remote can replace the electrogoniometer for reliable assessment of the angular motion of the knee in the pendulum test. The pendulum test was performed in three control participants and 13 hemiplegic stroke patients using both a Wii remote and an electrogoniometer. The correlation coefficient and the Bland-Altman difference plot were used to compare the results obtained from the two devices. The Wilcoxon signed-rank test was used to compare the difference between the hemiplegia-affected and nonaffected sides in the hemiplegic stroke patients. There was a fair to strong correlation between measurements from the Wii remote and the electrogoniometer (0.513 < R² < 0.800).

  12. Remote FLS testing in the real world: ready for "prime time".

    PubMed

    Okrainec, Allan; Vassiliou, Melina; Jimenez, M Carolina; Henao, Oscar; Kaneva, Pepa; Matt Ritter, E

    2016-07-01

    Maintaining the existing FLS test centers requires considerable investment in human and financial resources. It can also be particularly challenging for those outside of North America to become certified, due to the limited number of international test centers. Preliminary work suggests that it is possible to reliably score the FLS manual skills component remotely using low-cost videoconferencing technology. Significant work remains to ensure that testing procedures adhere to standards defined by SAGES for this approach to be considered equivalent to standard on-site testing. The aims were to validate the integrity and validity of the FLS manual skills examination administered remotely in a real-world environment according to FLS testing protocols, and to evaluate participants' experience with the setting. Individuals with various levels of training from the University of Toronto completed a pre- and a post-test questionnaire. Participants presented to one of the two FLS testing rooms available for the study, each connected via Skype to a separate room with an FLS proctor who administered and scored the test remotely (RP). An on-site proctor (OP) was present in the room as a control. An invigilator was also present in the testing room to follow directions from the RP and ensure the integrity of the test materials. Twenty-one participants were recruited, and 20 completed the test. There was no significant difference between the scores given by the RP and the OP. Interrater reliability between the RP and OP was excellent. One critical error was missed by the RP, but this would not have affected the test outcome. Participants reported being highly satisfied. We demonstrate that proctors located remotely can administer the FLS skills test in a secure and reliable fashion, with excellent interrater reliability compared to an on-site proctor. Remote proctoring of the FLS examination could become a strategy to increase certification rates while containing costs.

  13. Skynet Junior Scholars- Sharing the Universe with Blind/Low Vision Youth

    NASA Astrophysics Data System (ADS)

    Meredith, Kate K.; Hoette, Vivian; Kron, Richard; Heatherly, Sue Ann; Williamson, Kathryn; Gurton, Suzanne; Haislip, Josh; Reichart, Dan

    2015-08-01

    Skynet Junior Scholars, a project funded by the National Science Foundation, aims to engage middle school youth, including youth with visual and hearing impairments, in investigating the universe with the same tools professionals use. Project deliverables include: 1) online access to optical and radio telescopes, data analysis tools, and professional astronomers, 2) an age-appropriate web-based interface for controlling remote telescopes, and 3) inquiry-based, standards-aligned instructional modules. From an accessibility perspective, the goal of the Skynet Junior Scholars project is to facilitate independent access to the project deliverables to the greatest extent possible given existing accessibility technologies. In this poster we describe our experience in field-testing SJS activities with 29 blind/low vision youth attending a Lions Club summer camp. From our observations and preliminary results from pre- and post-surveys and interviews, we learned that rather than creating a new interest in STEM, we were instead nourishing a pre-existing interest by giving students their first direct experience in observational astronomy. Additional accessibility features have been added to the SJS program since the initial pilot testing. Full testing is scheduled for July 2015.

  14. Development and Implementation of a New Telerehabilitation System for Audiovisual Stimulation Training in Hemianopia

    PubMed Central

    Tinelli, Francesca; Cioni, Giovanni; Purpura, Giulia

    2017-01-01

    Telerehabilitation, defined as the method by which communication technologies are used to provide remote rehabilitation, although still underused, could be as efficient and effective as conventional clinical rehabilitation practices. In the literature, there are descriptions of the use of telerehabilitation in adult patients with various diseases, whereas it is seldom used in clinical practice with child and adolescent patients. We have developed a new audiovisual telerehabilitation (AVT) system, based on the multisensory capabilities of the human brain, to provide a new tool for adults and children with visual field defects in order to improve ocular movements toward the blind hemifield. The apparatus consists of a semicircular structure in which visual and acoustic stimuli are positioned. A camera is integrated into the mechanical structure in the center of the panel to control eye and head movements. Patients can use this training system with customized software on a tablet. From the hospital, the therapist has complete control over the training process, and the results of the training sessions are automatically available within a few minutes on the hospital website. In this paper, we report the AVT system protocol and preliminary results on its use by three adult patients. All three showed improvements in visual detection abilities, with long-term effects. In the future, we will test this apparatus with children and their families. Since interventions for impairments in the visual field have a substantial cost for individuals and for the welfare system, we expect that our research could have a profound socio-economic impact by avoiding prolonged and intensive hospital stays. PMID:29209271

  15. Implementing Recommendations From Web Accessibility Guidelines: A Comparative Study of Nondisabled Users and Users With Visual Impairments.

    PubMed

    Schmutz, Sven; Sonderegger, Andreas; Sauer, Juergen

    2017-09-01

    The present study examined whether implementing recommendations of Web accessibility guidelines would have different effects on nondisabled users than on users with visual impairments. The predominant approach for making Web sites accessible for users with disabilities is to apply accessibility guidelines. However, it has hardly been examined whether this approach has side effects for nondisabled users. A comparison of the effects on both user groups would contribute to a better understanding of possible advantages and drawbacks of applying accessibility guidelines. Participants from two matched samples, comprising 55 participants with visual impairments and 55 without impairments, took part in a synchronous remote testing of a Web site. Each participant was randomly assigned to one of three Web sites, which differed in the level of accessibility (very low, low, and high) according to recommendations of the well-established Web Content Accessibility Guidelines 2.0 (WCAG 2.0). Performance (i.e., task completion rate and task completion time) and a range of subjective variables (i.e., perceived usability, positive affect, negative affect, perceived aesthetics, perceived workload, and user experience) were measured. Higher conformance to Web accessibility guidelines resulted in increased performance and more positive user ratings (e.g., perceived usability or aesthetics) for both user groups. There was no interaction between user group and accessibility level. Higher conformance to WCAG 2.0 may result in benefits for nondisabled users and users with visual impairments alike. Practitioners may use the present findings as a basis for deciding whether and how best to implement accessibility.

  16. When the display matters: A multifaceted perspective on 3D geovisualizations

    NASA Astrophysics Data System (ADS)

    Juřík, Vojtěch; Herman, Lukáš; Šašinka, Čeněk; Stachoň, Zdeněk; Chmelík, Jiří

    2017-04-01

    This study explores the influence of stereoscopic (real) 3D and monoscopic (pseudo) 3D visualization on the human ability to judge altitude information in non-interactive and interactive 3D geovisualizations. A two-phase experiment was carried out to compare the performance of two groups of participants, one using the real 3D and the other the pseudo 3D visualization of geographical data. A homogeneous group of 61 psychology students, inexperienced in the processing of geographical data, was tested with respect to their efficiency at identifying altitudes of the displayed landscape. The first phase of the experiment was designed as non-interactive, with static 3D visual displays presented; the second phase was designed as interactive, and the participants were allowed to explore the scene by adjusting the position of the virtual camera. The investigated variables included accuracy at altitude identification, time demands, and the amount of the participant's motor activity performed during interaction with the geovisualization. The interface was created using a Motion Capture system, a Wii Remote Controller, widescreen projection, and the passive Dolby 3D technology (for real 3D vision). The real 3D visual display was shown to significantly increase the accuracy of landscape altitude identification in non-interactive tasks. As expected, in the interactive phase the differences in accuracy between the groups flattened out owing to the possibility of interaction, with no other statistically significant differences in completion times or motor activity. The increased number of omitted objects in the real 3D condition was further subjected to an exploratory analysis.

  17. Remote Excavation System technology evaluation report: Buried Waste Robotics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-09-01

    This document describes the results from the Remote Excavation System demonstration and testing conducted at the Idaho National Engineering Laboratory during June and July 1993. The purpose of the demonstration was to ascertain the feasibility of the system for skimming soil and removing various types of buried waste in a safe manner and within all regulatory requirements, and to compare the performances of manual and remote operation of a backhoe. The procedures and goals of the demonstration were previously defined in The Remote Excavation System Test Plan, which served as a guideline for evaluating the various components of the system and discussed the procedures used to conduct the tests.

  18. Scientific Visualization Tools for Enhancement of Undergraduate Research

    NASA Astrophysics Data System (ADS)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Typically, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multidimensional data sets, appropriate scientific visualization tools are imperative in allowing students to have a meaningful and pleasant research experience while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions, where the researcher can interactively change the scales in the three dimensions, the color tables, and the degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable four-dimensional interactive environment. These tools allow students to make higher order decisions based on large multidimensional sets of data while diminishing the level of frustration that results from dealing with the details of processing large data sets.
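    The kind of data-reduction step such tools automate can be illustrated with a short, hypothetical Python sketch (synthetic data; not EzSAGE or SAGE4D themselves): select a latitude-longitude slice of a four-dimensional atmospheric array at one time step and altitude level and plot it.

```python
# Minimal illustrative sketch: slice a 4D atmospheric array
# (time x altitude x latitude x longitude) down to a 2D view.
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical aerosol-extinction data: 12 months, 40 altitude levels,
# 36 latitude bands, 72 longitude bands.
rng = np.random.default_rng(0)
extinction = rng.lognormal(mean=-6.0, sigma=0.5, size=(12, 40, 36, 72))

month = 6       # pick one time step
alt_level = 20  # pick one altitude level (index 0-39)

# "Atmospheric slice": latitude x longitude field at a fixed time and altitude.
field = extinction[month, alt_level, :, :]

plt.imshow(field, origin="lower", extent=[-180, 180, -90, 90], aspect="auto")
plt.colorbar(label="aerosol extinction (arbitrary units)")
plt.xlabel("longitude (deg)")
plt.ylabel("latitude (deg)")
plt.title("Example slice at one month and altitude level")
plt.show()
```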

  19. Head-coupled remote stereoscopic camera system for telepresence applications

    NASA Astrophysics Data System (ADS)

    Bolas, Mark T.; Fisher, Scott S.

    1990-09-01

    The Virtual Environment Workstation Project (VIEW) at NASA's Ames Research Center has developed a remotely controlled stereoscopic camera system that can be used for telepresence research and as a tool to develop and evaluate configurations for head-coupled visual systems associated with space station telerobots and remote manipulation robotic arms. The prototype camera system consists of two lightweight CCD video cameras mounted on a computer controlled platform that provides real-time pan, tilt, and roll control of the camera system in coordination with head position transmitted from the user. This paper provides an overall system description focused on the design and implementation of the camera and platform hardware configuration and the development of control software. Results of preliminary performance evaluations are reported with emphasis on engineering and mechanical design issues and discussion of related psychophysiological effects and objectives.
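    A minimal Python sketch of the head-to-camera coupling idea, under assumed gains and limits (it is not the VIEW project's control software), maps tracked head angles to pan/tilt/roll commands with a per-cycle slew-rate limit:

```python
# Illustrative sketch only: map tracked head orientation (yaw, pitch, roll in
# degrees) to pan/tilt/roll commands for a camera platform, with a simple
# slew-rate limit per control cycle. All values are assumptions.
from dataclasses import dataclass

@dataclass
class PlatformState:
    pan: float = 0.0
    tilt: float = 0.0
    roll: float = 0.0

def limit(cmd: float, current: float, max_step: float) -> float:
    """Move toward cmd, but by no more than max_step per cycle."""
    delta = max(-max_step, min(max_step, cmd - current))
    return current + delta

def update_platform(state: PlatformState, head_yaw, head_pitch, head_roll,
                    max_step_deg=3.0) -> PlatformState:
    # Head yaw -> camera pan, head pitch -> tilt, head roll -> roll.
    return PlatformState(
        pan=limit(head_yaw, state.pan, max_step_deg),
        tilt=limit(head_pitch, state.tilt, max_step_deg),
        roll=limit(head_roll, state.roll, max_step_deg),
    )

# One simulated control cycle: the head has turned 10 deg right and 2 deg up.
state = PlatformState()
state = update_platform(state, head_yaw=10.0, head_pitch=2.0, head_roll=0.0)
print(state)  # pan limited to 3.0 deg this cycle, tilt 2.0, roll 0.0
```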

  20. An evaluation of remote sensing technologies for the detection of fugitive contamination at selected Superfund hazardous waste sites in Pennsylvania

    USGS Publications Warehouse

    Slonecker, E. Terrence; Fisher, Gary B.

    2014-01-01

    This evaluation was conducted to assess the potential for using both traditional remote sensing, such as aerial imagery, and emerging remote sensing technology, such as hyperspectral imaging, as tools for postclosure monitoring of selected hazardous waste sites. Sixteen deleted Superfund (SF) National Priorities List (NPL) sites in Pennsylvania were imaged with a Civil Air Patrol (CAP) Airborne Real-Time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) sensor between 2009 and 2012. Deleted sites are those sites that have been remediated and removed from the NPL. The imagery was processed to radiance and atmospherically corrected to relative reflectance with standard software routines using the Environment for Visualizing Imagery (ENVI, ITT–VIS, Boulder, Colorado) software. Standard routines for anomaly detection, endmember collection, vegetation stress, and spectral analysis were applied.

  1. Remote infrared signage evaluation for transit stations and intersections.

    PubMed

    Crandall, W; Brabyn, J; Bentzen, B L; Myers, L

    1999-10-01

    Opportunities for education and employment depend upon effective and independent travel. For mainstream society, this is accomplished to a large extent by printed signs. People who are print disabled, visually impaired, or totally blind are at a disadvantage because they do not have access to signage. Remote infrared signage, such as the Talking Signs (TS) system, provides a solution to this need by labeling the environment for distant viewing. The system uses a transmitting "sign" and a hand-held receiver to tell people about their surroundings. In a seamless infrared signage environment, a visually impaired traveler could: walk safely across an intersection to an ATM or fare machine, from fare machine to bus stop, from bus stop to bus; from bus to building, from building to elevator, from elevator to office, from office to restroom, and so forth. This paper focuses on two problems that are among the most challenging and dangerous faced by blind travelers: negotiating complex transit stations and controlled intersections. We report on human factors studies of TS in these critical tasks, examining such issues as how much training is needed to use the system, its impact on performance and safety, benefits for different population subgroups and user opinions of its value. Results indicate that blind people can quickly and easily learn to use remote infrared signage effectively, and that its use improves travel safety, efficiency, and independence.

  2. The D3 Middleware Architecture

    NASA Technical Reports Server (NTRS)

    Walton, Joan; Filman, Robert E.; Korsmeyer, David J.; Lee, Diana D.; Mak, Ron; Patel, Tarang

    2002-01-01

    DARWIN is a NASA-developed, Internet-based system for enabling aerospace researchers to securely and remotely access and collaborate on the analysis of aerospace vehicle design data, primarily the results of wind-tunnel testing and numeric (e.g., computational fluid dynamics) model executions. DARWIN captures, stores, and indexes data; manages derived knowledge (such as visualizations across multiple datasets); and provides an environment for designers to collaborate in the analysis of test results. DARWIN is an interesting application because it supports high volumes of data, integrates multiple modalities of data display (e.g., images and data visualizations), and provides non-trivial access control mechanisms. DARWIN enables collaboration by allowing users to share not only visualizations of data but also commentary about, and views of, the data. Here we provide an overview of the architecture of D3, the third generation of DARWIN. Earlier versions of DARWIN were characterized by browser-based interfaces and a hodge-podge of server technologies: CGI scripts, applets, PERL, and so forth. But browsers proved difficult to control, and a proliferation of computational mechanisms proved inefficient and difficult to maintain. D3 substitutes a pure-Java approach for that medley: a Java client communicates (through RMI over HTTPS) with a Java-based application server. Code on the server accesses information from JDBC databases, distributed LDAP security services, and a collaborative information system. D3 is a three-tier architecture, but unlike 'E-commerce' applications, the data usage pattern suggests different strategies than traditional Enterprise Java Beans: we need to move volumes of related data together, considerable processing happens on the client, and the 'business logic' on the server side is primarily data integration and collaboration. With D3, we are extending DARWIN to handle other data domains and to be a distributed system, where a single login allows a user transparent access to test results from multiple servers and authority domains.

  3. A Modified Hopfield Neural Network Algorithm (MHNNA) Using ALOS Image for Water Quality Mapping

    PubMed Central

    Kzar, Ahmed Asal; Mat Jafri, Mohd Zubir; Mutter, Kussay N.; Syahreza, Saumi

    2015-01-01

    Reducing water pollution is a major challenge in coastal waters, and the health of coastal ecosystems can be affected by high concentrations of suspended sediment. In this work, a Modified Hopfield Neural Network Algorithm (MHNNA) was used with remote sensing imagery to classify total suspended solids (TSS) concentrations in the coastal waters of Langkawi Island, Malaysia. The adopted remote sensing image is the Advanced Land Observing Satellite (ALOS) image acquired on 18 January 2010. Our modification allows the Hopfield neural network to convert and classify color satellite images. The samples were collected from the study area simultaneously with the acquisition of the satellite imagery, and the sample locations were determined using a handheld global positioning system (GPS). The TSS concentration measurements were conducted in a lab and used for validation (real data), classification, and accuracy assessments. Mapping was achieved by using the MHNNA to classify the concentrations according to their reflectance values in band 1, band 2, and band 3. The TSS map was color-coded for visual interpretation. The efficiency of the proposed algorithm was investigated by dividing the validation data into two groups: the first group was used as source samples for supervised classification via the MHNNA, and the second group was used to test the MHNNA's efficiency. After mapping, the locations of the second group in the produced classes were detected, and the correlation coefficient (R) and root mean square error (RMSE) were calculated between the two groups according to their corresponding locations in the classes. All results were compared with a minimum distance classifier (Min-Dis); the MHNNA exhibited a higher R (0.977) and a lower RMSE (2.887). In addition, we tested the MHNNA with noise and found that it maintained its accuracy over a range of noise levels. Therefore, TSS mapping of polluted water in the coastal waters of Langkawi Island, Malaysia can be performed using the adopted MHNNA with remote sensing techniques (based on ALOS images). PMID:26729148
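    The validation arithmetic described above can be illustrated with a minimal Python sketch using hypothetical paired TSS values (it is not the MHNNA itself): Pearson's R and the RMSE are computed between lab measurements and the values the map assigns at the same locations.

```python
# Illustrative validation sketch: compare lab-measured TSS values against the
# mapped values at the corresponding sample locations using R and RMSE.
import numpy as np

# Hypothetical paired values (mg/L) for the held-out validation group.
measured = np.array([12.0, 18.5, 25.0, 31.2, 40.1, 47.8, 55.0, 62.3])
mapped   = np.array([13.1, 17.0, 27.4, 30.0, 42.5, 45.9, 57.2, 60.8])

r = np.corrcoef(measured, mapped)[0, 1]          # Pearson correlation
rmse = np.sqrt(np.mean((measured - mapped) ** 2))  # root mean square error

print(f"R = {r:.3f}, RMSE = {rmse:.3f} mg/L")
```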

  4. Robot for Investigations and Assessments of Nuclear Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kanaan, Daniel; Dogny, Stephane

    RIANA is a remote-controlled robot dedicated to Investigations and Assessments of Nuclear Areas. The development of RIANA is motivated by the need to have at disposal a proven robot, tested in hot cells; a robot capable of remotely investigating and characterising the inside of nuclear facilities in order to collect efficiently all the required data in the shortest possible time. It is based on a wireless, medium-sized remote carrier that may carry a wide variety of interchangeable modules, sensors and tools. It is easily customised to match specific requirements and quickly configured depending on the mission and the operator's preferences. RIANA integrates localisation and navigation systems. The robot will be able to generate and update a 2D map of its surroundings and explored areas, and its position is given accurately on the map. Furthermore, the robot will be able to autonomously calculate, define and follow a trajectory between two points, taking into account its environment and obstacles. The robot is configurable to manage obstacles and restrict access to forbidden areas. RIANA allows an advanced control of modules, sensors and tools; all collected data (radiological and measured data) are displayed in real time in different formats (chart, on the generated map...) and stored in a single place so that they may be exported in a convenient format for data processing. This modular design gives RIANA the flexibility to perform multiple investigation missions where humans cannot work, such as: visual inspections, dynamic localization and 2D mapping, characterization and nuclear measurements of floors and walls, non-destructive testing, and collection of solid and liquid samples. The benefits of using RIANA are: - reducing personnel exposure by limiting manual intervention time, - minimizing the time and reducing the cost of investigation operations, - providing critical inputs to set up and optimize cleanup and dismantling operations. (authors)

  5. Design and Implementation of Cloud-Centric Configuration Repository for DIY IoT Applications

    PubMed Central

    Ahmad, Shabir; Kim, Do Hyeun

    2018-01-01

    The Do-It-Yourself (DIY) vision for the design of a smart and customizable IoT application demands the involvement of the general public in its development process. The general public lacks the technical knowledge for programming state-of-the-art prototyping and development kits. The latest IoT kits, for example, Raspberry Pi, are revolutionizing the DIY paradigm for IoT, and more than ever, a DIY intuitive programming interface is required to enable the masses to interact with and customize the behavior of remote IoT devices on the Internet. However, in most cases, these DIY toolkits store the resultant configuration data in local storage and, thus, cannot be accessed remotely. This paper presents the novel implementation of such a system, which not only enables the general public to customize the behavior of remote IoT devices through a visual interface, but also makes the configuration available everywhere and anytime by leveraging the power of cloud-based platforms. The interface enables the visualization of the resources exposed by remote embedded devices in the form of graphical virtual objects (VOs). These VOs are used to create the service design through simple operations like drag-and-drop and the setting of properties. The configuration created as a result is maintained as an XML document, which is ingested by the cloud platform, thus making it available to be used anywhere. We use HTTP for communication between the cloud and the IoT toolbox and between the cloud and real devices, but for communication between the toolbox and actual resources, CoAP is used. Finally, a smart home case study has been implemented and presented in order to assess the effectiveness of the proposed work. PMID:29415450
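    A hedged Python sketch of the workflow described above (the element names and endpoint URL are illustrative assumptions, not the paper's actual schema or API) serializes a drag-and-drop service design to XML and uploads it to a cloud repository over HTTP:

```python
# Illustrative sketch: serialize virtual objects and their properties to XML
# and POST the document to a (hypothetical) cloud configuration repository.
import xml.etree.ElementTree as ET
import urllib.request

def build_config(virtual_objects):
    root = ET.Element("serviceDesign")
    for vo in virtual_objects:
        node = ET.SubElement(root, "virtualObject", id=vo["id"], type=vo["type"])
        for key, value in vo["properties"].items():
            ET.SubElement(node, "property", name=key).text = str(value)
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

config_xml = build_config([
    {"id": "lamp-1", "type": "actuator", "properties": {"room": "kitchen", "on": True}},
    {"id": "temp-1", "type": "sensor",   "properties": {"interval_s": 30}},
])

# POST the XML document; the URL is a placeholder, so the call stays disabled.
req = urllib.request.Request(
    "https://example-cloud/configurations",
    data=config_xml,
    headers={"Content-Type": "application/xml"},
    method="POST",
)
# urllib.request.urlopen(req)  # illustrative endpoint only
print(config_xml.decode("utf-8"))
```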

  6. Design and Implementation of Cloud-Centric Configuration Repository for DIY IoT Applications.

    PubMed

    Ahmad, Shabir; Hang, Lei; Kim, Do Hyeun

    2018-02-06

    The Do-It-Yourself (DIY) vision for the design of a smart and customizable IoT application demands the involvement of the general public in its development process. The general public lacks the technical knowledge for programming state-of-the-art prototyping and development kits. The latest IoT kits, for example, Raspberry Pi, are revolutionizing the DIY paradigm for IoT, and more than ever, a DIY intuitive programming interface is required to enable the masses to interact with and customize the behavior of remote IoT devices on the Internet. However, in most cases, these DIY toolkits store the resultant configuration data in local storage and, thus, cannot be accessed remotely. This paper presents the novel implementation of such a system, which not only enables the general public to customize the behavior of remote IoT devices through a visual interface, but also makes the configuration available everywhere and anytime by leveraging the power of cloud-based platforms. The interface enables the visualization of the resources exposed by remote embedded devices in the form of graphical virtual objects (VOs). These VOs are used to create the service design through simple operations like drag-and-drop and the setting of properties. The configuration created as a result is maintained as an XML document, which is ingested by the cloud platform, thus making it available to be used anywhere. We use HTTP for communication between the cloud and the IoT toolbox and between the cloud and real devices, but for communication between the toolbox and actual resources, CoAP is used. Finally, a smart home case study has been implemented and presented in order to assess the effectiveness of the proposed work.

  7. Remote magnetic control of a wireless capsule endoscope in the esophagus is safe and feasible: results of a randomized, clinical trial in healthy volunteers.

    PubMed

    Keller, Jutta; Fibbe, Christiane; Volke, Frank; Gerber, Jeremy; Mosse, Alexander C; Reimann-Zawadzki, Meike; Rabinovitz, Elisha; Layer, Peter; Swain, Paul

    2010-11-01

    Remote control of esophageal capsule endoscopes could enhance diagnostic accuracy. Our aim was to assess the safety and efficacy of remote magnetic manipulation of a modified capsule endoscope (magnetic maneuverable capsule [MMC]; Given Imaging Ltd, Yoqneam, Israel) in the esophagus of healthy humans. This was a randomized, controlled trial conducted at an academic hospital and involving 10 healthy volunteers. All participants swallowed a conventional capsule (ESO2; Given Imaging) and a capsule endoscope with magnetic material, the MMC, which is activated by a thermal switch, in random order (1 week apart). An external magnetic paddle (EMP; Given Imaging) was used to manipulate the MMC within the esophageal lumen. MMC responsiveness was evaluated on a screen showing the MMC film in real time. Outcome measures were the safety and tolerability of the procedure (questionnaire), the responsiveness of the MMC to the EMP, esophageal transit time, and visualization of the Z-line. No adverse events occurred apart from mild retrosternal pressure (n = 5). The ability to rotate the MMC around its longitudinal axis and to tilt it by defined movements of the EMP was clearly demonstrated in 9 volunteers. Esophageal transit time was highly variable for both capsules (MMC, 111-1514 seconds; ESO2, 47-1474 seconds), but the MMC stayed longer in the esophagus in 8 participants (P < .01). Visualization of the Z-line was more efficient with the ESO2 (inspection of 73% ± 18% of the circumference vs 33% ± 27%, P = .01). Magnetic forces were not strong enough to hold the MMC against peristalsis when the capsule approached the gastroesophageal junction. Remote control of the MMC in the esophagus of healthy volunteers is safe and feasible, but higher magnetic forces may be needed. Copyright © 2010 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

  8. Visual observation of fishes and aquatic habitat [Chapter 17

    Treesearch

    Russell F. Thurow; C. Andrew Dolloff; J. Ellen Marsden

    2012-01-01

    Whether accomplished above the water surface or performed underwater by snorkel, scuba, or hookah divers or remotely operated vehicles (ROVs), direct observation techniques are among the most effective means for obtaining accurate and often unique information on aquatic organisms in their natural surroundings. Many types of studies incorporate direct observation...

  9. 33 CFR 117.1035 - Columbia River.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the remote control stations located at the ends of the bridge. Operation of the bridge shall be as... visually inspect the waterway for marine traffic approaching the bridge. The closing sequence shall not be activated until after marine traffic has cleared the bridge. (3) When the closing sequence is activated, the...

  10. 33 CFR 117.1035 - Columbia River.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the remote control stations located at the ends of the bridge. Operation of the bridge shall be as... visually inspect the waterway for marine traffic approaching the bridge. The closing sequence shall not be activated until after marine traffic has cleared the bridge. (3) When the closing sequence is activated, the...

  11. VIS: Technology for Multicultural Teacher Education.

    ERIC Educational Resources Information Center

    Bruning, Merribeth J.

    1992-01-01

    Video Information Systems (VIS) is fiber optics network that connects campus classrooms to VIS central library. Remotely controlled by instructors, VIS incorporates use of number of audiovisual materials and can be used in cross-cultural training in which visual aids assist in showing cultural differences. VIS assists in education of future…

  12. Uses of GIS for Homeland Security and Emergency Management for Higher Education Institutions

    ERIC Educational Resources Information Center

    Murchison, Stuart B.

    2010-01-01

    Geographic information systems (GIS) are a major component of the geospatial sciences, which are also composed of geostatistical analysis, remote sensing, and global positional satellite systems. These systems can be integrated into GIS for georeferencing, pattern analysis, visualization, and understanding spatial concepts that transcend…

  13. Use of robotics for nondestructive inspection of steel highway bridges and structures : final report.

    DOT National Transportation Integrated Search

    2005-01-01

    This report presents the results of a project to finalize and apply a crawling robotic system for the remote visual inspection of high-mast light poles. The first part of the project focused on finalizing the prototype crawler robot hardware and cont...

  14. Automatic patient respiration failure detection system with wireless transmission

    NASA Technical Reports Server (NTRS)

    Dimeff, J.; Pope, J. M.

    1968-01-01

    Automatic respiration failure detection system detects respiration failure in patients with a surgically implanted tracheostomy tube, and actuates an audible and/or visual alarm. The system incorporates a miniature radio transmitter so that the patient is unencumbered by wires yet can be monitored from a remote location.

  15. 47 CFR 78.51 - Remote control operation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... control system shall be installed and protected in a manner designed to prevent tampering or operation by... transmissions and a carrier operated device which will give a continuous visual indication whenever the... necessary to insure proper operation. (4) The control circuits shall be so designed and installed that short...

  16. 47 CFR 78.51 - Remote control operation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... control system shall be installed and protected in a manner designed to prevent tampering or operation by... transmissions and a carrier operated device which will give a continuous visual indication whenever the... necessary to insure proper operation. (4) The control circuits shall be so designed and installed that short...

  17. A Portable Presentation Package for Audio-Visual Instruction. Technical Documentary Report.

    ERIC Educational Resources Information Center

    Smith, Edgar A.; And Others

    The Portable Presentation Package is a prototype of an audiovisual equipment package designed to facilitate technical training in remote areas, situations in which written communications are difficult, or in situations requiring rapid presentation of instructional material. The major criteria employed in developing the package were (1) that the…

  18. Remote excitation and detection of surface-enhanced Raman scattering from graphene.

    PubMed

    Coca-López, Nicolás; Hartmann, Nicolai F; Mancabelli, Tobia; Kraus, Jürgen; Günther, Sebastian; Comin, Alberto; Hartschuh, Achim

    2018-06-07

    We demonstrate the remote excitation and detection of surface-enhanced Raman scattering (SERS) from graphene using a silver nanowire as a plasmonic waveguide. By investigating a nanowire touching a graphene sheet at only one terminal, we first show the remote excitation of SERS from graphene by propagating surface plasmon polaritons (SPPs) launched by a focused laser over distances on the order of 10 μm. Remote detection of SERS is then demonstrated for the same nanowire by detecting light emission at the distal end of the nanowire that was launched by graphene Raman scattering and carried to the end of the nanowire by SPPs. We then show that the transfer of the excitation and Raman scattered light along the nanowire can also be visualized through spectrally selective back focal plane imaging. Back focal plane images detected upon focused laser excitation at one of the nanowire's tips reveal propagating surface plasmon polaritons at the laser energy and at the energies of the most prominent Raman bands of graphene. With this approach the identification of remote excitation and detection of SERS for nanowires completely covering the Raman scatterer is achieved, which is typically not possible by direct imaging.

  19. Design and development of an IoT-based web application for an intelligent remote SCADA system

    NASA Astrophysics Data System (ADS)

    Kao, Kuang-Chi; Chieng, Wei-Hua; Jeng, Shyr-Long

    2018-03-01

    This paper presents a design of an intelligent remote electrical power supervisory control and data acquisition (SCADA) system based on the Internet of Things (IoT), with Internet Information Services (IIS) for setting up web servers, an ASP.NET model-view-controller (MVC) for establishing a remote electrical power monitoring and control system using responsive web design (RWD), and a Microsoft SQL Server as the database. With the web browser connected to the Internet, the sensing data is sent to the client by using the TCP/IP protocol, which supports mobile devices with different screen sizes. The users can provide instructions immediately without being present to check the conditions, which considerably reduces labor and time costs. The developed system incorporates a remote measuring function by using a wireless sensor network and utilizes a visual interface to make the human-machine interface (HMI) more intuitive. Moreover, it contains an analog input/output and a basic digital input/output that can be applied to a motor driver and an inverter for integration with a remote SCADA system based on IoT, thus achieving efficient power management.
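    The device-side half of such an architecture can be sketched in Python under stated assumptions (the endpoint and payload shape are illustrative; the paper's server side uses IIS/ASP.NET MVC with a Microsoft SQL Server database): a field device pushes a reading to the remote SCADA web application over HTTP.

```python
# Illustrative device-side sketch only: push one electrical reading to a
# (hypothetical) SCADA web endpoint as JSON over HTTP.
import json
import time
import urllib.request

def post_reading(station_id: str, voltage: float, current: float) -> None:
    payload = json.dumps({
        "station": station_id,
        "voltage_v": voltage,
        "current_a": current,
        "timestamp": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://example-scada/api/readings",   # placeholder endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # urllib.request.urlopen(req, timeout=5)  # disabled: endpoint is illustrative
    print("would send:", payload.decode("utf-8"))

post_reading("feeder-07", voltage=228.4, current=12.6)
```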

  20. National remote computational flight research facility

    NASA Technical Reports Server (NTRS)

    Rediess, Herman A.

    1989-01-01

    The extension of the NASA Ames-Dryden remotely augmented vehicle (RAV) facility to accommodate flight testing of a hypersonic aircraft utilizing the continental United States as a test range is investigated. The development and demonstration of an automated flight test management system (ATMS) that uses expert system technology for flight test planning, scheduling, and execution is documented.

  1. The four-meter confrontation visual field test.

    PubMed Central

    Kodsi, S R; Younge, B R

    1992-01-01

    The 4-m confrontation visual field test has been successfully used at the Mayo Clinic for many years in addition to the standard 0.5-m confrontation visual field test. The 4-m confrontation visual field test is a test of macular function and can identify small central or paracentral scotomas that the examiner may not find when the patient is tested only at 0.5 m. Also, macular sparing in homonymous hemianopias and quadrantanopias may be identified with the 4-m confrontation visual field test. We recommend use of this confrontation visual field test, in addition to the standard 0.5-m confrontation visual field test, on appropriately selected patients to obtain the most information possible by confrontation visual field tests. PMID:1494829

  2. The four-meter confrontation visual field test.

    PubMed

    Kodsi, S R; Younge, B R

    1992-01-01

    The 4-m confrontation visual field test has been successfully used at the Mayo Clinic for many years in addition to the standard 0.5-m confrontation visual field test. The 4-m confrontation visual field test is a test of macular function and can identify small central or paracentral scotomas that the examiner may not find when the patient is tested only at 0.5 m. Also, macular sparing in homonymous hemianopias and quadrantanopias may be identified with the 4-m confrontation visual field test. We recommend use of this confrontation visual field test, in addition to the standard 0.5-m confrontation visual field test, on appropriately selected patients to obtain the most information possible by confrontation visual field tests.

  3. Development of a remote digital augmentation system and application to a remotely piloted research vehicle

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.; Deets, D. A.

    1975-01-01

    A cost-effective approach to flight testing advanced control concepts with remotely piloted vehicles is described. The approach utilizes a ground based digital computer coupled to the remotely piloted vehicle's motion sensors and control surface actuators through telemetry links to provide high bandwidth feedback control. The system was applied to the control of an unmanned 3/8-scale model of the F-15 airplane. The model was remotely augmented; that is, the F-15 mechanical and control augmentation flight control systems were simulated by the ground-based computer, rather than being in the vehicle itself. The results of flight tests of the model at high angles of attack are discussed.

  4. Estimation of optimal pivot point for remote center of motion alignment in surgery.

    PubMed

    Rosa, Benoît; Gruijthuijsen, Caspar; Van Cleynenbreugel, Ben; Sloten, Jos Vander; Reynaerts, Dominiek; Poorten, Emmanuel Vander

    2015-02-01

    The determination of an optimal pivot point is important for instrument manipulation in minimally invasive surgery. Such knowledge is of particular importance for robotic-assisted surgery, where robots need to rotate precisely around a specific point in space in order to minimize trauma to the body wall while maintaining position control. Remote center of motion (RCM) mechanisms are commonly used, where the RCM point is manually and visually aligned. If not positioned appropriately, this misalignment might lead to intolerably high forces on the body wall, with an increased risk of postoperative complications or instrument damage. An automated method to align the RCM with the optimal pivot point was developed and tested. Computer vision and a lightweight calibration procedure are used to estimate the optimal pivot point. One or two pre-calibrated cameras viewing the surgical scene are employed. The surgeon is asked to make short pivoting movements, applying as little torque as possible, with an instrument of choice passing through the insertion point while camera images are being recorded. The physical properties of an instrument rotating around a pivot point are exploited in a random sample consensus scheme to robustly estimate the ideal position of the RCM in the image planes. Triangulation is used to estimate the RCM position in 3D. Experiments were performed on a specially designed mockup to test the method. The position of the pivot point is estimated with an average error of less than 1.85 mm using two webcams placed approximately 30 cm to 1 m away from the scene. The entire procedure was completed in a few seconds. The automated method to estimate the ideal position of the RCM was shown to be reliable. The method can be implemented within a visual servoing approach to automatically place the RCM point, or the results can be displayed on a screen to provide guidance to the surgeon. Further work includes the development of an image-guided alignment method and validation with in vivo experiments.
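    The geometric core of such an estimate can be sketched in Python under simplifying assumptions (it is not the paper's calibrated two-camera pipeline): each observed instrument pose contributes a 3D axis, and the least-squares pivot point minimizes the summed squared distance to all axes; a RANSAC-style wrapper would repeat this on random subsets to reject outliers.

```python
# Illustrative sketch: least-squares point closest to a set of 3D lines,
# the basic geometry behind estimating a common pivot point.
import numpy as np

def estimate_pivot(points, directions):
    """points: (N,3) points on each axis; directions: (N,3) unit directions."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for a, d in zip(points, directions):
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ a
    return np.linalg.solve(A, b)

# Synthetic check: axes that all pass (nearly) through the point (1, 2, 3).
rng = np.random.default_rng(1)
true_pivot = np.array([1.0, 2.0, 3.0])
dirs = rng.normal(size=(20, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = true_pivot - 5.0 * dirs + rng.normal(scale=0.01, size=(20, 3))

print(estimate_pivot(pts, dirs))   # close to [1, 2, 3]
```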

  5. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones.

    PubMed

    Anderson, K; Griffiths, D; DeBell, L; Hancock, S; Duffy, J P; Shutler, J D; Reinhardt, W J; Griffiths, A

    2016-01-01

    This manuscript describes the development of an Android-based smartphone application for capturing aerial photographs and spatial metadata automatically, for use in grassroots mapping applications. The aim of the project was to exploit the plethora of on-board sensors within modern smartphones (accelerometer, GPS, compass, camera) to generate ready-to-use spatial data from lightweight aerial platforms such as drones or kites. A visual coding 'scheme blocks' framework was used to build the application ('app'), so that users could customise their own data capture tools in the field. The paper reports on the coding framework, then shows the results of test flights from kites and lightweight drones and finally shows how open-source geospatial toolkits were used to generate geographical information system (GIS)-ready GeoTIFF images from the metadata stored by the app. Two Android smartphones were used in testing: a high-specification OnePlus One handset and a lower-cost Acer Liquid Z3 handset, to test the operational limits of the app on phones with different sensor sets. We demonstrate that best results were obtained when the phone was attached to a stable single-line kite or to a gliding drone. Results show that engine or motor vibrations from powered aircraft required dampening to ensure capture of high quality images. We demonstrate how the products generated from the open-source processing workflow are easily used in GIS. The app can be downloaded freely from the Google store by searching for 'UAV toolkit' (UAV toolkit 2016), and used wherever an Android smartphone and aerial platform are available to deliver rapid spatial data (e.g. in supporting decision-making in humanitarian disaster-relief zones, in teaching or for grassroots remote sensing and democratic mapping).
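    The final georeferencing step described above can be sketched in Python, assuming the rasterio library and an approximate ground footprint taken from the app's GPS metadata (the paper's workflow used its own open-source toolchain):

```python
# Illustrative sketch: wrap an aerial photograph into a GeoTIFF using an
# approximate ground footprint (WGS84 bounds) derived from GPS metadata.
import numpy as np
import rasterio
from rasterio.transform import from_bounds

# Hypothetical inputs: an RGB image array and its footprint bounds.
image = np.zeros((3, 1200, 1600), dtype=np.uint8)        # bands, rows, cols
west, south, east, north = -3.536, 50.735, -3.530, 50.739

transform = from_bounds(west, south, east, north,
                        width=image.shape[2], height=image.shape[1])

with rasterio.open(
    "kite_photo.tif", "w",
    driver="GTiff",
    height=image.shape[1], width=image.shape[2],
    count=3, dtype=image.dtype,
    crs="EPSG:4326",
    transform=transform,
) as dst:
    dst.write(image)   # writes all three bands
```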

  6. An AdaBoost Based Approach to Automatic Classification and Detection of Buildings Footprints, Vegetation Areas and Roads from Satellite Images

    NASA Astrophysics Data System (ADS)

    Gonulalan, Cansu

    In recent years, there has been an increasing demand for applications that monitor land-use targets using remote sensing images, and advances in remote sensing satellites have stimulated research in this area. Many applications, ranging from urban growth planning to homeland security, have already used algorithms for automated object recognition from remote sensing imagery. However, they still have problems such as low accuracy in target detection and algorithms tailored to one specific area. In this thesis, we focus on an automatic approach to classify and detect building footprints, road networks and vegetation areas. The automatic interpretation of visual data is a comprehensive task in the computer vision field. Machine learning approaches improve the capability of classification in an intelligent way. We propose a method with high accuracy in detection and classification, and the multi-class classification is developed for detecting multiple objects. We present an AdaBoost-based approach along with the supervised learning algorithm. The combination of AdaBoost with an "attentional cascade" is adopted from Viola and Jones [1]; this combination decreases the computation time and opens the way to real-time applications. For the feature extraction step, our contribution is to combine Haar-like features that include corner, rectangle and Gabor responses. Among all features, AdaBoost selects only critical features and generates an extremely efficient cascade-structured classifier. Finally, we present and evaluate our experimental results. The overall system is tested and a high detection performance is achieved; the precision rate of the final multi-class classifier is over 98%.
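    The boosting step alone can be sketched with scikit-learn, assuming feature vectors have already been extracted from image windows; note that AdaBoostClassifier is a plain boosted ensemble of decision stumps, not the cascade-structured detector described in the thesis:

```python
# Illustrative sketch of multi-class AdaBoost on pre-extracted window features.
# The feature values and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Hypothetical data: 600 windows x 64 Haar/Gabor-style feature responses,
# labeled 0 = background, 1 = building, 2 = road, 3 = vegetation.
rng = np.random.default_rng(42)
X = rng.normal(size=(600, 64))
y = rng.integers(0, 4, size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Default weak learner is a depth-1 decision tree (a decision stump).
clf = AdaBoostClassifier(n_estimators=200)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))  # near chance on random labels
```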

  7. Computerized fracture critical and specialized bridge inspection program with NDE applications

    NASA Astrophysics Data System (ADS)

    Fish, Philip E.

    1998-03-01

    Wisconsin Department of Transportation implemented a Fracture Critical & Specialized Inspection Program in 1987. The program has a strong emphasis on Nondestructive Testing (NDT). The program is also completely computerized, using laptop computers to gather field data, digital cameras for pictures, and testing equipment with download features. Final inspection reports with detailed information can be delivered within days of the inspection. The program requires an experienced inspection team and qualified personnel: individuals performing testing must be licensed ASNT (American Society for Nondestructive Testing) Level III and must be licensed Certified Weld Inspectors (American Welding Society). Several critical steps have been developed to assure that each inspection identifies all possible deficiencies on a Fracture Critical or Unique Bridge. They include: review of all existing plans and maintenance history; identification of fracture critical members; identification of critical connection details, welds, and fatigue-prone details; development of a visual and NDE inspection plan; field inspection procedures; and a detailed formal report. The program has found several bridges with critical fatigue conditions, which have resulted in replacement or major rehabilitation. In addition, remote monitoring systems have been installed on structures with serious cracking to monitor for changing conditions.

  8. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  9. Feasibility of Telementoring for Microneurosurgical Procedures Using a Microscope: A Proof-of-Concept Study.

    PubMed

    Ladd, Bryan M; Tackla, Ryan D; Gupte, Akshay; Darrow, David; Sorenson, Jeffery; Zuccarello, Mario; Grande, Andrew W

    2017-03-01

    Our pilot study evaluated the effectiveness of our telementoring-telescripting model to facilitate seamless communication between surgeons while the operating surgeon is using a microscope. As a first proof of concept, 4 students identified 20 anatomic landmarks on a dry human skull with or without telementoring guidance. To assess the ability to communicate operative information, a senior neurosurgery resident evaluated the student's ability and timing to complete a stepwise craniotomy on a cadaveric head, with and without telementoring guidance; a second portion included exposure of the anterior circulation. The mentor was able to annotate directly onto the operator's visual field, which was visible to the operator without looking away from the binocular view. The students showed that they were familiar with half (50% ± 10%) of the structures for identification and none was familiar with the steps to complete a craniotomy before using our system. With the guidance of a remote surgeon projected into the visual field of the microscope, the students were able to correctly identify 100% of the structures and complete a craniotomy. Our system also proved effective in guiding a more experienced neurosurgery resident through complex operative steps associated with exposure of the anterior circulation. Our pilot study showed a platform feasible in providing effective operative direction to inexperienced operators while operating using a microscope. A remote mentor was able to view the visual field of the microscope, annotate on the visual stream, and have the annotated stream appear in the binocular view for the operating mentee. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Zooming into creativity: individual differences in attentional global-local biases are linked to creative thinking

    PubMed Central

    Zmigrod, Sharon; Zmigrod, Leor; Hommel, Bernhard

    2015-01-01

    While recent studies have investigated how processes underlying human creativity are affected by particular visual-attentional states, we tested the impact of more stable attention-related preferences. These were assessed by means of Navon’s global-local task, in which participants respond to the global or local features of large letters constructed from smaller letters. Three standard measures were derived from this task: the sizes of the global precedence effect, the global interference effect (i.e., the impact of incongruent letters at the global level on local processing), and the local interference effect (i.e., the impact of incongruent letters at the local level on global processing). These measures were correlated with performance in a convergent-thinking creativity task (the Remote Associates Task), a divergent-thinking creativity task (the Alternate Uses Task), and a measure of fluid intelligence (Raven’s matrices). Flexibility in divergent thinking was predicted by the local interference effect while convergent thinking was predicted by intelligence only. We conclude that a stronger attentional bias to visual information about the “bigger picture” promotes cognitive flexibility in searching for multiple solutions. PMID:26579030

  11. Rightward biases in free-viewing visual bisection tasks: implications for leftward responses biases on similar tasks.

    PubMed

    Elias, Lorin J; Robinson, Brent; Saucier, Deborah M

    2005-12-01

    Neurologically normal individuals exhibit strong leftward response biases during free-viewing perceptual judgments of brightness, quantity, and size. When participants view two mirror-reversed objects and they are forced to choose which object appears darker, more numerous, or larger, the stimulus with the relevant feature on the left side is chosen 60-75% of the time. This effect could be influenced by inaccurate judgments of the true centre-point of the objects being compared. In order to test this possibility, 10 participants completed three visual bisection tasks on stimuli known to elicit strong leftward response biases. Participants were monitored using a remote eye-tracking device and instructed to stare at the subjective midpoint of objects presented on a computer screen. Although it was predicted that bisection errors would deviate to the left of centre (as is the case in the line bisection literature), the opposite effect was found. Significant rightward bisection errors were evident on two of the three tasks, and the leftward biases seen during forced-choice tasks could be the result of misjudgments to the right of centre on these same tasks.

  12. Augmented Virtuality: A Real-time Process for Presenting Real-world Visual Sensory Information in an Immersive Virtual Environment for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.

    2017-12-01

    Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment, researchers are presented with the visual data in a virtual environment, whereas in a purely AR application, a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications when taken within the context of remote planetary exploration. For example, in a purely VR environment, contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time; however, such rendered video frames will lack 3D visual information, i.e. depth. In this paper, we present a technique to utilize a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment. Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
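    One common way to recover the per-pixel depth cue that a plain 2D video overlay would lose is block-matching stereo; the sketch below uses OpenCV on a synthetic pair (the poster does not specify its pipeline, so this is illustrative only):

```python
# Illustrative sketch: estimate a disparity map from a stereo pair and convert
# it to depth with assumed calibration values.
import cv2
import numpy as np

# Synthetic stereo pair: a textured scene shifted 8 px between views
# (placeholder for the robot's left/right camera frames).
rng = np.random.default_rng(0)
left = rng.uniform(0, 255, size=(480, 640)).astype(np.uint8)
left = cv2.GaussianBlur(left, (5, 5), 0)
right = np.roll(left, -8, axis=1)   # uniform 8-pixel disparity

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)  # multiple of 16
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # to pixels

focal_length_px, baseline_m = 700.0, 0.12   # illustrative calibration values
valid = disparity > 0
depth_m = focal_length_px * baseline_m / disparity[valid]
print("median recovered disparity (px):", float(np.median(disparity[valid])))
print("median recovered depth (m):", float(np.median(depth_m)))
```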

  13. Remote monitoring of Xpert® MTB/RIF testing in Mozambique: results of programmatic implementation of GxAlert.

    PubMed

    Cowan, J; Michel, C; Manhiça, I; Mutaquiha, C; Monivo, C; Saize, D; Beste, J; Creswell, J; Codlin, A J; Gloyd, S

    2016-03-01

    Electronic diagnostic tests, such as the Xpert® MTB/RIF assay, are being implemented in low- and middle-income countries (LMICs). However, the timely information available from these tests via remote monitoring is underutilized. The failure to transmit real-time, actionable data to key individuals such as clinicians, patients, and national monitoring and evaluation teams may negatively impact patient care. Our objective was to describe recently developed applications that allow for real-time, remote monitoring of Xpert results, and the initial implementation of one of these products in central Mozambique. In partnership with the Mozambican National Tuberculosis Program, we compared three different remote monitoring tools for Xpert and selected one, GxAlert, to pilot and evaluate at five public health centers in Mozambique. GxAlert software was successfully installed on all five Xpert computers, and test results are now uploaded daily via a USB internet modem to a secure online database. A password-protected web-based interface allows real-time analysis of test results, and 1200 positive tests for tuberculosis generated 8000 SMS result notifications to key individuals. Remote monitoring of diagnostic platforms is feasible in LMICs. While promising, this effort needs to address issues around patient data ownership, confidentiality, interoperability, unique patient identifiers, and data security.

  14. Remote control and navigation tests for application to long-range lunar surface exploration

    NASA Technical Reports Server (NTRS)

    Mastin, W. C.; White, P. R.; Vinz, F. L.

    1971-01-01

    Tests conducted with a vehicle system built at the Marshall Space Flight Center to investigate some of the unknown factors associated with remote controlled teleoperated vehicles on the lunar surface are described. Test data are summarized and conclusions are drawn from these data which indicate that further testing will be required.

  15. Flight-test experience in digital control of a remotely piloted vehicle.

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.

    1972-01-01

    The development of a remotely piloted vehicle system consisting of a remote pilot cockpit and a ground-based digital computer coupled to the aircraft through telemetry data links is described. The feedback control laws are implemented in a FORTRAN program. Flight-test experience involving high feedback gain limits for attitude and attitude rate feedback variables, filtering of sampled data, and system operation during intermittent telemetry data link loss is discussed. Comparisons of closed-loop flight tests with analytical calculations, and pilot comments on system operation are included.
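    The flavor of such a ground-computed, sampled-data control law can be illustrated with a short Python sketch; the gains, frame rate, and filter constant below are invented for illustration and are not the F-15 RPRV control laws:

```python
# Illustrative sketch: one ground-computer update per telemetry frame,
# combining attitude and filtered attitude-rate feedback into a surface command.
from dataclasses import dataclass

@dataclass
class PitchLoop:
    k_theta: float = 1.5      # attitude feedback gain (assumed)
    k_q: float = 0.8          # attitude-rate feedback gain (assumed)
    dt: float = 0.02          # 50 Hz telemetry frame (assumed)
    tau: float = 0.1          # first-order filter time constant for the rate signal
    q_filt: float = 0.0       # filter state

    def step(self, theta_cmd: float, theta_meas: float, q_meas: float) -> float:
        # Discrete first-order low-pass filter on the sampled rate measurement.
        alpha = self.dt / (self.tau + self.dt)
        self.q_filt += alpha * (q_meas - self.q_filt)
        # Elevator command (deg): downlinked measurements in, uplinked command out.
        return self.k_theta * (theta_cmd - theta_meas) - self.k_q * self.q_filt

loop = PitchLoop()
for theta, q in [(0.5, 0.1), (0.8, 0.3), (1.0, 0.2)]:   # hypothetical samples
    print(round(loop.step(theta_cmd=2.0, theta_meas=theta, q_meas=q), 3))
```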

  16. Remote Associates Test and Alpha Brain Waves

    ERIC Educational Resources Information Center

    Haarmann, Henk J.; George, Timothy; Smaliy, Alexei; Dien, Joseph

    2012-01-01

    Previous studies found that performance on the remote associates test (RAT) improves after a period of incubation and that increased alpha brain waves over the right posterior brain predict the emergence of RAT insight solutions. We report an experiment that tested whether increased alpha brain waves during incubation improve RAT performance.…

  17. Porting the AVS/Express scientific visualization software to Cray XT4.

    PubMed

    Leaver, George W; Turner, Martin J; Perrin, James S; Mummery, Paul M; Withers, Philip J

    2011-08-28

    Remote scientific visualization, where rendering services are provided by larger scale systems than are available on the desktop, is becoming increasingly important as dataset sizes increase beyond the capabilities of desktop workstations. Uptake of such services relies on access to suitable visualization applications and the ability to view the resulting visualization in a convenient form. We consider five rules from the e-Science community to meet these goals with the porting of a commercial visualization package to a large-scale system. The application uses message-passing interface (MPI) to distribute data among data processing and rendering processes. The use of MPI in such an interactive application is not compatible with restrictions imposed by the Cray system being considered. We present details, and performance analysis, of a new MPI proxy method that allows the application to run within the Cray environment yet still support MPI communication required by the application. Example use cases from materials science are considered.
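    The underlying data-distribution pattern (not the paper's MPI-proxy mechanism) can be sketched with mpi4py, assuming that library is available: rank 0 scatters blocks of a dataset to the processing/rendering ranks and gathers partial results back.

```python
# Illustrative sketch of scatter/gather data distribution with MPI.
# Run with, e.g.:  mpiexec -n 4 python scatter_render.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    data = np.arange(size * 1000, dtype=np.float64)
    blocks = np.split(data, size)           # one block per rank
else:
    blocks = None

block = comm.scatter(blocks, root=0)        # each rank receives its block

# Stand-in for per-rank data processing / rendering work.
partial = float(block.sum())

results = comm.gather(partial, root=0)
if rank == 0:
    print("combined result:", sum(results))
```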

  18. Estimating reforestation by means of remote sensing

    NASA Technical Reports Server (NTRS)

    Dejesusparada, N. (Principal Investigator); Filho, P. H.; Shimabukuro, Y. E.; Dossantos, J. R.

    1981-01-01

    LANDSAT imagery at the scale of 1:250,000 obtained from bands 5 and 7, together with computer compatible tapes, was used to evaluate the effectiveness of remotely sensed orbital data in inventorying forests in a 462,100 ha area of Brazil encompassing the cities of Ribeirao Preto, Altinopolis, Cravinhos, Serra Azul, Luis Antonio, Sao Simao, Santa Rita do Passa Quatro, and Santa Rosa do Viterbo. Visual interpretation of the LANDSAT imagery shows that 37,766 hectares (1977) and 38,003.75 hectares (1979) were reforested areas of pine and eucalyptus species. An increment of 237.5 hectares was found during this two-year time lapse.

  19. Goddard Atmospheric Composition Data Center: Aura Data and Services in One Place

    NASA Technical Reports Server (NTRS)

    Leptoukh, G.; Kempler, S.; Gerasimov, I.; Ahmad, S.; Johnson, J.

    2005-01-01

    The Goddard Atmospheric Composition Data and Information Services Center (AC-DISC) is a portal to an Atmospheric Composition-specific, user-driven, multi-sensor, online, easy-access archive and distribution system employing data analysis and visualization, data mining, and other user-requested techniques for better use of science data. It provides convenient access to Atmospheric Composition data and information from various remote-sensing missions, from TOMS, UARS, MODIS, and AIRS to the most recent data from Aura OMI, MLS, and HIRDLS (once these datasets are released to the public), as well as Atmospheric Composition datasets residing at other remote archive sites.

  20. Evaluation of reforestation using remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Filho, P. H.; Shimabukuro, Y. E.; Dossantos, J. R.

    1982-01-01

    The utilization of remotely sensed orbital data for forestry inventory was evaluated. The study area (approximately 491,100 ha) encompasses the municipalities of Ribeirao Preto, Altinopolis, Cravinhos, Serra Azul, Luis Antonio, Sao Simao, Santa Rita do Passa Quatro and Santa Rosa do Viterbo (Sao Paulo State). Materials used were LANDSAT data from channels 5 and 7 (scale 1:250,000) and CCTs. Visual interpretation of the imagery showed that for 1977 a total of 37,766.00 ha and for 1979 38,003.75 ha were reforested with Pinus and Eucalyptus within the area under study. The results obtained show that LANDSAT data can be used efficiently in forestry inventory studies.

  1. Development and Testing of Physically-Based Methods for Filling Gaps in Remotely Sensed River Data

    DTIC Science & Technology

    2011-09-30

    Filling Gaps in Remotely Sensed River Data. Jonathan M. Nelson, US Geological Survey National Research Program, Geomorphology and Sediment Transport... the research work carried out under this grant are to develop and test two methods for filling in gaps in remotely sensed river data. The first...

  2. Daylighting Digital Dimmer SBIR Phase 2 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Morgan

    The primary focus of the Phase II development is the implementation of two key technologies, Task To Wall (TTW) Control and Wand Gesture light dimming control, into an easy-to-use remote for SSL light control, the MoJo Remote. The MoJo Remote product family includes a battery-powered wireless remote, a WiFi gateway, as well as Mobile Applications for iOS and Android. Specific accomplishments during the second reporting period include: 1. Finalization and implementation of the MoJo Remote accelerometer and capacitive-touch based UI/UX, referred to as the Wand Gesture UI. 2. Issuance of a patent for the Wand Gesture UI. 3. Industrial and mechanical design for the MoJo Remote and MoJo Gateway. 4. Task To Wall implementation and testing in the MoJo Remote. 5. Zooming User Interface (ZUI) for the Mobile App implemented on both iOS and Android. 6. iOS Mobile app developed to beta-level functionality. 7. Initial development of the Android Mobile Application. 8. Closed-loop color control at task (demonstrated at the 2016 SSL R&D Workshop). 9. Task To Wall extended to color control, working in simulation. 10. Beta testing begun in late 2017/early 2018. The MoJo Remote integrates the patented TTW Control and the innovative Wand Gesture user interface, and is currently in beta testing and on the path to commercialization.

  3. A collaborative interaction and visualization multi-modal environment for surgical planning.

    PubMed

    Foo, Jung Leng; Martinez-Escobar, Marisol; Peloquin, Catherine; Lobe, Thom; Winer, Eliot

    2009-01-01

    The proliferation of virtual reality visualization and interaction technologies has changed the way medical image data is analyzed and processed. This paper presents a multi-modal environment that combines a virtual reality application with a desktop application for collaborative surgical planning. Both visualization applications can function independently but can also be synced over a network connection for collaborative work. Any changes to either application are immediately synced and updated to the other. This is an efficient collaboration tool that allows multiple teams of doctors with only an internet connection to visualize and interact with the same patient data simultaneously. With this multi-modal environment framework, one team working in the VR environment and another team at a remote location working on a desktop machine can collaborate in the examination and discussion of procedures such as diagnosis, surgical planning, teaching and tele-mentoring.
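
    A minimal sketch of the kind of network syncing described above is given below; the details (newline-delimited JSON scene updates over a socket) are assumptions for illustration and do not reflect the paper's framework.

      # Minimal two-way sync of scene updates over a socket pair (illustrative only).
      import json
      import socket
      import threading
      import time

      class SyncChannel:
          """Send and receive JSON-encoded scene updates (camera pose, annotations)."""

          def __init__(self, sock, on_update):
              self.sock = sock
              self.on_update = on_update
              threading.Thread(target=self._listen, daemon=True).start()

          def send_update(self, update):
              self.sock.sendall((json.dumps(update) + "\n").encode())

          def _listen(self):
              buf = b""
              while True:
                  data = self.sock.recv(4096)
                  if not data:
                      break
                  buf += data
                  while b"\n" in buf:
                      line, buf = buf.split(b"\n", 1)
                      self.on_update(json.loads(line))  # apply the peer's change locally

      vr_sock, desktop_sock = socket.socketpair()
      SyncChannel(vr_sock, lambda u: print("VR side applies", u))
      desktop = SyncChannel(desktop_sock, lambda u: print("Desktop side applies", u))
      desktop.send_update({"camera": [0.0, 0.0, 5.0], "annotation": "resection margin"})
      time.sleep(0.2)  # give the listener thread time to deliver the update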

  4. Semantically induced distortions of visual awareness in a patient with Balint's syndrome.

    PubMed

    Soto, David; Humphreys, Glyn W

    2009-02-01

    We present data indicating that visual awareness for a basic perceptual feature (colour) can be influenced by the relation between the feature and the semantic properties of the stimulus. We examined semantic interference from the meaning of a colour word ("RED") on simple colour (ink related) detection responses in a patient with simultagnosia due to bilateral parietal lesions. We found that colour detection was influenced by the congruency between the meaning of the word and the relevant ink colour, with impaired performance when the word and the colour mismatched (on incongruent trials). This result held even when remote associations between meaning and colour were used (i.e. the word "PEA" influenced detection of the ink colour red). The results are consistent with a late locus of conscious visual experience that is derived at post-semantic levels. The implications for the understanding of the role of parietal cortex in object binding and visual awareness are discussed.

  5. Application of remote sensor data to geologic analysis of the Bonanza test site, Colorado

    NASA Technical Reports Server (NTRS)

    Lee, K. (Compiler)

    1972-01-01

    A variety of remote sensor data has aided geologic mapping in central Colorado. This report summarizes the application of sensor data to both regional and local geologic mapping and presents some conclusions on the practical use of remote sensing for solving geologic mapping problems. It is emphasized that this study was not conducted primarily to test or evaluate remote sensing systems or data, but, rather, to apply sensor data as an accessory tool for geologic mapping. The remote sensor data used were acquired by the NASA Earth Observations Aircraft Program. Conclusions reached on the utility of the various sensor data and interpretation techniques for geologic mapping were by-products of attempts to use them.

  6. Driving a Semiautonomous Mobile Robotic Car Controlled by an SSVEP-Based BCI.

    PubMed

    Stawicki, Piotr; Gembler, Felix; Volosyak, Ivan

    2016-01-01

    Brain-computer interfaces represent a range of acknowledged technologies that translate brain activity into computer commands. The aim of our research is to develop and evaluate a BCI control application for certain assistive technologies that can be used for remote telepresence or remote driving. The communication channel to the target device is based on the steady-state visual evoked potentials. In order to test the control application, a mobile robotic car (MRC) was introduced and a four-class BCI graphical user interface (with live video feedback and stimulation boxes on the same screen) for piloting the MRC was designed. For the purpose of evaluating a potential real-life scenario for such assistive technology, we present a study where 61 subjects steered the MRC through a predetermined route. All 61 subjects were able to control the MRC and finish the experiment (mean time 207.08 s, SD 50.25) with a mean (SD) accuracy and ITR of 93.03% (5.73) and 14.07 bits/min (4.44), respectively. The results show that our proposed SSVEP-based BCI control application is suitable for mobile robots with a shared-control approach. We also did not observe any negative influence of the simultaneous live video feedback and SSVEP stimulation on the performance of the BCI system.
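
    For context, the accuracy and ITR figures quoted above are linked by the standard Wolpaw information-transfer-rate formula. The sketch below evaluates it for a four-class interface; the per-selection time is a made-up value, since the abstract reports only total run time.

      # Wolpaw ITR for an N-class BCI, in bits per minute.
      import math

      def itr_bits_per_min(n_classes, accuracy, seconds_per_selection):
          bits = math.log2(n_classes)
          p = accuracy
          if 0.0 < p < 1.0:
              bits += p * math.log2(p) + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1))
          return bits * (60.0 / seconds_per_selection)

      # 4 classes, 93.03% accuracy, hypothetical 6 s per selection.
      print(round(itr_bits_per_min(4, 0.9303, 6.0), 2))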

  7. Validity of the Remote Food Photography Method against Doubly Labeled Water among Minority Preschoolers

    PubMed Central

    Nicklas, Theresa; Saab, Rabab; Islam, Noemi G.; Wong, William; Butte, Nancy; Schulin, Rebecca; Liu, Yan; Apolzan, John W.; Myers, Candice A.; Martin, Corby K.

    2017-01-01

    Objective To determine the validity of energy intake (EI) estimations made using the Remote Food Photography Method (RFPM) compared to the doubly-labeled water (DLW) method in minority preschool children in a free-living environment. Methods Seven days of food intake and spot urine samples (excluding first-void collections) for DLW analysis were obtained from 39 3- to 5-year-old Hispanic and African American children. Using an iPhone, caregivers captured before and after pictures of the child's intake; the pictures were wirelessly transmitted to trained raters, who estimated portion size using existing visual estimation procedures, and energy and macronutrients were calculated. Paired t-tests, mean differences, and Bland-Altman limits of agreement were computed. Results The mean EI was 1,191 ± 256 kcal/d using the RFPM and 1,412 ± 220 kcal/d by the DLW method, resulting in a mean underestimate of 222 kcal/d (−15.6%) (p<0.0001) that was consistent regardless of intake. The RFPM underestimated EI by −28.5% in 34 children and overestimated EI by 15.6% in 5 children. Conclusions The RFPM underestimated total EI when compared to the DLW method among preschoolers. Further refinement of the RFPM is needed for assessing EI of young children. PMID:28758370
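
    The Bland-Altman analysis mentioned above can be sketched as follows; the paired values here are synthetic stand-ins loosely based on the reported summary statistics (the within-pair scatter is an assumption), not the study data.

      # Bland-Altman bias and 95% limits of agreement on synthetic paired intakes.
      import numpy as np

      rng = np.random.default_rng(0)
      dlw = rng.normal(1412.0, 220.0, size=39)              # reference method, kcal/d
      rfpm = dlw - 222.0 + rng.normal(0.0, 120.0, size=39)  # method with a systematic underestimate

      diff = rfpm - dlw
      bias = diff.mean()
      sd = diff.std(ddof=1)
      lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
      print(f"bias {bias:.0f} kcal/d, limits of agreement ({lower:.0f}, {upper:.0f})")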

  8. Advanced Ultrasonic Diagnosis of Extremity Trauma: The Faster Exam

    NASA Technical Reports Server (NTRS)

    Dulchavsky, S. A.; Henry, S. E.; Moed, B. R.; Diebel, L. N.; Marshburn, T.; Hamilton, D. R.; Logan, J.; Kirkpatrick, A. W.; Williams, D. R.

    2002-01-01

    Ultrasound is of proven accuracy in abdominal and thoracic trauma and may be useful to diagnose extremity injury in situations where radiography is not available, such as military and space applications. We prospectively evaluated the utility of extremity ultrasound performed by trained, non-physician personnel in patients with extremity trauma, to simulate remote aerospace or military applications. Methods: Patients with extremity trauma were identified by history, physical examination, and radiographic studies. Ultrasound examination was performed bilaterally by nonphysician personnel with a portable ultrasound device using a 10-5 MHz linear probe. Images were video-recorded for later analysis against radiography by Fisher's exact test. The average time of examination was 4 minutes. Ultrasound accurately diagnosed extremity injury in 94% of patients with no false positive exams; accuracy was greatest in mid-shaft locations and least in the metacarpals/metatarsals. Soft tissue/tendon injury was readily visualized. Extremity ultrasound can be performed quickly by nonphysician personnel with excellent accuracy. Blinded verification of the utility of ultrasound in patients with extremity injury should be done to determine whether Extremity and Respiratory evaluation should be added to the FAST examination (the FASTER exam) and to verify the technique in remote locations such as military and aerospace applications.
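
    The comparison against radiography "by Fisher's exact test" mentioned above can be illustrated with a 2x2 table; the counts below are hypothetical, chosen only to show the computation (SciPy assumed available).

      # Fisher's exact test on a hypothetical ultrasound-vs-radiography 2x2 table.
      from scipy.stats import fisher_exact

      #                     radiograph: injury   no injury
      table = [[30, 0],   # ultrasound positive (no false positives, as in the abstract)
               [2, 18]]   # ultrasound negative

      _, p_value = fisher_exact(table)
      sensitivity = table[0][0] / (table[0][0] + table[1][0])
      print(f"p = {p_value:.2e}, sensitivity = {sensitivity:.0%}")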

  9. Driving a Semiautonomous Mobile Robotic Car Controlled by an SSVEP-Based BCI

    PubMed Central

    2016-01-01

    Brain-computer interfaces represent a range of acknowledged technologies that translate brain activity into computer commands. The aim of our research is to develop and evaluate a BCI control application for certain assistive technologies that can be used for remote telepresence or remote driving. The communication channel to the target device is based on the steady-state visual evoked potentials. In order to test the control application, a mobile robotic car (MRC) was introduced and a four-class BCI graphical user interface (with live video feedback and stimulation boxes on the same screen) for piloting the MRC was designed. For the purpose of evaluating a potential real-life scenario for such assistive technology, we present a study where 61 subjects steered the MRC through a predetermined route. All 61 subjects were able to control the MRC and finish the experiment (mean time 207.08 s, SD 50.25) with a mean (SD) accuracy and ITR of 93.03% (5.73) and 14.07 bits/min (4.44), respectively. The results show that our proposed SSVEP-based BCI control application is suitable for mobile robots with a shared-control approach. We also did not observe any negative influence of the simultaneous live video feedback and SSVEP stimulation on the performance of the BCI system. PMID:27528864

  10. Computer-assisted upper extremity training using interactive biking exercise (iBikE) platform.

    PubMed

    Jeong, In Cheol; Finkelstein, Joseph

    2012-01-01

    Upper extremity exercise training has been shown to improve clinical outcomes in different chronic health conditions. Arm-operated bicycles are frequently used to facilitate upper extremity training; however, effective use of these devices in patients' homes is hampered by the lack of remote connectivity with the clinical rehabilitation team, the inability to monitor exercise progress in real time using simple graphical representation, and the absence of an alert system that would prevent exertion levels exceeding those approved by the clinical rehabilitation team. We developed an interactive biking exercise (iBikE) platform aimed at addressing these limitations. The platform uses a miniature wireless 3-axis accelerometer mounted on the patient's wrist that transmits the cycling acceleration data to a laptop. The laptop screen presents an exercise dashboard to the patient in real time, allowing easy graphical visualization of exercise progress and presentation of exercise parameters in relation to prescribed targets. The iBikE platform is programmed to alert the patient when exercise intensity exceeds the levels recommended by the patient's care provider. The iBikE platform has been tested in 7 healthy volunteers (age range: 26-50 years) and shown to reliably reflect exercise progress and to generate alerts at preset levels. Implementation of remote connectivity with the patient rehabilitation team is warranted for future extension and evaluation efforts.
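
    The alerting behaviour described above can be sketched with a simple moving-average intensity check; the metric and threshold are assumptions for illustration, not the iBikE algorithm.

      # Illustrative over-exertion alert for a streamed 3-axis wrist accelerometer.
      from collections import deque

      class IntensityMonitor:
          def __init__(self, prescribed_max, window=50):
              self.prescribed_max = prescribed_max
              self.samples = deque(maxlen=window)

          def add_sample(self, ax, ay, az):
              """Store one sample; return True when the smoothed intensity exceeds the target."""
              self.samples.append((ax * ax + ay * ay + az * az) ** 0.5)
              intensity = sum(self.samples) / len(self.samples)  # moving-average proxy
              return intensity > self.prescribed_max

      monitor = IntensityMonitor(prescribed_max=1.4)
      for t in range(120):
          if monitor.add_sample(0.2, 0.3, 1.0 + 0.01 * t):  # gradually increasing effort
              print("alert raised at sample", t)
              break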

  11. Providing structural modules with self-integrity monitoring

    NASA Astrophysics Data System (ADS)

    Walton, W. B.; Ibanez, P.; Yessaie, G.

    1988-08-01

    With the advent of complex space structures (i.e., U.S. Space Station), the need for methods for remotely detecting structural damage will become greater. Some of these structures will have hundreds of individual structural elements (i.e., strut members). Should some of them become damaged, it could be virtually impossible to detect it using visual or similar inspection techniques. The damage of only a few individual members may or may not be a serious problem. However, should a significant number of the members be damaged, a significant problem could be created. The implementation of an appropriate remote damage detection scheme would greatly reduce the likelihood of a serious problem related to structural damage ever occurring. This report presents the results of the research conducted on remote structural damage detection approaches and the related mathematical algorithms. The research was conducted for the Small Business Innovation and Research (SBIR) Phase 2 National Aeronautics and Space Administration (NASA) Contract NAS7-961.

  12. Prototyping a Hybrid Cooperative and Tele-robotic Surgical System for Retinal Microsurgery.

    PubMed

    Balicki, Marcin; Xia, Tian; Jung, Min Yang; Deguet, Anton; Vagvolgyi, Balazs; Kazanzides, Peter; Taylor, Russell

    2011-06-01

    This paper presents the design of a tele-robotic microsurgical platform intended for development of cooperative and tele-operative control schemes, sensor-based smart instruments, user interfaces, and new surgical techniques, with eye surgery as the driving application. The system is built using the distributed component-based cisst libraries and the Surgical Assistant Workstation framework. It includes a cooperatively controlled EyeRobot2, a da Vinci Master manipulator, and a remote stereo visualization system. We use constrained optimization based virtual fixture control to provide a Virtual Remote-Center-of-Motion (vRCM) and haptic feedback. Such a system can be used in a hybrid setup, combining local cooperative control with remote tele-operation, where an experienced surgeon can provide hand-over-hand tutoring to a novice user. In another scheme, the system can provide haptic feedback based on virtual fixtures constructed from real-time force and proximity sensor information.
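
    As a very rough illustration of a virtual fixture (a simple projection, not the paper's constrained-optimization formulation), the sketch below attenuates the component of a commanded velocity that violates a line fixture through an RCM point; the gains and axis are hypothetical.

      # Toy "soft" line fixture: preserve motion along the insertion axis, attenuate the rest.
      import numpy as np

      def apply_line_fixture(v_cmd, axis, compliance=0.1):
          axis = np.asarray(axis, dtype=float)
          axis = axis / np.linalg.norm(axis)
          along = np.dot(v_cmd, axis) * axis
          off_axis = np.asarray(v_cmd, dtype=float) - along
          return along + compliance * off_axis  # small compliance keeps the fixture forgiving

      v_cmd = [3.0, 1.0, 0.5]            # operator-commanded tip velocity (mm/s)
      insertion_axis = [0.0, 0.0, 1.0]   # axis through the remote center of motion
      print(apply_line_fixture(v_cmd, insertion_axis))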

  13. Prototyping a Hybrid Cooperative and Tele-robotic Surgical System for Retinal Microsurgery

    PubMed Central

    Balicki, Marcin; Xia, Tian; Jung, Min Yang; Deguet, Anton; Vagvolgyi, Balazs; Kazanzides, Peter; Taylor, Russell

    2013-01-01

    This paper presents the design of a tele-robotic microsurgical platform intended for development of cooperative and tele-operative control schemes, sensor-based smart instruments, user interfaces, and new surgical techniques, with eye surgery as the driving application. The system is built using the distributed component-based cisst libraries and the Surgical Assistant Workstation framework. It includes a cooperatively controlled EyeRobot2, a da Vinci Master manipulator, and a remote stereo visualization system. We use constrained optimization based virtual fixture control to provide a Virtual Remote-Center-of-Motion (vRCM) and haptic feedback. Such a system can be used in a hybrid setup, combining local cooperative control with remote tele-operation, where an experienced surgeon can provide hand-over-hand tutoring to a novice user. In another scheme, the system can provide haptic feedback based on virtual fixtures constructed from real-time force and proximity sensor information. PMID:24398557

  14. Development of a CCTV system for welder training and monitoring of Space Shuttle Main Engine welds

    NASA Technical Reports Server (NTRS)

    Gordon, S. S.; Flanigan, L. A.; Dyer, G. E.

    1987-01-01

    A Weld Operator's Remote Monitoring System (WORMS) for remote viewing of manual and automatic GTA welds has been developed for use in Space Shuttle Main Engine (SSME) manufacturing. This system utilizes fiberoptics to transmit images from a receiving lens to a small closed-circuit television (CCTV) camera. The camera converts the image to an electronic signal, which is sent to a videotape recorder (VTR) and a monitor. The overall intent of this system is to provide a clearer, more detailed view of welds than is available by direct observation. This system has six primary areas of application: (1) welder training; (2) viewing of joint penetration; (3) viewing visually inaccessible welds; (4) quality control and quality assurance; (5) remote joint tracking and adjustment of variables in machine welds; and (6) welding research and development. This paper describes WORMS and how it applies to each application listed.

  15. Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive

    NASA Technical Reports Server (NTRS)

    Geller, Gary N.

    2004-01-01

    Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.

  16. Review of oil spill remote sensing.

    PubMed

    Fingas, Merv; Brown, Carl

    2014-06-15

    Remote sensing for oil spills is reviewed. The use of visible techniques is ubiquitous; however, it gives only the same results as visual monitoring. Oil has no particular spectral features that would allow for identification among the many possible background interferences. Cameras are only useful to provide documentation. In daytime, oil absorbs light and re-emits it as thermal energy at temperatures 3-8 K above ambient, which is detectable by infrared (IR) cameras. Laser fluorosensors are useful instruments because of their unique capability to identify oil on backgrounds that include water, soil, weeds, ice and snow. They are the only sensor that can positively discriminate oil on most backgrounds. Radar detects oil on water by the fact that oil dampens water-surface capillary waves under low to moderate wave/wind conditions. Radar offers the only potential for large-area searches and day/night, foul-weather remote sensing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The acquisition, storage, and dissemination of LANDSAT and other LACIE support data

    NASA Technical Reports Server (NTRS)

    Abbotts, L. F.; Nelson, R. M. (Principal Investigator)

    1979-01-01

    Activities performed at the LACIE physical data library are described. These include the researching, acquisition, indexing, maintenance, distribution, tracking, and control of LACIE operational data and documents. Much of the data available can be incorporated into an Earth resources data base. Elements of the data collection that can support future remote sensing programs include: (1) the LANDSAT full-frame image files; (2) the microfilm file of aerial and space photographic and multispectral maps and charts that encompasses a large portion of the Earth's surface; (3) the map/chart collection that includes various scale maps and charts for a good portion of the U.S. and the LACIE area in foreign countries; (4) computer-compatible tapes of good quality LANDSAT scenes; (5) basic remote sensing data, project data, reference material, and associated publications; (6) visual aids to support presentation on remote sensing projects; and (7) research acquisition and handling procedures for managing data.

  18. Providing structural modules with self-integrity monitoring

    NASA Technical Reports Server (NTRS)

    Walton, W. B.; Ibanez, P.; Yessaie, G.

    1988-01-01

    With the advent of complex space structures (i.e., U.S. Space Station), the need for methods for remotely detecting structural damage will become greater. Some of these structures will have hundreds of individual structural elements (i.e., strut members). Should some of them become damaged, it could be virtually impossible to detect it using visual or similar inspection techniques. The damage of only a few individual members may or may not be a serious problem. However, should a significant number of the members be damaged, a significant problem could be created. The implementation of an appropriate remote damage detection scheme would greatly reduce the likelihood of a serious problem related to structural damage ever occurring. This report presents the results of the research conducted on remote structural damage detection approaches and the related mathematical algorithms. The research was conducted for the Small Business Innovation and Research (SBIR) Phase 2 National Aeronautics and Space Administration (NASA) Contract NAS7-961.

  19. Augmented Reality for the Improvement of Remote Laboratories: An Augmented Remote Laboratory

    ERIC Educational Resources Information Center

    Andujar, J. M.; Mejias, A.; Marquez, M. A.

    2011-01-01

    Augmented reality (AR) provides huge opportunities for online teaching in science and engineering, as these disciplines place emphasis on practical training and are unsuited to completely non-classroom training. This paper proposes a new concept in virtual and remote laboratories: the augmented remote laboratory (ARL). ARL is being tested in the first…

  20. Accessing Earth Science Data Visualizations through NASA GIBS & Worldview

    NASA Astrophysics Data System (ADS)

    Cechini, M. F.; Boller, R. A.; Baynes, K.; Wong, M. M.; King, B. A.; Schmaltz, J. E.; De Luca, A. P.; King, J.; Roberts, J. T.; Rodriguez, J.; Thompson, C. K.; Pressley, N. N.

    2017-12-01

    For more than 20 years, the NASA Earth Observing System (EOS) has operated dozens of remote sensing satellites collecting nearly 15 Petabytes of data that span thousands of science parameters. Within these observations are keys that Earth scientists have used to unlock much of what we understand about our planet. Also contained within these observations are a myriad of opportunities for learning and education. The trick is making them accessible to educators and students in convenient and simple ways so that effort can be spent on lesson enrichment and not on overcoming technical hurdles. The NASA Global Imagery Browse Services (GIBS) system and NASA Worldview website provide a unique view into EOS data through daily full-resolution visualizations of hundreds of Earth science parameters. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. Accompanying the visualizations are visual aids such as color legends, place names, and orbit tracks. By using these visualizations, educators and students can observe natural phenomena that enrich a scientific education. This poster will provide an overview of the visualizations available in NASA GIBS and Worldview and how they are accessed. We invite discussion on how the visualizations can be used or improved for educational purposes.

  1. Ocular Coherence Tomography in the Evaluation of Anterior Eye Injuries in Space Flight

    NASA Technical Reports Server (NTRS)

    Fer, Dan M.; Law, Jennifer; Wells, Julia

    2017-01-01

    While Ocular Coherence Tomography (OCT) is not a first-line modality to evaluate anterior eye structures terrestrially, it is a resource already available on the International Space Station (ISS) that can be used in medical contingencies that involve the anterior eye. With remote guidance and subject matter expert (SME) support from the ground, a minimally trained crewmember can now use OCT to evaluate anterior eye pathologies on orbit. OCT utilizes low-coherence interferometry to produce detailed cross-sectional and 3D images of the eye in real time. Terrestrially, it has been used to evaluate macular pathologies and glaucoma. Since 2013, OCT has been used onboard the ISS as one part of a suite of hardware to evaluate the Visual Impairment/Intracranial Pressure risk faced by astronauts, specifically assessing changes in the retina and choroid during space flight. The Anterior Segment Module (ASM), an add-on lens, was also flown for research studies, providing an opportunity to evaluate the anterior eye in real time if clinically indicated. Anterior eye pathologies that could be evaluated using OCT were identified. These included corneal abrasions and ulcers, scleritis, and acute angle closure glaucoma. A remote guider script was written to provide ground specialists with step-by-step instructions to guide ISS crewmembers, who do not get trained on the ASM, to evaluate the anterior eye. The instructions were tested on novice subjects and/or operators, whose feedback was incorporated iteratively. The final remote guider script was reviewed by SME optometrists and NASA flight surgeons. The novel application of OCT technology to space flight allows for the acquisition of objective data to diagnose anterior eye pathologies when other modalities are not available. This demonstrates the versatility of OCT and highlights the advantages of using existing hardware and remote guidance skills to expand clinical capabilities in space flight.

  2. Remote assessment of diabetic foot ulcers using a novel wound imaging system.

    PubMed

    Bowling, Frank L; King, Laurie; Paterson, James A; Hu, Jingyi; Lipsky, Benjamin A; Matthews, David R; Boulton, Andrew J M

    2011-01-01

    Telemedicine allows experts to assess patients in remote locations, enabling quality, convenient, cost-effective care. To help assess foot wounds remotely, we investigated the reliability of a novel optical imaging system employing a three-dimensional camera and disposable optical marker. We first examined inter- and intraoperator measurement variability (correlation coefficient) of five clinicians examining three different wounds. Then, to assess the system's ability to identify key clinically relevant features, we had two clinicians evaluate 20 different wounds at two centers, recording observations on a standardized form. Three other clinicians recorded their observations using only the corresponding three-dimensional images. Using the in-person assessment as the criterion standard, we assessed concordance of the remote with in-person assessments. Measurement variation of area was 3.3% for intraoperator and 11.9% for interoperator; the difference in clinician opinion about wound boundary location was significant. Overall agreement for remote vs. in-person assessments was good, but was lowest on the subjective clinical assessments, e.g., the value of debridement to improve healing. Limitations of imaging included the inability to show certain characteristics, e.g., moistness or exudation. Clinicians gave positive feedback on visual fidelity. This pilot study showed that a clinician viewing only the three-dimensional images could accurately measure and assess a diabetic foot wound remotely. © 2010 by the Wound Healing Society.

  3. Modeling of Aerosol Vertical Profiles Using GIS and Remote Sensing

    PubMed Central

    Wong, Man Sing; Nichol, Janet E.; Lee, Kwon Ho

    2009-01-01

    The use of Geographic Information Systems (GIS) and Remote Sensing (RS) by climatologists, environmentalists and urban planners for three dimensional modeling and visualization of the landscape is well established. However no previous study has implemented these techniques for 3D modeling of atmospheric aerosols because air quality data is traditionally measured at ground points, or from satellite images, with no vertical dimension. This study presents a prototype for modeling and visualizing aerosol vertical profiles over a 3D urban landscape in Hong Kong. The method uses a newly developed technique for the derivation of aerosol vertical profiles from AERONET sunphotometer measurements and surface visibility data, and links these to a 3D urban model. This permits automated modeling and visualization of aerosol concentrations at different atmospheric levels over the urban landscape in near-real time. Since the GIS platform permits presentation of the aerosol vertical distribution in 3D, it can be related to the built environment of the city. Examples are given of the applications of the model, including diagnosis of the relative contribution of vehicle emissions to pollution levels in the city, based on increased near-surface concentrations around weekday rush-hour times. The ability to model changes in air quality and visibility from ground level to the top of tall buildings is also demonstrated, and this has implications for energy use and environmental policies for the tall mega-cities of the future. PMID:22408531
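
    One common simplification of the profile derivation described above is an exponential extinction profile whose surface value comes from visibility (Koschmieder relation) and whose scale height is set so the column integrates to the sunphotometer AOD. The sketch below shows that simplification only; it is not the paper's retrieval algorithm, and the input values are made up.

      # Exponential aerosol extinction profile constrained by AOD and surface visibility.
      import math

      def aerosol_profile(aod, visibility_km, heights_km):
          sigma0 = 3.912 / visibility_km      # surface extinction (km^-1), Koschmieder relation
          scale_height = aod / sigma0         # makes the integrated column equal the AOD
          return [(z, sigma0 * math.exp(-z / scale_height)) for z in heights_km]

      for z, sigma in aerosol_profile(aod=0.8, visibility_km=8.0, heights_km=[0.0, 0.5, 1.0, 2.0]):
          print(f"{z:4.1f} km   extinction {sigma:.3f} km^-1")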

  4. Modeling of Aerosol Vertical Profiles Using GIS and Remote Sensing.

    PubMed

    Wong, Man Sing; Nichol, Janet E; Lee, Kwon Ho

    2009-01-01

    The use of Geographic Information Systems (GIS) and Remote Sensing (RS) by climatologists, environmentalists and urban planners for three dimensional modeling and visualization of the landscape is well established. However no previous study has implemented these techniques for 3D modeling of atmospheric aerosols because air quality data is traditionally measured at ground points, or from satellite images, with no vertical dimension. This study presents a prototype for modeling and visualizing aerosol vertical profiles over a 3D urban landscape in Hong Kong. The method uses a newly developed technique for the derivation of aerosol vertical profiles from AERONET sunphotometer measurements and surface visibility data, and links these to a 3D urban model. This permits automated modeling and visualization of aerosol concentrations at different atmospheric levels over the urban landscape in near-real time. Since the GIS platform permits presentation of the aerosol vertical distribution in 3D, it can be related to the built environment of the city. Examples are given of the applications of the model, including diagnosis of the relative contribution of vehicle emissions to pollution levels in the city, based on increased near-surface concentrations around weekday rush-hour times. The ability to model changes in air quality and visibility from ground level to the top of tall buildings is also demonstrated, and this has implications for energy use and environmental policies for the tall mega-cities of the future.

  5. ScienceDesk Project Overview

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Norvig, Peter (Technical Monitor)

    2000-01-01

    NASA's ScienceDesk Project at the Ames Research Center is responsible for scientific knowledge management, which includes ensuring the capture, preservation, and traceability of scientific knowledge. Other responsibilities include: 1) maintaining uniform information access, achieved through intelligent indexing and visualization; 2) supporting both asynchronous and synchronous science teamwork; and 3) monitoring and controlling semi-autonomous remote experimentation.

  6. Simulating spatial and temporal context of forest management using hypothetical landscapes

    Treesearch

    Eric J. Gustafson; Thomas R. Crow

    1998-01-01

    Spatially explicit models that combine remote sensing with geographic information systems (GIS) offer great promise to land managers because they consider the arrangement of landscape elements in time and space. Their visual and geographic nature facilitate the comparison of alternative landscape designs. Among various activities associated with forest management,...

  7. Assessment of Methane and VOC Emissions from Select Upstream Oil and Gas Production Operations Using Remote Measurements, Interim Report on Recent Survey Studies

    EPA Science Inventory

    This product comprises visuals for a platform presentation in support of an already approved extended abstract for this conference. Abstract: Environmentally responsible development of oil and gas assets in the United States is facilitated by advancement of sector-specific air pollution em...

  8. Monitoring Global Crop Condition Indicators Using a Web-Based Visualization Tool

    Treesearch

    Bob Tetrault; Bob Baldwin

    2006-01-01

    Global crop condition information for major agricultural regions in the world can be monitored using the web-based application called Crop Explorer. With this application, U.S. and international producers, traders, researchers, and the public can access remote sensing information used by agricultural economists and scientists who predict crop production worldwide. For...

  9. Dialogic Pedagogy in Creative Practice: A Conversation in Examples

    ERIC Educational Resources Information Center

    Archer, Carol; Kelen, Christopher

    2015-01-01

    This paper surveys examples of dialogic pedagogy in creative practices in the areas of Visual Studies and Creative Writing at universities in Hong Kong and Macao. The authors describe their own participant-observer experience of evolving pedagogy for creative practice through on-site and remote interaction, with colleagues and with and between…

  10. Data Mining in Earth System Science (DMESS 2011)

    Treesearch

    Forrest M. Hoffman; J. Walter Larson; Richard Tran Mills; Bhorn-Gustaf Brooks; Auroop R. Ganguly; William Hargrove; et al

    2011-01-01

    From field-scale measurements to global climate simulations and remote sensing, the growing body of very large and long time series Earth science data are increasingly difficult to analyze, visualize, and interpret. Data mining, information theoretic, and machine learning techniques—such as cluster analysis, singular value decomposition, block entropy, Fourier and...

  11. Integrated remote sensing and visualization (IRSV) system for transportation infrastructure operations and management, phase two, volume 3 : advanced consideration in LiDAR technology for bridge evaluation.

    DOT National Transportation Integrated Search

    2012-03-01

    This report describes Phase Two enhancement of terrestrial LiDAR scanning for bridge damage evaluation that was initially developed in Phase One. Considering the spatial and reflectivity information contained in LiDAR scans, two detection algorit...

  12. Remotely deployable aerial inspection using tactile sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacLeod, C. N.; Cao, J.; Pierce, S. G.

    For structural monitoring applications, the use of remotely deployable Non-Destructive Evaluation (NDE) inspection platforms offers many advantages, including improved accessibility, greater safety and reduced cost, when compared to traditional manual inspection techniques. The use of such platforms, previously reported by researchers at the University of Strathclyde, facilitates the potential for rapid scanning of large areas and volumes in hazardous locations. A common problem for both manual and remote deployment approaches lies in the intrinsic stand-off and surface coupling issues of typical NDE probes. The associated complications of these requirements are significantly exacerbated when considering aerial-based remote inspection and deployment, resulting in simple visual techniques being the preferred sensor payload. Researchers at Bristol Robotics Laboratory have developed biomimetic tactile sensors modelled on the facial whiskers (vibrissae) of animals such as rats and mice, with the latest sensors actively sweeping their tips across the surface in a back and forth motion. The current work reports on the design and performance of an aerial inspection platform and the suitability of tactile whisking sensors to aerial-based surface monitoring applications.

  13. Use of remote-sensing techniques to survey the physical habitat of large rivers

    USGS Publications Warehouse

    Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffery W.; Kennedy, Gregory W.; Smith, Stephen B.; Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffrey W.; Kennedy, Gregory W.; Smith, Stephen B.

    1997-01-01

    This report describes remote-sensing techniques that can be used to quantitatively characterize the physical habitat in large rivers of the United States, where traditional survey approaches typically used in small- and medium-sized streams and rivers would be ineffective or impossible to apply. The state-of-the-art remote-sensing technologies that we discuss here include side-scan sonar, RoxAnn, acoustic Doppler current profilers, remotely operated vehicles and camera systems, global positioning systems, and laser level survey systems. The use of these technologies will permit the collection of information needed to create computer visualizations and hard-copy maps and to generate quantitative databases that can be used in real-time mode in the field to characterize the physical habitat at a study location of interest and to guide the distribution of sampling effort needed to address other habitat-related study objectives. This report augments habitat sampling and characterization guidance provided by Meador et al. (1993) and is intended for use primarily by U.S. Geological Survey National Water Quality Assessment program managers and scientists who are documenting water quality in streams and rivers of the United States.

  14. Information recovery through image sequence fusion under wavelet transformation

    NASA Astrophysics Data System (ADS)

    He, Qiang

    2010-04-01

    Remote sensing is widely applied to provide information about areas with limited ground access, with applications such as assessing the destruction from natural disasters and planning relief and recovery operations. However, the collection of aerial digital images is constrained by bad weather, atmospheric conditions, and unstable cameras or camcorders. Therefore, how to recover information from low-quality remote sensing images and how to enhance image quality become very important for many visual understanding tasks, such as feature detection, object segmentation, and object recognition. The quality of remote sensing imagery can be improved through information fusion, the meaningful combination of images captured from different sensors or under different conditions. Here we address information fusion for remote sensing image sequences under multi-resolution analysis. Image fusion recovers complete information by integrating multiple images captured from the same scene. Through image fusion, a new image that is high-resolution or more informative for human and machine perception is created from a time series of low-quality images, based on image registration between different video frames.
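
    A standard multi-resolution fusion rule (average the approximation coefficients, keep the larger-magnitude detail coefficients) can be sketched with PyWavelets; this is a generic wavelet-fusion example under assumed library availability, not necessarily the scheme used in the paper.

      # Wavelet-domain fusion of two co-registered frames (PyWavelets and NumPy assumed).
      import numpy as np
      import pywt

      def fuse_pair(img_a, img_b, wavelet="db2", level=2):
          ca = pywt.wavedec2(img_a, wavelet, level=level)
          cb = pywt.wavedec2(img_b, wavelet, level=level)
          fused = [(ca[0] + cb[0]) / 2.0]  # approximation band: average
          for bands_a, bands_b in zip(ca[1:], cb[1:]):
              # detail bands: keep the coefficient with the larger magnitude
              fused.append(tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                                 for x, y in zip(bands_a, bands_b)))
          return pywt.waverec2(fused, wavelet)

      a = np.random.rand(64, 64)
      b = np.random.rand(64, 64)
      print(fuse_pair(a, b).shape)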

  15. Headlines: Planet Earth: Improving Climate Literacy with Short Format News Videos

    NASA Astrophysics Data System (ADS)

    Tenenbaum, L. F.; Kulikov, A.; Jackson, R.

    2012-12-01

    One of the challenges of communicating climate science is the sense that climate change is remote and unconnected to daily life--something that's happening to someone else or in the future. To help face this challenge, NASA's Global Climate Change website http://climate.nasa.gov has launched a new video series, "Headlines: Planet Earth," which focuses on current climate news events. This rapid-response video series uses 3D video visualization technology combined with real-time satellite data and images to throw a spotlight on real-world events. The "Headlines: Planet Earth" news video products will be deployed frequently, ensuring timeliness. NASA's Global Climate Change website makes extensive use of interactive media, immersive visualizations, ground-based and remote images, narrated and time-lapse videos, time-series animations, and real-time scientific data, plus maps and user-friendly graphics that make the scientific content both accessible and engaging to the public. The site has also won two consecutive Webby Awards for Best Science Website. Connecting climate science to current real-world events will contribute to improving climate literacy by making climate science relevant to everyday life.

  16. The Importance of Earth Observations and Data Collaboration within Environmental Intelligence Supporting Arctic Research

    NASA Technical Reports Server (NTRS)

    Casas, Joseph

    2017-01-01

    Within the IARPC Collaboration Team activities of 2016, Arctic in-situ and remote Earth observations advanced topics such as: 1) exploring the role for new and innovative autonomous observing technologies in the Arctic; 2) advancing catalytic national and international community-based observing efforts in support of the National Strategy for the Arctic Region; and 3) enhancing the use of discovery tools for observing system collaboration, such as the U.S. National Oceanic and Atmospheric Administration (NOAA) Arctic Environmental Response Management Application (ERMA) and the U.S. National Aeronautics and Space Administration (NASA) Arctic Collaborative Environment (ACE) project's georeferenced visualization, decision support, and exploitation internet-based tools. Critical to the success of these Earth observations, for both in-situ and remote systems, is the emergence of new and innovative data collection technologies and comprehensive modeling, as well as enhanced communications and cyber infrastructure capabilities that effectively assimilate and disseminate many environmental intelligence products in a timely manner. The Arctic Collaborative Environment (ACE) project is well positioned to greatly enhance user capabilities for accessing, organizing, visualizing, sharing and producing collaborative knowledge for the Arctic.

  17. Digital Earth system based river basin data integration

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Li, Wanqing; Lin, Chao

    2014-12-01

    Digital Earth is an integrated approach to building scientific infrastructure. Digital Earth systems provide a three-dimensional visualization and integration platform for river basin data, which include management data, in situ observation data, remote sensing observation data and model output data. This paper studies Digital Earth system-based river basin data integration technology. Firstly, the construction of a Digital Earth based three-dimensional river basin data integration environment is discussed. Then the river basin management data integration technology is presented, which is realized through a general database access interface, web services and ActiveX controls. Thirdly, in situ data stored as database records are integrated with a three-dimensional model of the corresponding observation apparatus displayed in the Digital Earth system, linked by a shared ID code. In the next two parts, the remote sensing data and model output data integration technologies are discussed in detail. The application in the Digital Zhang River Basin System of China shows that the method can effectively improve the efficiency of data use and the effectiveness of data visualization.

  18. Primary care spirometry: test quality and the feasibility and usefulness of specialist reporting

    PubMed Central

    White, Patrick; Wong, Wun; Fleming, Tracey; Gray, Barry

    2007-01-01

    Background Provision of spirometry for chronic obstructive pulmonary disease (COPD) is a new requirement in primary care. Effective spirometry requires that tests and interpretations meet international criteria. Aim To assess the feasibility and usefulness of remote specialist reporting of primary care spirometry. Design of study Comparison of reporting by primary care clinicians and respiratory specialists of consecutive primary care spirometry tests. Setting South London primary care teams with patient lists ≥6000. Method Feasibility of remote reporting of spirometry was assessed by the frequency of electronic mailing of tests. Usefulness of remote reporting was defined by the frequency that specialist reports made a clinically significant addition. Usefulness was assessed by measuring agreement (κ) between primary care reports and those of specialists. Clinically significant disagreements were analysed with respect to test quality, diagnosis, and severity. Results Six practices emailed 312 tests over 3 months. Forty-nine tests sent without indices or curves (flow volume and time volume) were excluded. Mean age of patients tested was 65 years and 52% were female. Mean predicted forced expiratory volume in the first second (FEV1) was 69%. Clinically significant disagreements were identified in the interpretation of acceptability (quality) of 67/212 (32%) tests (κ = 0.07; 95% confidence interval [CI] = 0 to 0.24), of diagnosis in 49/168 (29%) tests (κ = 0.39; 95% CI = 0.25 to 0.55), and of severity in 62/191 (32%) tests (κ = 0.53; 95% CI = 0.43 to 0.63). Conclusion Remote reporting of primary care spirometry was feasible. Its usefulness was confirmed by the high rate of additional clinically significant information to the reports of primary care clinicians. The quality of primary care spirometry was so unsatisfactory that remote reporting of tests may be a means of establishing adequate spirometry. PMID:17761057
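
    The agreement statistics above are Cohen's kappa values; the computation can be sketched as follows with made-up interpretation labels (scikit-learn assumed available).

      # Cohen's kappa between primary-care and specialist spirometry interpretations.
      from sklearn.metrics import cohen_kappa_score

      primary_care = ["copd", "normal", "copd", "restrictive", "copd", "normal", "copd", "normal"]
      specialist   = ["copd", "copd",   "copd", "restrictive", "normal", "normal", "copd", "normal"]

      print(f"kappa = {cohen_kappa_score(primary_care, specialist):.2f}")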

  19. UVMAS: Venus ultraviolet-visual mapping spectrometer

    NASA Astrophysics Data System (ADS)

    Bellucci, G.; Zasova, L.; Altieri, F.; Nuccilli, F.; Ignatiev, N.; Moroz, V.; Khatuntsev, I.; Korablev, O.; Rodin, A.

    This paper summarizes the capabilities and technical solutions of an Ultraviolet-Visual Mapping Spectrometer designed for remote sensing of Venus from a planetary orbiter. The UVMAS consists of a multichannel camera with a spectral range of 0.19-0.49 μm which acquires data in several spectral channels (up to 400) with a spectral resolution of 0.58 nm. The instantaneous field of view of the instrument is 0.244 × 0.244 mrad. These characteristics allow: a) study of the upper cloud dynamics and chemistry; b) constraints on the unknown absorber; and c) observation of the night-side airglow.

  20. A Graphical Operator Interface for a Telerobotic Inspection System

    NASA Technical Reports Server (NTRS)

    Kim, W. S.; Tso, K. S.; Hayati, S.

    1993-01-01

    Operator interface has recently emerged as an important element for efficient and safe operator interactions with the telerobotic system. Recent advances in graphical user interface (GUI) and graphics/video merging technologies enable development of more efficient, flexible operator interfaces. This paper describes an advanced graphical operator interface newly developed for a remote surface inspection system at the Jet Propulsion Laboratory. The interface has been designed so that remote surface inspection can be performed by a single operator with integrated robot control and image inspection capability. It supports three inspection strategies: teleoperated human visual inspection, human visual inspection with automated scanning, and machine-vision-based automated inspection.
